Class com.hadoop.compression.lzo.LzoCodec not found for Spark
Here is the fix for the "Class com.hadoop.compression.lzo.LzoCodec not found" error in Spark:
Add the following line to spark-env.sh, or simply run it in your shell before submitting Spark jobs.
export SPARK_CLASSPATH="$(ls ${HADOOP_PREFIX}/share/hadoop/common/hadoop-gpl-compression.jar)"
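For example, a minimal sketch of the "run it before submitting" option might look like the following. The application jar and main class names here (MyLzoApp.jar, com.example.MyLzoApp) are placeholders, not part of the original post; substitute your own.

# Make the Hadoop GPL compression jar (which provides LzoCodec) visible to Spark,
# then submit the job in the same shell session so the setting is inherited.
export SPARK_CLASSPATH="$(ls ${HADOOP_PREFIX}/share/hadoop/common/hadoop-gpl-compression.jar)"
spark-submit --class com.example.MyLzoApp MyLzoApp.jar

Putting the export in spark-env.sh instead makes the setting apply to every job launched from that Spark installation, rather than just the current shell.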