Hi Akhil,

I set LD_LIBRARY_PATH in conf/spark-env.sh (instead of SPARK_LIBRARY_PATH),
pointing it at the Hadoop native library path, and it works now.
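In case it helps anyone else, this is a sketch of the line I ended up with in conf/spark-env.sh (the path is from my CDH 5.4.0 install, same as in my earlier message; adjust it for your own layout):

```shell
# conf/spark-env.sh
# Point the dynamic linker's search path at Hadoop's native libs so
# libsnappy/libhadoop can be loaded by the executor JVMs.
# Path is from my CDH 5.4.0 parcel install; adjust for your environment.
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/app/install/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop/lib/native
```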

thanks.

On Tue, Sep 8, 2015 at 6:14 PM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> Looks like you have different versions of the snappy library on the
> classpath. Here's a similar discussion, if you haven't seen it already:
> http://stackoverflow.com/questions/22150417/hadoop-mapreduce-java-lang-unsatisfiedlinkerror-org-apache-hadoop-util-nativec
>
> Thanks
> Best Regards
>
> On Mon, Sep 7, 2015 at 7:41 AM, dong.yajun <dongt...@gmail.com> wrote:
>
>> hi all,
>>
>> I ran into a problem where Spark 1.4.1 can't read Snappy-encoded files
>> from HDFS.
>>
>> I have configured SPARK_LIBRARY_PATH in conf/spark-env.sh to point to the
>> Hadoop native library path and restarted the Spark cluster:
>>
>> SPARK_LIBRARY_PATH=$SPARK_LIBRARY_PATH:/opt/app/install/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop/lib/native
>>
>>
>> A partial stack trace:
>>
>> Caused by: org.apache.hadoop.hbase.io.hfile.CorruptHFileException: Problem reading HFile Trailer from file
>> hdfs://nameservice1/hbase/data/default/IM_ItemBase/02296539242087aea77877dced9ba3d5/BaseInfo/9fe36f74334c4d30ba1bfc17bbd717f5
>>         at org.apache.hadoop.hbase.io.hfile.HFile.pickReaderVersion(HFile.java:478)
>>         at org.apache.hadoop.hbase.io.hfile.HFile.createReader(HFile.java:521)
>>         at com.newegg.ec.bigdata.dump.CombineHFileRecordReader.<init>(CombineHFileRecordReader.java:33)
>>         ... 19 more
>> Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>>         at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
>>         at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>>         at org.apache.hadoop.io.compress.SnappyCodec.getDecompressorType(SnappyCodec.java:192)
>>         at org.apache.hadoop.io.compress.CodecPool.getDecompressor(CodecPool.java:176)
>>         at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getDecompressor(Compression.java:328)
>>         at org.apache.hadoop.hbase.io.compress.Compression.decompress(Compression.java:423)
>>         at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultDecodingContext.prepareDecoding(HFileBlockDefaultDecodingContext.java:90)
>> --
>> *Ric Dong*
>>
>>
>


-- 
*Ric Dong*
