Patrick Liu
Date: Thu, 28 Aug 2014 18:36:25 -0700
From: ml-node+s1001560n13084...@n3.nabble.com
To: linkpatrick...@live.com
Subject: Re: org.apache.hadoop.io.compress.SnappyCodec not found
Hi, I fixed the issue by copying libsnappy.so into the Java JRE.
Regards
Arthur
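For reference, the fix Arthur describes can be sketched as the following copy step. This is a hedged sketch, not his exact commands: the Hadoop native path is taken from the `checknative` output quoted below, and the JRE target directory (`$JAVA_HOME/jre/lib/amd64` on a 64-bit Linux JDK) is an assumption to adjust for your install.

```shell
# Assumed paths -- adjust for your machine. The source dir comes from the
# "Native library checking" output in this thread; the JRE lib dir is a
# typical 64-bit Linux location and may differ on your JDK.
HADOOP_NATIVE="/mnt/hadoop/hadoop-2.4.1_snappy/lib/native/Linux-amd64-64"
# Copy the Snappy shared library where the JVM's default library path can see it.
cp "$HADOOP_NATIVE"/libsnappy.so* "$JAVA_HOME/jre/lib/amd64/"
```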
On 29 Aug, 2014, at 8:12 am, [quoted message truncated]
>> … & initialized native-zlib library
>> Native library checking:
>> hadoop: true /mnt/hadoop/hadoop-2.4.1_snappy/lib/native/Linux-amd64-64/libhadoop.so
>> zlib: true /lib64/libz.so.1
>> snappy: true /mnt/hadoop/hadoop-2.4.1_snappy/lib/native/Linux-amd64-6
> On 29 Aug, 2014, at 2:39 am, arthur.hk.c...@gmail.com wrote:
>
>> Hi,
>>
>> I use Hadoop 2.4.1 and HBase 0.98.5 with snappy enabled in both Hadoop and
>> HBase.
>> With default setting in Spark 1.0.2, when trying to load a file I got "Class
>> org.apache.hadoop.io.compress.SnappyCodec not found"
> at …java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
> ... 55 more
Hi,
I use Hadoop 2.4.1 and HBase 0.98.5 with snappy enabled in both Hadoop and
HBase.
With default setting in Spark 1.0.2, when trying to load a file I got "Class
org.apache.hadoop.io.compress.SnappyCodec not found"
Can you please advise how to enable snappy in Spark?
Regards
Arthur
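As an alternative to copying the library into the JRE, Spark can be pointed at Hadoop's native library directory directly. A hedged sketch of a `spark-defaults.conf` fragment follows; the path is the one from this thread's `checknative` output and is an assumption for your install, and the Hadoop jars providing `SnappyCodec` must also be on Spark's classpath.

```shell
# spark-defaults.conf fragment (sketch; path below is this thread's install,
# substitute your own Hadoop native lib directory):
#
# spark.executor.extraLibraryPath /mnt/hadoop/hadoop-2.4.1_snappy/lib/native/Linux-amd64-64
# spark.driver.extraLibraryPath   /mnt/hadoop/hadoop-2.4.1_snappy/lib/native/Linux-amd64-64
```

Either approach makes `libsnappy.so` loadable by the JVM; the "Class ... not found" message itself additionally suggests the Hadoop compression classes are missing from the classpath, so check that Spark was built or launched against the same Hadoop 2.4.1 jars.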