I ran into this recently. It turned out we had an old
org-xerial-snappy.properties file in one of our conf directories that
had this setting:
# Disables loading Snappy-Java native library bundled in the
# snappy-java-*.jar file forcing to load the Snappy-Java native
# library from the java.library.path.
#
org.xerial.snappy.disable.bundled.libs=true
When I switched that to false, the problem went away.
May or may not be your problem of course, but worth a look.
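If you want to double-check the setting without hunting for stray properties
files, here is a rough sketch of the sanity check I used (class name and
strings are mine; it assumes the snappy-java jar is on the classpath and that
the loader honors the same key as a JVM system property):

    import org.xerial.snappy.Snappy;

    public class SnappyCheck {
        public static void main(String[] args) throws Exception {
            // Same effect as the properties file entry; it has to run before
            // any org.xerial.snappy class is initialized.
            System.setProperty("org.xerial.snappy.disable.bundled.libs", "false");

            byte[] data = "bundled snappy check".getBytes("UTF-8");
            // The first Snappy call loads the native library, so this is where
            // an UnsatisfiedLinkError would show up.
            byte[] roundTrip = Snappy.uncompress(Snappy.compress(data));
            System.out.println("native snappy ok: " + new String(roundTrip, "UTF-8"));
        }
    }

If that assumption holds, the same key can also be passed on the JVM command
line as -Dorg.xerial.snappy.disable.bundled.libs=false.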
HTH,
DR
On 01/12/2015 03:28 PM, Dan Dong wrote:
Hi,
My Spark job failed with "no snappyjava in java.library.path" as follows:
Caused by: java.lang.UnsatisfiedLinkError: no snappyjava in java.library.path
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1857)
    at java.lang.Runtime.loadLibrary0(Runtime.java:870)
    at java.lang.System.loadLibrary(System.java:1119)
    at org.xerial.snappy.SnappyNativeLoader.loadLibrary(SnappyNativeLoader.java:52)
I'm running Spark 1.1.1 on Hadoop 2.4. I found that the snappy-java jar is
there, and I have already included it in the CLASSPATH:
../hadoop/share/hadoop/tools/lib/snappy-java-1.0.4.1.jar
../hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar
../hadoop/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/snappy-java-1.0.4.1.jar
Did I miss anything, or should I set it up another way?
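Here is a minimal check I put together to see whether that jar can actually
load its native code (rough sketch; class name and messages are placeholders):

    import org.xerial.snappy.Snappy;

    public class SnappyDiag {
        public static void main(String[] args) {
            try {
                // The first Snappy call triggers the native library load.
                byte[] out = Snappy.compress("check".getBytes("UTF-8"));
                System.out.println("snappy native library loaded ("
                        + out.length + " compressed bytes)");
            } catch (Throwable t) {
                // Catch Throwable because the failure can surface as an
                // UnsatisfiedLinkError or as an initializer error wrapping it.
                System.out.println("snappy failed to load: " + t);
                System.out.println("java.library.path = "
                        + System.getProperty("java.library.path"));
                System.out.println("java.class.path   = "
                        + System.getProperty("java.class.path"));
            }
        }
    }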
Cheers,
Dan