On 23 Feb 2016, at 08:22, Arunkumar Pillai <arunkumar1...@gmail.com> wrote:

        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
        ... 33 more
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative()V
        at org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative(Native Method)
        at org.apache.hadoop.security.JniBasedUnixGroupsMapping.<clinit>(JniBasedUnixGroupsMapping.java:49)
        at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.<init>(JniBasedUnixGroupsMappingWithFallback.java:39)

That looks suspiciously like a mismatch between the libhadoop.so on your path 
and the Hadoop version you're running. Is there a Hadoop 2.6 installation on the same system?
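
A quick way to see which native library the Hadoop client actually loads is 
the checknative command (run it with the same environment your Spark job gets):

  hadoop checknative -a

It reports whether libhadoop loaded and the path it came from, which you can 
compare against your Hadoop 2.7 install.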

You could try to skip that bit of JNI code by switching to the shell-based 
group mapping in spark-defaults.conf. Note the doubled "hadoop": Spark strips 
the spark.hadoop. prefix and passes the rest through to the Hadoop 
configuration, and the Hadoop property is hadoop.security.group.mapping:

spark.hadoop.hadoop.security.group.mapping org.apache.hadoop.security.ShellBasedUnixGroupsMapping
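
Equivalently, for a one-off test you can pass it on the spark-submit command 
line instead of editing spark-defaults.conf:

  spark-submit \
    --conf spark.hadoop.hadoop.security.group.mapping=org.apache.hadoop.security.ShellBasedUnixGroupsMapping \
    ...

(the trailing ... stands in for your usual class, jar and arguments)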

...but that will just postpone the problem.

Best to find all copies of libhadoop.so on your filesystem and make sure the 
one that gets loaded (via java.library.path / LD_LIBRARY_PATH) is the Hadoop 
2.7 one.
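
A minimal sketch of that hunt, assuming a typical Hadoop 2.x layout (adjust 
the paths for your install):

  # every copy of the native library on the box
  find / -name 'libhadoop.so*' 2>/dev/null

  # the directories the JVM searches for native code
  echo $LD_LIBRARY_PATH
  ls $HADOOP_HOME/lib/native

Whichever of those directories wins on the load path should hold the 2.7 
build of libhadoop.so.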
