I created an EC2 cluster using the spark-ec2 script. If I run the pi.py example on
the cluster without the example jar, it works. But if I add the example jar to the
driver class path (something like the following), it fails with an exception.
Could anyone help with this? -- what is the cause of the problem?
The compilation looked fine to me, but since I don't have much Java
experience, I can't figure out why adding the jar to the driver class path causes
a conflict. Thanks!

./spark-submit --driver-class-path
/root/workspace/test/spark-examples-1.1.0-SNAPSHOT-hadoop2.3.0.jar
/root/workspace/test/pi.py 
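For reference, here are the variants I have been comparing, as I understand the Spark 1.1 spark-submit flags (--driver-class-path only prepends to the driver's classpath, while --jars also distributes the jar; whether --jars actually avoids the conflict here is just my guess):

```shell
# Works: plain pi.py, no extra jar on the driver classpath
./spark-submit /root/workspace/test/pi.py

# Fails: the assembly jar (built against Hadoop 2.3.0) is prepended to the
# driver classpath, which I suspect shadows the cluster's own Hadoop classes
./spark-submit \
  --driver-class-path /root/workspace/test/spark-examples-1.1.0-SNAPSHOT-hadoop2.3.0.jar \
  /root/workspace/test/pi.py

# Guess at a safer alternative: --jars adds the jar without putting it
# ahead of the classes Spark already ships with
./spark-submit \
  --jars /root/workspace/test/spark-examples-1.1.0-SNAPSHOT-hadoop2.3.0.jar \
  /root/workspace/test/pi.py
```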

14/10/20 20:37:28 INFO spark.HttpServer: Starting HTTP Server
14/10/20 20:37:28 INFO server.Server: jetty-8.y.z-SNAPSHOT
14/10/20 20:37:28 INFO server.AbstractConnector: Started
SocketConnector@0.0.0.0:46426
14/10/20 20:37:28 INFO server.Server: jetty-8.y.z-SNAPSHOT
14/10/20 20:37:28 INFO server.AbstractConnector: Started
SelectChannelConnector@0.0.0.0:4040
14/10/20 20:37:28 INFO ui.SparkUI: Started SparkUI at
http://********************:4040
Traceback (most recent call last):
  File "/root/workspace/test/pi.py", line 7, in <module>
    sc = SparkContext(conf=conf)
  File "/root/spark/python/pyspark/context.py", line 134, in __init__
    self._jsc = self._initialize_context(self._conf._jconf)
  File "/root/spark/python/pyspark/context.py", line 178, in
_initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/root/spark/python/lib/py4j-0.8.1-src.zip/py4j/java_gateway.py",
line 669, in __call__
  File "/root/spark/python/lib/py4j-0.8.1-src.zip/py4j/protocol.py", line
300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling
None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.ExceptionInInitializerError
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:228)
        at
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
        at py4j.Gateway.invoke(Gateway.java:214)
        at
py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
        at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
        at py4j.GatewayConnection.run(GatewayConnection.java:207)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException:
java.lang.reflect.InvocationTargetException
        at
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
        at org.apache.hadoop.security.Groups.<init>(Groups.java:55)
        at
org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:182)
        at
org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:235)
        at
org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:249)
        at 
org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:36)
        at
org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:109)
        at 
org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
        ... 13 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
        ... 20 more
Caused by: java.lang.UnsatisfiedLinkError:
org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative()V
        at 
org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative(Native
Method)
        at
org.apache.hadoop.security.JniBasedUnixGroupsMapping.<clinit>(JniBasedUnixGroupsMapping.java:49)
        at
org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.<init>(JniBasedUnixGroupsMappingWithFallback.java:38)
        ... 25 more

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/example-jar-caused-exception-when-running-pi-py-spark-1-1-tp16849.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
