I'm new to Java, so please be specific about how to resolve this. The command I was running is:
$ ./spark-submit --driver-class-path /home/cloudera/Downloads/spark-1.1.0-bin-hadoop2.3/lib/spark-examples-1.1.0-hadoop2.3.0.jar /home/cloudera/Downloads/spark-1.1.0-bin-hadoop2.3/examples/src/main/python/hbase_inputformat.py quickstart.cloudera data1

It fails with the following output:

14/09/12 14:12:07 WARN spark.SparkConf: Setting 'spark.executor.extraClassPath' to ':/usr/lib/hbase/hbase-protocol-0.98.1-cdh5.1.0.jar:/usr/lib/hbase/hbase-protocol-0.98.1-cdh5.1.0.jar' as a work-around.
Traceback (most recent call last):
  File "/home/cloudera/Downloads/spark-1.1.0-bin-hadoop2.3/examples/src/main/python/hbase_inputformat.py", line 61, in <module>
    sc = SparkContext(appName="HBaseInputFormat")
  File "/home/cloudera/Downloads/spark-1.1.0-bin-hadoop2.3/python/pyspark/context.py", line 107, in __init__
    conf)
  File "/home/cloudera/Downloads/spark-1.1.0-bin-hadoop2.3/python/pyspark/context.py", line 155, in _do_init
    self._jsc = self._initialize_context(self._conf._jconf)
  File "/home/cloudera/Downloads/spark-1.1.0-bin-hadoop2.3/python/pyspark/context.py", line 201, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/usr/lib/python2.6/site-packages/py4j-0.8.2.1-py2.6.egg/py4j/java_gateway.py", line 701, in __call__
    self._fqn)
  File "/usr/lib/python2.6/site-packages/py4j-0.8.2.1-py2.6.egg/py4j/protocol.py", line 300, in get_return_value
    format(target_id, '.', name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
	at org.apache.spark.SparkConf$$anonfun$validateSettings$5$$anonfun$apply$6.apply(SparkConf.scala:300)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$5$$anonfun$apply$6.apply(SparkConf.scala:298)
	at scala.collection.immutable.List.foreach(List.scala:318)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$5.apply(SparkConf.scala:298)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$5.apply(SparkConf.scala:286)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:286)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:158)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
	at py4j.Gateway.invoke(Gateway.java:214)
	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
	at py4j.GatewayConnection.run(GatewayConnection.java:207)
	at java.lang.Thread.run(Thread.java:745)
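From the message "Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former." my guess is that spark-submit turns my --driver-class-path flag into the spark.driver.extraClassPath property, while SPARK_CLASSPATH is also being exported somewhere in the CDH quickstart environment (spark-env.sh, I assume), and Spark 1.1 refuses to accept both at once. Just to check that I understand which property it wants me to use, here is a rough, untested sketch of how I think the same jar would be supplied through SparkConf directly (the path is just the one from my command above):

    from pyspark import SparkConf, SparkContext

    # Rough sketch only: supply the examples jar via the property named in
    # the error (spark.driver.extraClassPath) instead of SPARK_CLASSPATH.
    conf = (SparkConf()
            .setAppName("HBaseInputFormat")
            .set("spark.driver.extraClassPath",
                 "/home/cloudera/Downloads/spark-1.1.0-bin-hadoop2.3/lib/"
                 "spark-examples-1.1.0-hadoop2.3.0.jar"))
    sc = SparkContext(conf=conf)

Is that the right direction, i.e. should I just unset SPARK_CLASSPATH (or remove it from wherever spark-env.sh sets it) and keep only --driver-class-path, or does the hbase-protocol jar mentioned in the warning need to be handled differently?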