Hi,

---------------------------------------------

// inside createContext(checkPointDir, outputDirectory):
JavaStreamingContext ssc = new JavaStreamingContext(conf, new Duration(10000));
ssc.checkpoint(checkPointDir);

// in main():
JavaStreamingContextFactory factory = new JavaStreamingContextFactory() {
    public JavaStreamingContext create() {
        return createContext(checkPointDir, outputDirectory);
    }
};
JavaStreamingContext ssc = JavaStreamingContext.getOrCreate(checkPointDir, factory);

----------------------------------------------------
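For reference, here is the full shape of what I am trying to do, with all context setup (including the checkpoint call) inside createContext, so that getOrCreate can rebuild everything from the checkpoint on restart. This is only a sketch: the conf settings, class name, and the path strings are simplified stand-ins for my job.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.api.java.JavaStreamingContextFactory;

public class Pick {

    // Builds a fresh context; only called when no checkpoint exists yet.
    private static JavaStreamingContext createContext(String checkPointDir,
                                                      String outputDirectory) {
        SparkConf conf = new SparkConf().setAppName("Pick");
        JavaStreamingContext ssc = new JavaStreamingContext(conf, new Duration(10000));
        ssc.checkpoint(checkPointDir);
        // ... DStream setup writing to outputDirectory goes here ...
        return ssc;
    }

    public static void main(String[] args) {
        final String checkPointDir = "...";    // placeholder for my checkpoint path
        final String outputDirectory = "...";  // placeholder for my output path

        JavaStreamingContextFactory factory = new JavaStreamingContextFactory() {
            public JavaStreamingContext create() {
                return createContext(checkPointDir, outputDirectory);
            }
        };

        // Recovers the context from the checkpoint if one exists,
        // otherwise calls the factory to build a new one.
        JavaStreamingContext ssc = JavaStreamingContext.getOrCreate(checkPointDir, factory);
        ssc.start();
        ssc.awaitTermination();
    }
}
```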

*The first time I run this, it works fine.*

*The second time, it fails with the error below.*
*If I delete the checkpoint path first, it works again.*

---------------------------------------------------
[user@h7 ~]$ spark-submit --jars /home/user/examples-spark-jar.jar --conf spark.driver.allowMultipleContexts=true --class com.spark.Pick --master yarn-client --num-executors 10 --executor-cores 1 SNAPSHOT.jar
Spark assembly has been built with Hive, including Datanucleus jars on classpath
2015-06-26 12:43:42,981 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-06-26 12:43:44,246 WARN  [main] shortcircuit.DomainSocketFactory (DomainSocketFactory.java:<init>(116)) - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.

This is deprecated in Spark 1.0+.

Please instead use:
 - ./spark-submit with --driver-class-path to augment the driver classpath
 - spark.executor.extraClassPath to augment the executor classpath

Exception in thread "main" org.apache.spark.SparkException: Found both spark.executor.extraClassPath and SPARK_CLASSPATH. Use only the former.
    at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$7.apply(SparkConf.scala:334)
    at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$7.apply(SparkConf.scala:332)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:332)
    at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:320)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:320)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:178)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:118)
    at org.apache.spark.streaming.StreamingContext$$anonfun$getOrCreate$1.apply(StreamingContext.scala:561)
    at org.apache.spark.streaming.StreamingContext$$anonfun$getOrCreate$1.apply(StreamingContext.scala:561)
    at scala.Option.map(Option.scala:145)
    at org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:561)
    at org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:566)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
    at com.orzota.kafka.kafka.TotalPicsWithScore.main(TotalPicsWithScore.java:159)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:360)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:76)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
[user@h7 ~]

----------------------------------------------
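The exception itself says both SPARK_CLASSPATH and spark.executor.extraClassPath are set, and the deprecation notice above it suggests using spark.executor.extraClassPath instead. My guess is that the checkpoint saves the conf from the first run, so on restart the recovered conf already carries spark.executor.extraClassPath while the environment still exports SPARK_CLASSPATH, and validation fails. This is a sketch of the submit command I plan to try, following that advice (the jar path is just my local one):

```shell
# Drop the deprecated env var and pass the classpath explicitly instead.
unset SPARK_CLASSPATH
spark-submit \
  --jars /home/user/examples-spark-jar.jar \
  --conf spark.executor.extraClassPath=/home/user/examples-spark-jar.jar \
  --conf spark.driver.allowMultipleContexts=true \
  --class com.spark.Pick \
  --master yarn-client \
  --num-executors 10 --executor-cores 1 \
  SNAPSHOT.jar
```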

*Can anyone help me with this?*


*Thanks*
