Re: spark-shell failing but pyspark works

2016-04-04 Thread Cyril Scetbon
The only way I've found to make it work for now is by using the current Spark context and changing its configuration through spark-shell options, which is really different from pyspark, where you can't instantiate a new one, initialize it, etc. > On Apr 4, 2016, at 18:16, Cyril Scetbon wrote: […]
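A minimal sketch of the workaround described above, assuming a Spark 1.x spark-shell running in yarn-client mode; the --conf key and the batch interval below are illustrative, not taken from the thread:

    // Pass configuration on the spark-shell command line instead of building a new
    // SparkConf inside the shell, e.g.:
    //   spark-shell --master yarn-client --conf spark.streaming.stopGracefullyOnShutdown=true
    // Then reuse the shell's existing SparkContext (sc), so no second context is created.
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val ssc = new StreamingContext(sc, Seconds(10))  // built on the existing sc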

Re: spark-shell failing but pyspark works

2016-04-04 Thread Cyril Scetbon
It doesn't, as you can see: http://pastebin.com/nKcMCtGb. I don't need to set the master, as I'm using YARN and I'm on one of the YARN nodes. When I instantiate the Spark Streaming Context with the Spark conf, it tries to create a new Spark Context, but even with .set("spark.driver.allowMultipleContexts", "true") …
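The pastebin contents aren't reproduced here; as a hedged reconstruction of the failing path, building the StreamingContext from a fresh SparkConf inside spark-shell creates a second SparkContext alongside the shell's sc, which allowMultipleContexts does not avoid in this situation (the batch interval is illustrative):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf()
      .setAppName("StreamTest")
      .set("spark.driver.allowMultipleContexts", "true")  // tried, per the message above

    // This constructor builds its own SparkContext in addition to the shell's sc.
    val ssc = new StreamingContext(conf, Seconds(10))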

Re: spark-shell failing but pyspark works

2016-04-04 Thread Mich Talebzadeh
Hi Cyril, you can connect to the Spark shell from any node. The connection is made to the master through --master <IP address>, like below: spark-shell --master spark://50.140.197.217:7077. Now in the Scala code you can specify something like below: val sparkConf = new SparkConf(). setAppName( …
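A short sketch of the pattern described above; the master URL is the one quoted in the message and is environment-specific, and only the settings visible before the preview cuts off are shown:

    // From a shell on any node:
    //   spark-shell --master spark://50.140.197.217:7077
    // Inside the shell, Scala code along these lines sets the application name:
    import org.apache.spark.SparkConf

    val sparkConf = new SparkConf().setAppName("StreamTest")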

Re: spark-shell failing but pyspark works

2016-04-04 Thread Cyril Scetbon
I suppose it doesn't work using spark-shell either? If you can confirm, thanks. > On Apr 3, 2016, at 03:39, Mich Talebzadeh wrote: […]

Re: spark-shell failing but pyspark works

2016-04-03 Thread Mich Talebzadeh
This works fine for me: val sparkConf = new SparkConf(). setAppName("StreamTest"). setMaster("yarn-client"). set("spark.cores.max", "12"). set("spark.driver.allowMultipleContexts", "true"). set("spark.hadoop.validateOutputSpecs", "fal…
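The preview cuts off mid-setting; a hedged completion of the snippet, assuming the truncated value is "false" and that a streaming context is then built from the conf (the batch interval is illustrative):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val sparkConf = new SparkConf().
      setAppName("StreamTest").
      setMaster("yarn-client").
      set("spark.cores.max", "12").
      set("spark.driver.allowMultipleContexts", "true").
      set("spark.hadoop.validateOutputSpecs", "false")   // assumed completion of the truncated setting

    val ssc = new StreamingContext(sparkConf, Seconds(10))  // illustrative batch interval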

Re: spark-shell failing but pyspark works

2016-04-02 Thread Cyril Scetbon
Nobody has any idea? > On Mar 31, 2016, at 23:22, Cyril Scetbon wrote: […]

spark-shell failing but pyspark works

2016-03-31 Thread Cyril Scetbon
Hi, I'm having issues creating a StreamingContext with Scala using spark-shell. It tries to access the localhost interface, and the Application Master is not running on that interface: ERROR ApplicationMaster: Failed to connect to driver at localhost:47257, retrying ... I don't have the issu…
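Not from the thread, but a small hedged check that can be run in the same spark-shell session to see which address and port the driver actually advertised, since the ApplicationMaster error above points at localhost:

    // Inspect the driver endpoint the running shell is using.
    println(sc.getConf.getOption("spark.driver.host"))
    println(sc.getConf.getOption("spark.driver.port"))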