It doesn't work, as you can see: http://pastebin.com/nKcMCtGb

I don't need to set the master, as I'm using YARN and I'm on one of the YARN
nodes. When I instantiate the StreamingContext with that SparkConf, it
tries to create a new SparkContext, but even with
.set("spark.driver.allowMultipleContexts", "true") it doesn't work: it
complains at line 956 that the SparkContext created by spark-shell was not
initialized with allowMultipleContexts ...
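
The only workaround I can think of is to reuse the SparkContext that
spark-shell has already created instead of building a new one from a
SparkConf. A minimal sketch of what I mean (untested, and assuming the
shell's built-in sc variable is usable as-is):

import org.apache.spark.streaming.{Seconds, StreamingContext}

// spark-shell already exposes a SparkContext as `sc`; handing it to the
// StreamingContext constructor avoids creating a second context, so the
// allowMultipleContexts check never fires.
val ssc = new StreamingContext(sc, Seconds(3))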


> On Apr 4, 2016, at 16:29, Mich Talebzadeh <mich.talebza...@gmail.com> wrote:
> 
> Hi Cyril,
> 
> You can connect to the Spark shell from any node. The connection is made to 
> the master through the --master option and the master's address, like below:
> 
> spark-shell --master spark://50.140.197.217:7077
> 
> Now in the Scala code you can specify something like below:
> 
> val sparkConf = new SparkConf().
>              setAppName("StreamTest").
>              setMaster("local").
>              set("spark.cores.max", "2").
>              set("spark.driver.allowMultipleContexts", "true").
>              set("spark.hadoop.validateOutputSpecs", "false")
> 
> And that will work
> 
> Have you tried it?
> 
> HTH
> 
> Dr Mich Talebzadeh
>  
> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
> http://talebzadehmich.wordpress.com
> 
> On 4 April 2016 at 21:11, Cyril Scetbon <cyril.scet...@free.fr> wrote:
> I suppose it doesn't work using spark-shell either? Can you confirm?
> 
> Thanks
> 
>> On Apr 3, 2016, at 03:39, Mich Talebzadeh <mich.talebza...@gmail.com> wrote:
>> 
>> This works fine for me
>> 
>> val sparkConf = new SparkConf().
>>              setAppName("StreamTest").
>>              setMaster("yarn-client").
>>              set("spark.cores.max", "12").
>>              set("spark.driver.allowMultipleContexts", "true").
>>              set("spark.hadoop.validateOutputSpecs", "false")
>> 
>> Time: 1459669805000 ms
>> -------------------------------------------
>> -------------------------------------------
>> Time: 1459669860000 ms
>> -------------------------------------------
>> (Sun Apr 3 08:35:01 BST 2016  ======= Sending messages from rhes5)
>> 
>> 
>> 
>> 
>> Dr Mich Talebzadeh
>>  
>> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>> http://talebzadehmich.wordpress.com
>> 
>> On 3 April 2016 at 03:34, Cyril Scetbon <cyril.scet...@free.fr> wrote:
>> Nobody has any idea?
>> 
>> > On Mar 31, 2016, at 23:22, Cyril Scetbon <cyril.scet...@free.fr> wrote:
>> >
>> > Hi,
>> >
>> > I'm having issues creating a StreamingContext in Scala using 
>> > spark-shell. The driver ends up on the localhost interface, where the 
>> > Application Master cannot reach it:
>> >
>> > ERROR ApplicationMaster: Failed to connect to driver at localhost:47257, 
>> > retrying ...
>> >
>> > I don't have the issue with Python and pyspark, which work fine (you can 
>> > see it uses the IP address):
>> >
>> > ApplicationMaster: Driver now available: 192.168.10.100:43290
>> >
>> > I use similar code in both, though:
>> >
>> > test.scala:
>> > --------------
>> >
>> > import org.apache.spark._
>> > import org.apache.spark.streaming._
>> > val app = "test-scala"
>> > val conf = new SparkConf().setAppName(app).setMaster("yarn-client")
>> > val ssc = new StreamingContext(conf, Seconds(3))
>> >
>> > command used: spark-shell -i test.scala
>> >
>> > test.py:
>> > -----------
>> >
>> > from pyspark import SparkConf, SparkContext
>> > from pyspark.streaming import StreamingContext
>> > app = "test-python"
>> > conf = SparkConf().setAppName(app).setMaster("yarn-client")
>> > sc = SparkContext(conf=conf)
>> > ssc = StreamingContext(sc, 3)
>> >
>> > command used: pyspark test.py
>> >
>> > Any idea why Scala can't instantiate it? I thought Python was merely a 
>> > thin wrapper over the Scala implementation, but it seems there are 
>> > differences. Are there any parameters set by the Scala path but not the 
>> > Python one?
>> >
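>> > One idea I haven't tried yet (just an assumption on my part, if 
>> > spark.driver.host is honoured in yarn-client mode): force the driver to 
>> > advertise the node's real address instead of localhost, e.g.
>> >
>> > spark-shell --master yarn-client \
>> >   --conf spark.driver.host=192.168.10.100 \
>> >   -i test.scala
>> >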
>> > Thanks
>> > --
>> > Cyril SCETBON
>> >
>> >
>> 
>> 
>> 
> 
> 
