The only way I've found to make it work for now is to use the current Spark
context and change its configuration through spark-shell options, which is
really different from pyspark, where you can't instantiate a new one,
initialize it, etc.
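A minimal sketch of that workaround, with the configuration moved onto the
spark-shell command line and the shell's existing SparkContext (sc) reused; the
option value and the 2-second batch interval are only illustrative:

spark-shell --master yarn-client --conf spark.cores.max=12

// then, inside the shell, build the StreamingContext on top of the existing sc
// instead of letting it create a second SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}
val ssc = new StreamingContext(sc, Seconds(2))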
> On Apr 4, 2016, at 18:16, Cyril Scetbon wrote:
>
> It doesn't, as you can see: http://pastebin.com/nKcMCtGb
It doesn't, as you can see: http://pastebin.com/nKcMCtGb
I don't need to set the master as I'm using YARN and I'm on one of the YARN
nodes. When I instantiate the StreamingContext with the Spark conf, it tries
to create a new SparkContext, but even with
.set("spark.driver.allowMultipleContexts", "true") it still fails.
Hi Cyril,
You can connect to the Spark shell from any node. The connection is made to the
master through the --master address, like below:
spark-shell --master spark://50.140.197.217:7077
Now in the Scala code you can specify something like below:
val sparkConf = new SparkConf().
  setAppName("StreamTest")
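Once the shell is started that way, a quick way to check what it is actually
connected to (just a sketch relying on the shell's built-in sc):

// inside a spark-shell started with --master spark://50.140.197.217:7077
sc.master                  // should report spark://50.140.197.217:7077
sc.getConf.toDebugString   // lists the settings currently in effect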
I suppose it doesn't work using spark-shell either? If you can confirm, thanks.
> On Apr 3, 2016, at 03:39, Mich Talebzadeh wrote:
>
> This works fine for me
>
> val sparkConf = new SparkConf().
> setAppName("StreamTest").
> setMaster("yarn-client").
> set("sp
This works fine for me
val sparkConf = new SparkConf().
setAppName("StreamTest").
setMaster("yarn-client").
set("spark.cores.max", "12").
set("spark.driver.allowMultipleContexts", "true").
set("spark.hadoop.validateOutputSpecs", "fal
Nobody has any idea?
> On Mar 31, 2016, at 23:22, Cyril Scetbon wrote:
>
> Hi,
>
> I'm having issues creating a StreamingContext with Scala using spark-shell.
> It tries to access the localhost interface and the Application Master is not
> running on that interface:
>
> ERROR ApplicationMaster: Failed to connect to driver at localhost:47257, retrying ...
Hi,
I'm having issues creating a StreamingContext with Scala using spark-shell. It
tries to access the localhost interface and the Application Master is not
running on that interface:
ERROR ApplicationMaster: Failed to connect to driver at localhost:47257,
retrying ...
I don't have the issue ...
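For reference, the address the ApplicationMaster tries to reach comes from
spark.driver.host on the driver side, so one experiment is to pin it explicitly
when launching the shell; treating that setting as the culprit is only an
assumption, not something established in this thread:

spark-shell --master yarn-client --conf spark.driver.host=$(hostname -f)

With that, the ApplicationMaster should try the node's real hostname instead of
localhost.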