Thanks Ted for the input. I was able to get it working in the pyspark shell,
but the same job submitted via 'spark-submit' in client or cluster
deploy mode fails with these errors:
java.lang.OutOfMemoryError: Java heap space
at java.lang.Object.clone(Native Method)
at akka.util.CompactByte
Looks like the exception occurred on the driver.
Consider increasing the values of the following config options:
conf.set("spark.driver.memory", "10240m")
conf.set("spark.driver.maxResultSize", "2g")
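One caveat worth noting: `spark.driver.memory` generally cannot be raised from inside the application with `conf.set()`, because the driver JVM has already started by the time that code runs. A minimal sketch of passing the same settings at submit time instead (the script name `job.py` is a placeholder, not from the thread):

```shell
# Driver memory must be set at launch time; conf.set() inside the job is too late.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --driver-memory 10g \
  --conf spark.driver.maxResultSize=2g \
  job.py   # placeholder for your application
```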
Cheers
On Sat, Apr 9, 2016 at 9:02 PM, Buntu Dev wrote:
I'm running it via pyspark against yarn in client deploy mode. I do notice
in the spark web ui under Environment tab all the options I've set, so I'm
guessing these are accepted.
On Sat, Apr 9, 2016 at 5:52 PM, Jacek Laskowski wrote:
Hi,
(I haven't played with GraphFrames)
What's your `sc.master`? How do you run your application --
spark-submit or java -jar or sbt run or...? The reason I'm asking is
that a few options might not be in use at all, e.g.
spark.driver.memory and spark.executor.memory in local mode.
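One way to see which master is actually in effect (a sketch, assuming the pyspark shell is on the PATH) is to launch with the flags made explicit and then inspect `sc.master` from inside the shell:

```shell
# Launch with an explicit master and driver memory;
# inside the shell, print(sc.master) shows the effective master URL.
pyspark --master yarn --driver-memory 10g
```

The Environment tab in the web UI lists what the driver was configured with, but `sc.master` confirms which cluster manager the shell actually connected to.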
Regards,