Can you check the Spark UI -> Executors tab and Storage tab?
They will show us how many executors were launched and how much memory is
being used for caching.
> On Jul 14, 2016, at 9:49 AM, Jean Georges Perrin wrote:
I use it as a standalone cluster.
I run it through start-master, then start-slave. I only have one slave now, but
I will probably have a few soon.
The application is run on a separate box.
When everything was running on my Mac, I was in local mode, but I never set up
anything specifically for local mode.
Hi Jean,
How do you run your Spark application? Local mode or cluster mode?
If you run in local mode, did you use --driver-memory and --executor-memory?
In local mode, your executor and driver settings may not work the way you
expect.
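A minimal sketch of that point (the property names are the standard Spark ones; this is a configuration fragment, not a complete program):

```java
// In local mode the executor runs inside the driver JVM, so
// spark.executor.memory set programmatically is effectively ignored,
// and the driver heap is fixed once the JVM starts, so setting
// spark.driver.memory here comes too late. Pass memory at launch
// instead, e.g.:
//   spark-submit --driver-memory 8g --executor-memory 8g ...
SparkConf conf = new SparkConf()
        .setAppName("app")
        .setMaster("local[*]");
```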
> On Jul 14, 2016, at 8:43 AM, Jean Georges Perrin wrote:
Looks like replacing setExecutorEnv() with set() did the trick... let's see
how fast it'll process my 50 x 10^15 data points...
> On Jul 13, 2016, at 9:24 PM, Jean Georges Perrin wrote:
I have added:

SparkConf conf = new SparkConf()
    .setAppName("app")
    .setExecutorEnv("spark.executor.memory", "8g")
    .setMaster("spark://10.0.100.120:7077");

but it did not change a thing.
> On Jul 13, 2016, at 9:14 PM, Jean Georges Perrin wrote:
Hi,
I have a Java memory issue with Spark. The same application that works on my
8GB Mac crashes on my 72GB Ubuntu server...
I have changed things in the conf file, but it looks like Spark does not care,
so I wonder if my issue is with the driver or the executor.
I set:
spark.driver.memory