Re: Memory config issues

2015-01-19 Thread Sean Owen
On Mon, Jan 19, 2015 at 6:29 AM, Akhil Das wrote:
> It's the executor memory (spark.executor.memory) which you can set while
> creating the spark context. By default it uses 0.6% of the executor memory

It uses a fraction of 0.6, i.e. 60% of the executor memory, not 0.6%.
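For concreteness, a minimal sketch of setting both knobs when creating the SparkContext (Spark 1.x API; the 8g value is a placeholder, not from the thread):

    import org.apache.spark.{SparkConf, SparkContext}

    // Placeholder sizes for illustration only.
    val conf = new SparkConf()
      .setAppName("memory-config-example")
      .set("spark.executor.memory", "8g")          // total heap of each executor JVM
      .set("spark.storage.memoryFraction", "0.6")  // fraction (0.6 = 60%) of that heap
                                                   // reserved for cached RDDs
    val sc = new SparkContext(conf)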

Re: Memory config issues

2015-01-18 Thread Alessandro Baretta
Akhil, Ah, very good point. I guess "SET spark.sql.shuffle.partitions=1024" should do it. Alex

On Sun, Jan 18, 2015 at 10:29 PM, Akhil Das wrote:
> It's the executor memory (spark.executor.memory) which you can set while
> creating the spark context. By default it uses 0.6% of the executor memory
> for Storage.
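A minimal sketch of that SET statement in context (Spark 1.x SQLContext; the table name is hypothetical):

    import org.apache.spark.sql.SQLContext

    // Assumes an existing SparkContext `sc`.
    val sqlContext = new SQLContext(sc)

    // Raise the shuffle partition count (default 200) so each GROUP BY
    // partition holds less data and is less likely to OOM.
    sqlContext.sql("SET spark.sql.shuffle.partitions=1024")
    val grouped = sqlContext.sql("SELECT key, COUNT(*) FROM my_table GROUP BY key")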

Re: Memory config issues

2015-01-18 Thread Akhil Das
It's the executor memory (spark.executor.memory) which you can set while creating the spark context. By default it uses 0.6% of the executor memory for Storage. Now, to show some memory usage, you need to cache (persist) the RDD. Regarding the OOM Exception, you can increase the level of parallelism (spark.sql.shuffle.partitions).
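A short sketch of both suggestions, assuming an existing SparkContext `sc` and a hypothetical input path:

    import org.apache.spark.storage.StorageLevel

    // Persisting the RDD is what makes its memory usage visible in the
    // Storage tab of the Spark web UI.
    val data = sc.textFile("hdfs:///some/path").persist(StorageLevel.MEMORY_ONLY)
    data.count()  // run an action so the cache is actually materialized

    // More, smaller partitions = more parallelism and less data per task.
    val finer = data.repartition(1024)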

Memory config issues

2015-01-18 Thread Alessandro Baretta
All, I'm getting out-of-memory exceptions in Spark SQL GROUP BY queries. I have plenty of RAM, so I should be able to brute-force my way through, but I can't quite figure out which memory option affects which process. My current memory configuration is the following: export SPARK_WORKER_MEMORY=8397
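For reference, a sketch of which process each knob controls (the sizes are placeholders, not the poster's). SPARK_WORKER_MEMORY is a standalone-cluster setting in conf/spark-env.sh: it caps the total memory a worker daemon may hand out to executors, and is not itself a JVM heap size:

    import org.apache.spark.{SparkConf, SparkContext}

    // Placeholder sizes for illustration only.
    val conf = new SparkConf()
      .set("spark.executor.memory", "6g")  // heap of each executor JVM (where tasks run)
    // spark.driver.memory sizes the driver JVM, but must be set before that
    // JVM starts, e.g. via `spark-submit --driver-memory 2g`.
    val sc = new SparkContext(conf)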