Free memory while launching jobs.

2016-05-03 Thread mjordan79
200MB. Why this behaviour? Why is Spark using practically all the available RAM if I use only one worker with a 2.8GB limit in total?
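For context, in standalone mode a per-worker cap like the 2.8GB described above is typically set through spark-env.sh. A minimal sketch, assuming standalone mode (the values below are illustrative, not taken from the thread):

    # conf/spark-env.sh (standalone mode) -- illustrative values
    SPARK_WORKER_MEMORY=2800m   # upper bound on memory the worker may grant to executors
    SPARK_WORKER_CORES=8        # cores the worker may hand out

Note that SPARK_WORKER_MEMORY only caps what the worker can allocate to executors; the resident memory of the JVM processes (plus the OS page cache, which is counted against "free" memory) can make the machine appear to use far more than the configured limit.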

Free memory while launching jobs.

2016-05-03 Thread Renato Perini
I have a machine with 8GB of total memory, on which other applications are installed. The Spark application must run one driver and two jobs at a time. I have configured 8 cores in total. The machine (without Spark) has 4GB of free RAM (the other half is used by the other applications).
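A configuration that fits one driver and two concurrently running applications into roughly 4GB might look like the sketch below, assuming standalone mode and that the two jobs are separate applications (the property values are illustrative, not from the thread):

    # conf/spark-defaults.conf -- illustrative values, assuming standalone mode
    spark.driver.memory    1g   # heap for the driver JVM
    spark.executor.memory  1g   # heap for each executor JVM
    spark.cores.max        4    # per-application cap, so two apps share the 8 cores

Keeping the configured heaps well below the 4GB of free RAM leaves headroom for JVM overhead beyond the heap itself.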