Re: [EXTERNAL] [SPARK Memory management] Does Spark support setting limits/requests for driver/executor memory?

2022-12-08 Thread Shay Elbaz
From: Yosr Kchaou Sent: Wednesday, December 7, 2022 10:19 AM To: user@spark.apache.org Subject: [EXTERNAL] [SPARK Memory management] Does Spark support setting limits/requests for driver/executor memory? Hello, We are running Spark on Kuber

[SPARK Memory management] Does Spark support setting limits/requests for driver/executor memory?

2022-12-07 Thread Yosr Kchaou
Hello, We are running Spark on Kubernetes and noticed that driver/executors use the same value for memory request and memory limit. We see that limits/requests can be set only for CPU, using the following options: spark.kubernetes.{driver/executor}.limit.cores and spark.kubernetes.{driver/executor}
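
For context, a minimal Scala sketch of the knobs Spark does expose here: CPU gets separate request/limit settings, while the pod's memory request and limit are both derived from spark.executor.memory plus spark.executor.memoryOverhead. The values below are illustrative, not recommendations.

    import org.apache.spark.sql.SparkSession

    // Illustrative values. On Kubernetes, Spark sets the pod's memory request
    // and limit to the same amount (spark.executor.memory + overhead); only
    // CPU has distinct request/limit settings.
    val spark = SparkSession.builder()
      .appName("k8s-resources-sketch")
      .config("spark.executor.memory", "4g")                   // drives both memory request and limit
      .config("spark.executor.memoryOverhead", "1g")           // added on top of the heap
      .config("spark.kubernetes.executor.request.cores", "1")  // CPU request
      .config("spark.kubernetes.executor.limit.cores", "2")    // CPU limit
      .getOrCreate()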

Re: Spark Memory management

2015-05-22 Thread ??????
math.max(totalMb - 1024, 512) } ------ Original ------ From: "swaranga"; Date: Fri, May 22, 2015 03:31 PM To: "user"; Subject: Spark Memory management Experts, This is an academic question. Since Spark runs on the JVM, how is it able to do th
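
The fragment quoted above is the standalone Worker's default-memory heuristic: reserve roughly 1 GB for the OS, but never advertise less than 512 MB. As a self-contained Scala sketch:

    // Sketch of the fragment quoted above, from the standalone Worker's
    // default-memory inference: reserve ~1 GB for the OS, floor at 512 MB.
    def inferDefaultMemory(totalMb: Int): Int = math.max(totalMb - 1024, 512)

    // Example: a 16 GB box advertises 15360 MB; a 1 GB box still gets 512 MB.
    assert(inferDefaultMemory(16384) == 15360)
    assert(inferDefaultMemory(1024) == 512)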

Re: Spark Memory management

2015-05-22 Thread Akhil Das
nks for any inputs. > > > > -- > View this message in context: > http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Memory-management-tp22992.html > Sent from the Apache Spark User List mailing list archive at Nabble.com. > >

Spark Memory management

2015-05-22 Thread swaranga
memory? How accurate are these calculations? Thanks for any inputs. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Memory-management-tp22992.html Sent from the Apache Spark User List mailing list archive at Nabble.com
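
Part of the answer to the question quoted here: Spark estimates object sizes itself with org.apache.spark.util.SizeEstimator, walking the object graph reflectively rather than asking the JVM, which is also why the numbers are estimates. A minimal sketch (the sample data is arbitrary):

    import org.apache.spark.util.SizeEstimator

    // Walks the object graph reflectively and applies JVM layout rules,
    // so the result is an estimate rather than an exact measurement.
    val estimatedBytes: Long = SizeEstimator.estimate(Array.fill(1000)("some sample data"))
    println(s"approx. $estimatedBytes bytes in memory")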

Re: Spark memory management

2014-08-06 Thread Andrew Or
Hey Gary, The answer to both of your questions is that much of it is up to the application. For (1), the standalone master can set "spark.deploy.defaultCores" to limit the number of cores each application can grab. However, the application can override this with the application-specific "spark.c
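
For illustration, the override described above as it looks in application code: capping one application's cores on a standalone cluster via spark.cores.max (the value 8 is arbitrary).

    import org.apache.spark.{SparkConf, SparkContext}

    // Illustrative: cap this one application at 8 cores on a standalone
    // cluster, overriding the master's spark.deploy.defaultCores for this app.
    val conf = new SparkConf()
      .setAppName("cores-cap-sketch")
      .set("spark.cores.max", "8")
    val sc = new SparkContext(conf)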

Spark memory management

2014-08-06 Thread Gary Malouf
I have a few questions about managing Spark memory: 1) In a standalone setup, is there any CPU prioritization across users running jobs? If so, what is the behavior here? 2) With Spark 1.1, users will more easily be able to run drivers/shells from remote locations that do not cause firewall head