Hi all,

I cannot figure out why the following code is not setting the driver memory (it
does set the executor memory correctly):

    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
                .setMaster("yarn-client")
                .setAppName("test")
                .set("spark.driver.memory", "1G")
                .set("spark.executor.memory", "1G")
                .set("spark.executor.instances", "2")
                .set("spark.executor.cores", "4"))
    sc = SparkContext(conf=conf)

whereas if I launch the PySpark shell with:

    ./bin/pyspark --driver-memory 1G

the driver memory is set correctly. As far as I can tell, both approaches
produce the same launch command in the logs.
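
In case it helps, here is the rough check I have been using to see which value
actually takes effect (sc._conf and sc._jvm are private attributes, so I am
only relying on them for debugging, not as a stable API):

    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
                .setMaster("yarn-client")
                .setAppName("test")
                .set("spark.driver.memory", "1G"))
    sc = SparkContext(conf=conf)

    # Value recorded in the SparkConf:
    print(sc._conf.get("spark.driver.memory"))

    # Max heap the driver JVM actually got, read through the py4j gateway (bytes):
    print(sc._jvm.java.lang.Runtime.getRuntime().maxMemory())

With the SparkConf approach the reported heap stays at the default, while with
--driver-memory 1G it goes up as expected.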

thanks a lot,




