[ https://issues.apache.org/jira/browse/SPARK-9270?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14638325#comment-14638325 ]
Sean Owen commented on SPARK-9270:
----------------------------------

Yeah, there's probably an additional change to be made; just have a look at what was done in that PR and propose something consistent.

> spark.app.name is not honored by pyspark
> ----------------------------------------
>
>                  Key: SPARK-9270
>                  URL: https://issues.apache.org/jira/browse/SPARK-9270
>              Project: Spark
>           Issue Type: Bug
>           Components: PySpark
>     Affects Versions: 1.4.1, 1.5.0
>             Reporter: Cheolsoo Park
>             Priority: Minor
>
> Currently, the app name is hardcoded in pyspark as "PySparkShell", and the {{spark.app.name}} property is not honored.
> SPARK-8650 fixed this issue for spark-sql, but pyspark was not fixed.
> SPARK-9180 introduced a new {{--name}} option for spark-shell, but the {{spark.app.name}} property is still not honored in spark-shell.
> sparkR is different: {{SparkContext}} is not automatically constructed in sparkR, so the app name can be set when initializing {{SparkContext}}.
> In summary:
> ||shell||supports --conf spark.app.name||
> |pyspark|no|
> |spark-shell|no, but --name has the same effect|
> |spark-sql|yes|
> |sparkR|n/a|
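A minimal sketch of one possible fix, assuming the pyspark shell bootstrap currently constructs its {{SparkContext}} with a hardcoded {{appName="PySparkShell"}} (the exact file and code are assumptions, not taken from this ticket, and this is not the actual patch): treat the hardcoded name only as a fallback, so a user-supplied {{--conf spark.app.name=...}} takes precedence.

{code:python}
# Hypothetical shell-bootstrap sketch, not the actual Spark change.
from pyspark import SparkConf, SparkContext

# SparkConf() picks up spark.* properties passed through spark-submit / --conf.
conf = SparkConf()

# Only fall back to "PySparkShell" when the user did not set spark.app.name.
conf.setIfMissing("spark.app.name", "PySparkShell")

sc = SparkContext(conf=conf)
{code}

With a change along these lines, {{bin/pyspark --conf spark.app.name=myApp}} would name the application {{myApp}}, while a plain {{bin/pyspark}} would keep the default "PySparkShell".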