Cheolsoo Park created SPARK-9270:
------------------------------------

             Summary: spark.app.name is not honored by spark-shell and pyspark
                 Key: SPARK-9270
                 URL: https://issues.apache.org/jira/browse/SPARK-9270
             Project: Spark
          Issue Type: Bug
          Components: PySpark, Spark Shell
    Affects Versions: 1.4.1, 1.5.0
            Reporter: Cheolsoo Park
            Priority: Minor


Currently, the app name is hardcoded as "SparkShell" in spark-shell and as 
"PySparkShell" in pyspark, so the {{spark.app.name}} property is not honored.

But being able to set the app name is quite handy for various cluster 
operations, e.g. filtering jobs by app name on the YARN RM page.

SPARK-8650 fixed this issue for spark-sql, but not for spark-shell and 
pyspark. sparkR is different because the {{SparkContext}} is not constructed 
automatically there, so the app name can be set when initializing the 
{{SparkContext}}.

In summary-
||shell||support --conf spark.app.name||
|spark-shell|no|
|pyspark|no|
|spark-sql|yes|
|sparkR|n/a| 
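The following is a minimal sketch (not the actual Spark launcher code) of the precedence problem described above, and of a fix: the shell should treat its hardcoded name only as a default, applied when the user has not supplied {{spark.app.name}}. The function names and the plain-dict config are illustrative assumptions.

```python
# Sketch of the bug: the shell unconditionally overwrites the app name,
# so a user-supplied --conf spark.app.name=... is silently discarded.
def resolve_app_name_buggy(user_conf):
    conf = dict(user_conf)
    conf["spark.app.name"] = "PySparkShell"  # unconditional overwrite
    return conf["spark.app.name"]

# Sketch of the fix: use the hardcoded name only as a fallback default,
# preserving any spark.app.name the user already set.
def resolve_app_name_fixed(user_conf):
    conf = dict(user_conf)
    conf.setdefault("spark.app.name", "PySparkShell")  # default only
    return conf["spark.app.name"]
```

With the buggy behavior, `resolve_app_name_buggy({"spark.app.name": "X"})` still yields "PySparkShell"; the fixed version returns "X", and falls back to "PySparkShell" only when the user set nothing.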



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
