Hi, I am trying to configure a history server for my application.
When I run it locally (./run-example SparkPi), the event logs are
created and I can start the history server.
But when I run
./spark-submit --class org.apache.spark.examples.SparkPi --master yarn-cluster \
  file:///opt/hadoop/spark/examples/src/main/python/pi.py
I get:
15/08/01 18:18:50 INFO yarn.Client:
         client token: N/A
         diagnostics: N/A
         ApplicationMaster host: 192.168.56.192
         ApplicationMaster RPC port: 0
         queue: default
         start time: 1438445890676
         final status: SUCCEEDED
         tracking URL: http://sp-m1:8088/proxy/application_1438444529840_0009/A
         user: hadoop
15/08/01 18:18:50 INFO util.Utils: Shutdown hook called
15/08/01 18:18:50 INFO util.Utils: Deleting directory 
/tmp/spark-185f7b83-cb3b-4134-a10c-452366204f74
So it succeeded, but there are no event logs for this application.
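
Rereading the command, I notice it mixes --class (which selects a main
class in a jar) with a Python file; as far as I know spark-submit does
not use --class for Python applications. The two consistent forms would
be roughly the following (the examples jar path is a guess based on my
layout):

# Scala example: --class plus the examples assembly jar
./spark-submit --class org.apache.spark.examples.SparkPi --master yarn-cluster \
  /opt/hadoop/spark/lib/spark-examples-*.jar

# Python example: no --class, just the .py file
./spark-submit --master yarn-cluster \
  file:///opt/hadoop/spark/examples/src/main/python/pi.py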

Here are my configs:
spark-defaults.conf
spark.master            yarn-cluster
spark.eventLog.dir      /opt/spark/spark-events
spark.eventLog.enabled  true
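
One thing I suspect while writing this up: in yarn-cluster mode the
driver runs on an arbitrary cluster node, so with a plain local path the
event log ends up in /opt/spark/spark-events on whichever node hosted
the driver, not on the machine where I run the history server. A shared
HDFS directory would avoid that; a sketch of the variant I am
considering (the /spark-events path is my choice, untested):

# create a directory visible to every node and to the history server
hdfs dfs -mkdir -p /spark-events

# spark-defaults.conf
spark.eventLog.enabled  true
spark.eventLog.dir      hdfs:///spark-events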

spark-env.sh
export HADOOP_CONF_DIR="/opt/hadoop/etc/hadoop"
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
-Dspark.deploy.zookeeper.url=zk1:2181,zk2:2181"
export SPARK_HISTORY_OPTS="-Dspark.history.provider=org.apache.spark.deploy.history.FsHistoryProvider \
-Dspark.history.fs.logDirectory=file:/opt/spark/spark-events \
-Dspark.history.fs.cleaner.enabled=true"
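
For completeness, this is how I start and check the history server;
spark.history.fs.logDirectory has to point at the same location the
drivers write to via spark.eventLog.dir (with the HDFS variant above,
that would be hdfs:///spark-events in both places):

# start the history server; it picks up SPARK_HISTORY_OPTS from spark-env.sh
./sbin/start-history-server.sh

# web UI listens on port 18080 by default; completed applications show up
# once their event log files appear in the configured directory
curl http://localhost:18080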

Any ideas?

Thank you
