Thanks. Actually I've found a way. I'm using spark-submit to submit the
job to a YARN cluster with --master yarn-cluster (so the spark-submit
process is not the driver). That lets me set
"spark.yarn.submit.waitAppCompletion" to "false", so the process exits
as soon as the job is submitted.
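For the archives, a rough sketch of the invocation I ended up with (the
class and jar names are just placeholders for your own application):

    # cluster deploy mode: the driver runs inside YARN, not in spark-submit
    # class/jar below are placeholders for your own app
    spark-submit \
      --master yarn-cluster \
      --conf spark.yarn.submit.waitAppCompletion=false \
      --class com.example.MyStreamingApp \
      my-streaming-app.jar

With waitAppCompletion=false, spark-submit returns right after YARN
accepts the application instead of polling until it finishes.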
ayan
spark-submit is nothing but a process in your OS, so you should be able to
run it in the background and exit. However, your spark-submit process itself
is the driver for your Spark Streaming application, so it will not exit for
the lifetime of the streaming app.
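(A minimal sketch of the background-submission approach described above,
assuming yarn-client mode where spark-submit itself hosts the driver; the
class and jar names are placeholders:

    # run spark-submit detached so the shell can be closed;
    # the backgrounded process is still the driver and must keep running
    nohup spark-submit --master yarn-client \
      --class com.example.MyStreamingApp my-streaming-app.jar \
      > submit.log 2>&1 &
)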
On Wed, Jul 8, 2015 at 1:13 PM, Bin