Re: Spark processes not dying on killing corresponding YARN application

2014-09-09 Thread didata
…sparkAppPort/tcp or, better yet, use a port-deterministic strategy mentioned earlier. (Hopefully the verbosity here will help someone in their future search. Fedora aside, the original problem here can be network related, as I discovered.) Sincerely, didata
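
For anyone searching later, a port-deterministic setup (so firewall rules only need to open a known set of ports instead of Spark's random defaults) can be sketched roughly like this in PySpark. The property names are real Spark settings, but the app name and the specific port numbers below are placeholders chosen for illustration, not values from the thread:

    from pyspark import SparkConf, SparkContext

    # Pin the driver-side ports so firewall rules (e.g. firewalld on Fedora)
    # can allow a fixed set of ports rather than Spark's default random ports.
    conf = (SparkConf()
            .setAppName("port-deterministic-example")   # hypothetical app name
            .set("spark.driver.port", "40000")           # driver RPC port (placeholder)
            .set("spark.blockManager.port", "40010")     # block manager port (placeholder)
            .set("spark.ui.port", "4041"))               # web UI port (placeholder)

    sc = SparkContext(conf=conf)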

Re: Spark processes not dying on killing corresponding YARN application

2014-09-04 Thread didata
Thanks for asking this. I've had this issue with pyspark on YARN 100% of the time: I quit out of pyspark and, while my Unix shell prompt returns, 'yarn application -list' always shows (as does the UI) that the application is still running (or at least not totally dead). When I then log onto
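
One thing that often matters here (offered as a sketch, not necessarily the fix for this particular report): stop the SparkContext explicitly before leaving the pyspark shell, so the YARN application is released rather than left lingering:

    # Inside the pyspark shell, 'sc' is the SparkContext created at startup.
    sc.stop()   # shut down the driver side and release the YARN application
    exit()      # now quit the shell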

Spark processes not dying on killing corresponding YARN application

2014-09-04 Thread Hemanth Yamijala
Hi, I launched a Spark Streaming job under YARN using the default configuration for Spark, via spark-submit with the master as yarn-cluster. It launched an ApplicationMaster and 2 CoarseGrainedExecutorBackend processes. Everything ran fine; then I killed the application using yarn application -kil
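
For comparison, a minimal PySpark Streaming job that shuts itself down cleanly (stopping both the StreamingContext and the SparkContext) looks roughly like this. This is a sketch rather than the job from the report above, and the app name and socket source host/port are placeholders; stopping from inside the job is simply one way to have the ApplicationMaster and executor processes exit on their own instead of relying on an external yarn kill:

    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext

    sc = SparkContext(appName="streaming-shutdown-example")  # hypothetical name
    ssc = StreamingContext(sc, 5)  # 5-second batches

    # Placeholder source: a text stream read from a socket on localhost:9999.
    lines = ssc.socketTextStream("localhost", 9999)
    lines.pprint()

    ssc.start()
    try:
        ssc.awaitTermination()
    finally:
        # Stop the streaming job and the underlying SparkContext so the
        # ApplicationMaster and CoarseGrainedExecutorBackend processes exit
        # cleanly instead of having to be killed from YARN.
        ssc.stop(stopSparkContext=True, stopGraceFully=True)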