Re: Spark processes not dying on killing corresponding YARN application

2014-09-09 Thread didata
I figured out this issue (in our case)... and I'll vent a little in my reply here. =:) Fedora's well-intentioned firewall (firewall-cmd) requires you to open (enable) any ports/services on a host that you need to connect to (including SSH/22, which is enabled by default, of course). So when launch
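
In case it helps the next person, here's a minimal sketch of the kind of firewall-cmd invocations involved. The port numbers are the stock Spark/YARN defaults and are assumptions on my part; check your own spark-defaults.conf and yarn-site.xml:

    # Open the default Spark and YARN ports through firewalld (adjust to your config)
    sudo firewall-cmd --permanent --add-port=7077/tcp   # Spark standalone master (default)
    sudo firewall-cmd --permanent --add-port=4040/tcp   # Spark application UI (default)
    sudo firewall-cmd --permanent --add-port=8032/tcp   # YARN ResourceManager client port (default)
    sudo firewall-cmd --reload                          # apply the permanent rules

Note that Spark executors and drivers also pick ephemeral ports by default, so a blanket rule (or pinning those ports in your config) may be needed as well.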

Re: Spark processes not dying on killing corresponding YARN application

2014-09-04 Thread didata
Thanks for asking this. I've had this issue with pyspark on YARN 100% of the time: I quit out of pyspark and, while my Unix shell prompt returns, a 'yarn application -list' always shows (as does the UI) that the application is still running (or at least not totally dead). When I then log onto
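
For anyone else stuck with a lingering application, the standard YARN CLI cleanup I fall back on looks like this (the application ID below is hypothetical; use whatever 'yarn application -list' actually reports):

    yarn application -list                                  # still shows the app as RUNNING after quitting pyspark
    yarn application -kill application_1409870000000_0007   # hypothetical app ID taken from the list above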