Hi,
Read the doc on Spark Standalone at http://spark.apache.org/docs/latest/spark-standalone.html,
which seems to be the cluster manager the OP uses.
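If you don't have a standalone cluster up yet, the doc above covers setup; roughly (a sketch, with master-host as a placeholder for your actual master):

    ./sbin/start-master.sh
    ./sbin/start-slave.sh spark://master-host:7077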
Regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
Hi,
I am sorry, but it's still not clear. Do you mean ./bin/spark-shell --master
local? And what do I do after that? Killing the org.apache.spark.deploy.SparkSubmit
--master local --class org.apache.spark.repl.Main --name Spark shell spark-shell
process will kill the shell, so I couldn't send the commands.
Thanks
Hi,
Then use --master with Spark Standalone, YARN, or Mesos.
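For example (a sketch; the master URLs and hosts are placeholders for your own setup):

    ./bin/spark-shell --master spark://master-host:7077   # Spark Standalone
    ./bin/spark-shell --master yarn                       # YARN
    ./bin/spark-shell --master mesos://mesos-host:5050    # Mesos

With any of these, executors run in their own CoarseGrainedExecutorBackend JVMs, separate from the shell, so you can kill an executor while the shell keeps running.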
Regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
On Thu, Jul 7, 2016 at 10:35 PM, Mr rty ff wrote:
> I do
I don't think it's the proper way to recreate the bug, because I should continue
to send commands to the shell. They are talking about killing the
CoarseGrainedExecutorBackend.
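I guess that would mean something like this on a worker node (a sketch; the PID is illustrative):

    jps -lm | grep CoarseGrainedExecutorBackend
    12345 org.apache.spark.executor.CoarseGrainedExecutorBackend ...
    kill -9 12345

The driver and the shell stay up, so I could keep sending commands while the cluster manager replaces the killed executor.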
On Thursday, July 7, 2016 11:32 PM, Jacek Laskowski wrote:
Hi,
It appears you're running local mode (local[*] assumed) so killing
spark-shell *will* kill the one and only executor -- the driver :)
Regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
This is what I get when I run the command:

    946 sun.tools.jps.Jps -lm
    7443 org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name Spark shell spark-shell

I don't think I should kill the SparkSubmit process.
On Thursday, July 7, 2016 9:58 PM, Jacek Laskowski wrote:
Hi,
Use jps -lm to see the processes on the machine(s) and find the one to kill.
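For reference, jps ships with the JDK; -l prints the fully-qualified main class and -m the arguments passed to main. On a standalone worker you might see something like this (illustrative output; PIDs and arguments will differ):

    jps -lm
    4321 org.apache.spark.deploy.worker.Worker spark://master-host:7077
    4322 org.apache.spark.executor.CoarseGrainedExecutorBackend ...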
Regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
On Wed, Jul 6, 2016 at 9:49 PM, Mr rty ff wrote:
> H