Hi all,

Not sure if this is a config issue or if it's by design, but when I run the
Spark shell and then try to submit another application from elsewhere, the
second application waits for the first one to finish and keeps printing the following:

Initial job has not accepted any resources; check your cluster UI to ensure
that workers are registered and have sufficient memory. 

I have four workers, and each of them still has free resources available to
take on the new application.
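
In case it helps, this is roughly how I'm starting the two applications (the
master URL, class name, and jar path below are just placeholders, not my real
setup):

  # first application: the interactive shell, which stays running
  ./bin/spark-shell --master spark://<master-host>:7077

  # second application, submitted from another machine while the shell is up
  ./bin/spark-submit --master spark://<master-host>:7077 \
      --class com.example.MyApp /path/to/myapp.jar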

thanks,


