Andrew,
Yes, this works after cleaning up the .staging directory as you suggested.
Thanks a lot!
Jian
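For readers hitting the same "jar changed on src filesystem" error, the cleanup step mentioned above can be sketched as below. The exact HDFS path is an assumption, not stated in the thread; on many Spark-on-YARN setups the staging data lives under the submitting user's HDFS home directory (often named `.sparkStaging`), so check your own cluster's configuration first.

```shell
# Hedged sketch: remove stale YARN staging data left behind by earlier
# spark-submit runs, then resubmit so a fresh assembly jar is uploaded.
# The path below is an assumption -- verify it on your cluster before deleting.
hdfs dfs -rm -r /user/$USER/.sparkStaging
```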
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/jar-changed-on-src-filesystem-tp10011p10216.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Hi Jian,
In yarn-cluster mode, spark-submit automatically uploads the assembly jar
to a distributed cache that all executor containers read from, so there is
no need to manually copy the assembly jar to each node (or to pass it
through --jars).
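As a concrete illustration of the point above, a yarn-cluster submission can look like the following. This is a hedged sketch: the class name, jar path, and argument are illustrative placeholders, not details taken from this thread.

```shell
# yarn-cluster mode: spark-submit ships the application jar (and the assembly
# jar built for the cluster) into YARN's distributed cache, so the executor
# containers fetch it themselves -- no manual copy to each node, and no
# --jars entry for the assembly. All names/paths below are placeholders.
spark-submit \
  --master yarn-cluster \
  --class org.apache.spark.examples.SparkPi \
  lib/spark-examples.jar 100
```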
It seems there are two versions of the same jar in your
They're all the same version. Actually, even without the "--jars" parameter I
got the same error. It looks like it needs to copy the assembly jar to the
staging area anyway in order to run the example jar.
Since you are running in yarn-cluster mode and you are supplying the Spark
assembly jar file, there is no need to install Spark on each node. Is it
possible the two Spark jars have different versions?
Chester
Sent from my iPad
On Jul 16, 2014, at 22:49, cmti95035 wrote:
> Hi,
>
> I need some hel