Hi Andres,
If you're using the EC2 scripts to start your standalone cluster, you can
use "~/spark-ec2/copy-dir --delete ~/spark" to sync your jars across the
cluster. Note that you will need to restart the Master and the Workers
afterwards, with "sbin/stop-all.sh" followed by "sbin/start-all.sh".
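For example, the full sequence on the master might look like this (a
sketch, assuming Spark is installed at ~/spark, which is the layout the
EC2 scripts create by default):

~/spark-ec2/copy-dir --delete ~/spark   # rsync ~/spark to every slave
~/spark/sbin/stop-all.sh                # stop the Master and the Workers
~/spark/sbin/start-all.sh               # start them again with the new jars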
Looks like a netty conflict there; most likely you have multiple versions
of the netty jars on your classpath (e.g. netty-3.6.6.Final.jar,
netty-3.2.2.Final.jar, netty-all-4.0.13.Final.jar). You only need 3.6.6, I
believe, so a quick fix would be to remove the rest of them.
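A quick way to spot the duplicates is to search the node's classpath
directories (a sketch, assuming Spark lives at ~/spark as on the EC2 AMIs):

find ~/spark -name 'netty*.jar'

If the extra copies are pulled in transitively by your build, you can also
exclude them there instead of deleting jars by hand. A minimal sbt sketch;
"com.example" % "some-library" is a placeholder for whichever dependency
actually drags them in:

libraryDependencies += ("com.example" % "some-library" % "1.0")
  .exclude("io.netty", "netty-all")      // drops the netty-all-4.0.x copy
  .exclude("org.jboss.netty", "netty")   // drops the old 3.2.x line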
Thanks
Best Regards
On Wed, Aug 6, 2014
Hi all,
My name is Andres and I'm starting to use Apache Spark.
I'm trying to submit my Spark jar to my cluster using this:
spark-submit --class "net.redborder.spark.RedBorderApplication" --master
spark://pablo02:7077 redborder-spark-selfcontained.jar
But when I did it, my worker died, and my dr