2 cents:
1. You should use an environment management tool such as Ansible, Puppet,
or Chef to handle this kind of use case (and a lot more, e.g. what if you
want to add more nodes or replace a bad node?)
2. There are options such as --py-files to provide a zip file
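A minimal sketch of the --py-files approach; the package and script names (mypkg, my_job.py) are illustrative, not from the original thread:

```shell
# Bundle pure-Python dependencies into a single zip
# (hypothetical package directories shown)
zip -r deps.zip mypkg/ anotherpkg/

# spark-submit ships the zip to the executors and puts it
# on the Python path of every worker
spark-submit --py-files deps.zip my_job.py
```

Note this only works for pure-Python code; packages with compiled C extensions still need to be installed on each node.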
On Tue, Jan 12, 2016 at 6:11 AM
When you run spark-submit in either client or cluster mode, you can use the
options --packages or --jars to automatically copy your packages to the
worker machines.
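For example, a sketch of both options; the coordinates and jar paths are illustrative (spark-csv was a common add-on for Spark 1.x):

```shell
# --packages resolves Maven coordinates and distributes
# the resolved jars to the workers
spark-submit --packages com.databricks:spark-csv_2.10:1.3.0 my_job.py

# --jars ships local jar files directly (comma-separated)
spark-submit --jars /path/to/lib1.jar,/path/to/lib2.jar my_job.py
```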
Thanks
On Monday, January 11, 2016 12:52 PM, Andy Davidson
wrote:
I use https://code.google.com/p/parallel-ssh/ to upgrade all my slaves
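A sketch of what that can look like with pssh; the hosts file, login user, and package name are assumptions for illustration:

```shell
# hosts.txt lists one slave hostname per line
# -l sets the remote login user, -i prints each host's output inline
pssh -h hosts.txt -l ec2-user -i 'sudo pip install pandas'
```

This runs the same install command on every slave in parallel, which is handy for one-off upgrades even if a config-management tool is the better long-term answer.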
From: "taotao.li"
Date: Sunday, January 10, 2016 at 9:50 PM
To: "user @spark"
Subject: pre-install 3-party Python package on spark cluster
> I have a spark cluster, from machine-1 to machine 100, and machine-1 acts as