Yes, we are standalone right now. Do you have any literature on why one would
want to consider Mesos or YARN for Spark deployments?

Sounds like I should try upgrading my project and seeing if everything
compiles without modification. Then I can connect to an existing 1.0.0
cluster and see what happens...

Thanks, Matei :)


On Tue, Aug 26, 2014 at 6:37 PM, Matei Zaharia <matei.zaha...@gmail.com>
wrote:

> Is this a standalone mode cluster? We don't currently make this guarantee,
> though it will likely work in 1.0.0 to 1.0.2. The problem though is that
> the standalone mode grabs the executors' version of Spark code from what's
> installed on the cluster, while your driver might be built against another
> version. On YARN and Mesos, you can more easily mix different versions of
> Spark, since each application ships its own Spark JAR (or references one
> from a URL), and this is used for both the driver and executors.
>
> Matei
>
> On August 26, 2014 at 6:10:57 PM, Victor Tso-Guillen (v...@paxata.com)
> wrote:
>
>  I wanted to make sure that there's full compatibility between minor
> releases. I have a project that has a dependency on spark-core so that it
> can be a driver program and that I can test locally. However, when
> connecting to a cluster you don't necessarily know what version you're
> connecting to. Is a 1.0.0 cluster binary compatible with a 1.0.2 driver
> program? Is a 1.0.0 driver program binary compatible with a 1.0.2 cluster?
>
>
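For reference, the per-application Spark JAR Matei mentions can be pointed at explicitly when submitting to YARN. A minimal sketch, assuming a Spark 1.x-era setup; the environment variable name and HDFS path below are taken from that era's YARN docs and should be checked against your actual Spark version:

```
# Point YARN executors at a specific Spark assembly (Spark 1.x-era convention).
# The path is an example; upload your own assembly JAR to HDFS first.
export SPARK_JAR=hdfs:///user/spark/share/lib/spark-assembly-1.0.2-hadoop2.2.0.jar
```

With this, the driver and executors both use the referenced assembly, which is what makes mixing Spark versions across applications easier on YARN than in standalone mode, where executors use whatever is installed on the cluster.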
