After I did a clean rebuild, it works now.
Thanks,
Qiuzhuang
On Sat, Oct 25, 2014 at 9:42 AM, Nan Zhu wrote:
In my experience, there are more issues than just the BlockManager when you try
to run a Spark application whose build version differs from your cluster's….
I once tried to make a JDBC server built from branch-jdbc-1.0 run against a
branch-1.0 cluster… no workaround exists… just had to repla
I updated git trunk and built on the two Linux machines, so I think they should
have the same version. I am going to do a forced clean build and then retry.
Thanks.
On Sat, Oct 25, 2014 at 9:23 AM, Josh Rosen wrote:
Are all processes (Master, Worker, Executors, Driver) running the same Spark
build? This error implies that you’re seeing protocol / binary
incompatibilities between your Spark driver and cluster.
Spark is API-compatible across the 1.x series, but we don't make binary
link-level compatibility
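To rule out a mixed-build deployment like the one described above, a quick check is to compare the version string reported by each host before retrying. The sketch below is illustrative, not from the thread: `check_versions` and the sample version strings are hypothetical, and in practice you would collect the real strings by running `$SPARK_HOME/bin/spark-submit --version` on each machine.

```shell
#!/bin/sh
# Hedged sketch: fail fast when the Spark builds on your hosts disagree.
# The version strings passed below are placeholder examples; gather the
# real ones from `$SPARK_HOME/bin/spark-submit --version` on each host.

# Counts the number of distinct version strings among its arguments.
check_versions() {
  printf '%s\n' "$@" | sort -u | wc -l
}

# Example usage with hypothetical values for three hosts.
if [ "$(check_versions "1.2.0-SNAPSHOT" "1.2.0-SNAPSHOT" "1.1.0")" -eq 1 ]; then
  echo "all hosts run the same build"
else
  echo "MISMATCH: align Spark builds before retrying"
fi
```

If the distinct count is greater than one, rebuild or redeploy so that the Master, Workers, and Driver all come from the same build, as suggested in the thread.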