Because this was a maintenance release, we should not have introduced any
binary backwards or forwards incompatibilities. Therefore, applications
that were written and compiled against 1.1.0 should still work against a
1.1.1 cluster, and vice versa.
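For example (just a sketch assuming an sbt build; the project name below is
made up), only the application-side dependency version needs to change, and
the standalone cluster can stay on 1.1.0 until it is convenient to restart it:

  // build.sbt (hypothetical application build)
  name := "my-spark-app"
  scalaVersion := "2.10.4"
  // Only this version changes when moving the library side from 1.1.0 to 1.1.1.
  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1" % "provided"

Marking spark-core as "provided" keeps the cluster's own Spark jars on the
runtime classpath when the job is submitted with spark-submit.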
On Wed, Dec 3, 2014 at 1:30 PM, Andrew Or wrote:
By the Spark server do you mean the standalone Master? It is best if they
are upgraded together because there have been changes to the Master in
1.1.1. Although it might "just work", it's highly recommended to restart
your cluster manager too.
2014-12-03 13:19 GMT-08:00 Romi Kuntsman:
About version compatibility and the upgrade path: can the Java application
dependencies and the Spark server be upgraded separately (i.e., will a 1.1.0
library work with a 1.1.1 server, and vice versa), or do they need to be
upgraded together?
Thanks!
*Romi Kuntsman*, *Big Data Engineer*
http://www.tot
Andrew and developers, thank you for the excellent release!
It fixed almost all of our issues. Now we are migrating to Spark from a zoo
of Python, Java, Hive, and Pig jobs.
Our Scala/Spark jobs often failed on 1.1. Spark 1.1.1 works like a Swiss
watch.