Thank you for your reply.
Which resource managers support rolling update? YARN?
Also, where can I find this information in the documentation?
The reason for this is that Spark RPC and the persisted states of HA mode
both use Java serialization to serialize internal classes, which don't
have any compatibility guarantee.
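
To illustrate the failure mode, here is a minimal, self-contained Scala
sketch. The "MasterState" class is hypothetical, not a real Spark class;
the point is that without an explicit serialVersionUID the JVM computes
one from the class structure, so any field change between Spark versions
makes readObject throw java.io.InvalidClassException.

import java.io._

// Hypothetical stand-in for a Spark-internal class; no explicit
// serialVersionUID is declared, so the JVM computes one from the
// class structure.
case class MasterState(appId: String)

object SerializationDemo {
  def main(args: Array[String]): Unit = {
    val bos = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(bos)
    out.writeObject(MasterState("app-20200131000000-0001"))
    out.close()
    // Round-trips fine within one version. If the class gained or
    // lost a field between writer and reader (as Spark internals do
    // between releases), readObject would throw
    // java.io.InvalidClassException instead.
    val in = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray))
    println(in.readObject())
    in.close()
  }
}

Within a single version this round-trips fine; the breakage only shows up
when bytes written by one Spark version are read by another.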
Best Regards,
Ryan
On Fri, Jan 31, 2020 at 9:08 AM Shixiong(Ryan) Zhu wrote:
> Unfortunately, Spark standalone mode doesn't support rolling update. All
> Spark components (master, worker, driver) must be updated to the same
> version. When using HA mode, the states persisted in ZooKeeper (or files
> if not using ZooKeeper) need to be cleaned because they are not
> compatible between versions.
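
For anyone needing to clean that persisted state before upgrading, here is
a hedged sketch using Apache Curator. The connect string "zk1:2181" is a
placeholder for your spark.deploy.zookeeper.url, and "/spark" is the
default spark.deploy.zookeeper.dir; adjust both to your setup.

import org.apache.curator.framework.CuratorFrameworkFactory
import org.apache.curator.retry.ExponentialBackoffRetry

object CleanSparkHaState {
  def main(args: Array[String]): Unit = {
    // "zk1:2181" is a placeholder; use your spark.deploy.zookeeper.url.
    val client = CuratorFrameworkFactory.newClient(
      "zk1:2181", new ExponentialBackoffRetry(1000, 3))
    client.start()
    client.blockUntilConnected()
    // "/spark" is the default spark.deploy.zookeeper.dir; adjust if you
    // overrode it. Recursively deletes the master's recovery state.
    if (client.checkExists().forPath("/spark") != null) {
      client.delete().deletingChildrenIfNeeded().forPath("/spark")
    }
    client.close()
  }
}

Depending on your ZooKeeper version, "rmr /spark" or "deleteall /spark" in
zkCli.sh does the same thing.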
Anyone?
This question is not regarding my application running on top of Spark.
The question is about the upgrade of Spark itself from 2.2 to 2.4.
I expected at least that Spark would recover gracefully from an upgrade
and restore its own persisted objects.
Any help would be much appreciated.
After digging deeper, we found that the apps/workers persisted inside
ZooKeeper are not deserializable, but the drivers are.
Because of this, the driver comes up (mysteriously).
The deserialization is failing on "RpcEndpointRef".
I think somebody should be able to point me to a solution now.
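
In case it helps reproduce, this is roughly the probe we used. A sketch
only: "zk1:2181" and the "/spark/master_status" path are assumptions (the
master appears to store its recovery state under
<spark.deploy.zookeeper.dir>/master_status by default), the new-version
Spark jars must be on the classpath, and plain ObjectInputStream only
approximates the master's JavaSerializer-based persistence engine.

import java.io.{ByteArrayInputStream, ObjectInputStream}
import scala.collection.JavaConverters._
import org.apache.curator.framework.CuratorFrameworkFactory
import org.apache.curator.retry.ExponentialBackoffRetry

object ProbeRecoveryState {
  def main(args: Array[String]): Unit = {
    val client = CuratorFrameworkFactory.newClient(
      "zk1:2181", new ExponentialBackoffRetry(1000, 3))
    client.start()
    client.blockUntilConnected()
    // Assumed default layout: <spark.deploy.zookeeper.dir>/master_status
    val dir = "/spark/master_status"
    for (child <- client.getChildren.forPath(dir).asScala) {
      val bytes = client.getData.forPath(s"$dir/$child")
      try {
        // Plain Java deserialization; needs the (new) Spark jars on the
        // classpath so the persisted classes can be resolved.
        val in = new ObjectInputStream(new ByteArrayInputStream(bytes))
        val obj = try in.readObject() finally in.close()
        println(s"$child -> OK: ${obj.getClass.getName}")
      } catch {
        case e: Exception => println(s"$child -> FAILED: $e")
      }
    }
    client.close()
  }
}

With this, the app_*/worker_* entries are the ones that fail for us, which
matches the "RpcEndpointRef" error above.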
A few details about the cluster:
- Current Version 2.2
- Resource manager: Spark standalone
- Modes: cluster + supervise
- HA setup: ZooKeeper
- Expected version after upgrade: 2.4.4
Note: Before and after the upgrade, everything works fine.
During the upgrade, I see a number of issues:
- Spark master