The reason for this is that Spark RPC and the persisted state of HA mode
both use Java serialization to serialize internal classes, which carry no
compatibility guarantee across versions.
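
To make the failure mode concrete, here is a minimal sketch (WorkerState is
a hypothetical stand-in, not an actual Spark class) of how Java serialization
ties the written bytes to the exact class definition:

    import java.io._

    // Hypothetical stand-in for one of Spark's internal classes (an RPC
    // message or a piece of HA recovery state). Because no serialVersionUID
    // is pinned, Java serialization derives one from the class shape; if the
    // class definition changes between releases (a field added, renamed, or
    // removed), the reading side throws java.io.InvalidClassException.
    case class WorkerState(host: String, port: Int)

    object JavaSerializationRoundTrip {
      def main(args: Array[String]): Unit = {
        val buf = new ByteArrayOutputStream()
        val out = new ObjectOutputStream(buf)
        out.writeObject(WorkerState("node-1", 7077))
        out.close()

        // This round trip succeeds only because writer and reader share the
        // exact same class file; across Spark versions that assumption breaks.
        val in = new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
        println(in.readObject().asInstanceOf[WorkerState])
      }
    }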

Best Regards,
Ryan


On Fri, Jan 31, 2020 at 9:08 AM Shixiong(Ryan) Zhu <shixi...@databricks.com>
wrote:

> Unfortunately, Spark standalone mode doesn't support rolling updates. All
> Spark components (master, worker, driver) must be updated to the same
> version. When using HA mode, the state persisted in ZooKeeper (or in files,
> if not using ZooKeeper) needs to be cleaned out because it is not compatible
> between versions.
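>
> For example (a sketch, not an official tool: it assumes the default
> spark.deploy.zookeeper.dir of /spark and a placeholder ZooKeeper address,
> so adjust both to your deployment), the state can be cleared with the
> plain ZooKeeper client:
>
>     import org.apache.zookeeper.{WatchedEvent, Watcher, ZooKeeper}
>     import scala.collection.JavaConverters._
>
>     // Deletes the Spark master's persisted recovery state so the upgraded
>     // version starts from a clean slate. Run while the masters are stopped.
>     object ClearSparkRecoveryState {
>       def deleteRecursive(zk: ZooKeeper, path: String): Unit = {
>         zk.getChildren(path, false).asScala
>           .foreach(c => deleteRecursive(zk, s"$path/$c"))
>         zk.delete(path, -1) // -1 matches any znode version
>       }
>
>       def main(args: Array[String]): Unit = {
>         // "zk-host:2181" stands in for your spark.deploy.zookeeper.url value.
>         val zk = new ZooKeeper("zk-host:2181", 30000, new Watcher {
>           override def process(event: WatchedEvent): Unit = ()
>         })
>         if (zk.exists("/spark", false) != null) deleteRecursive(zk, "/spark")
>         zk.close()
>       }
>     }
>
> For FILESYSTEM recovery mode, emptying the directory configured as
> spark.deploy.recoveryDirectory achieves the same effect.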
>
> Best Regards,
> Ryan
>
>
> On Wed, Jan 29, 2020 at 2:12 AM bsikander <behro...@gmail.com> wrote:
>
>> Anyone?
>> This question is not regarding my application running on top of Spark.
>> The question is about the upgrade of Spark itself from 2.2 to 2.4.
>>
>> I expected at least that Spark would recover from an upgrade gracefully
>> and restore its own persisted objects.
>>
>>
>>
>> --
>> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>>
