Resolving HIVE-16391 would mean Hive releasing a 1.2.x version that contains
the fixes from our Hive fork (correct me if I am mistaken).

To be honest, and as a personal opinion, that basically asks Hive to take
care of Spark's dependency.
Hive is moving ahead to 3.1.x, and no one would use a new 1.2.x release.
By analogy, Spark doesn't make 1.6.x releases anymore either.

Frankly, my impression is that this is our mistake to fix. Since the Spark
community is big enough, I think we should try to fix it ourselves first.
I am not saying upgrading is the only way through this, but I think we
should at least try it first and see where that leads.

Yes, upgrading on our side does sound riskier, but I think it's worth
checking and trying to see if it's possible.
Upgrading the dependency is a more standard approach than using the fork or
asking the Hive side to release another 1.2.x.
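
To make the comparison concrete, here is a rough sketch of what the switch
could look like in Spark's root pom.xml. The exact property names,
coordinates, and target version are my assumptions, not a proposal:

    <!-- today: pinned to the fork (coordinates assumed) -->
    <hive.group>org.spark-project.hive</hive.group>
    <hive.version>1.2.1.spark2</hive.version>

    <!-- after the upgrade: stock Apache Hive (version is a placeholder) -->
    <hive.group>org.apache.hive</hive.group>
    <hive.version>2.3.x</hive.version>

The point is that the fork only exists in the first form; in the second,
we track upstream directly and drop the custom artifacts.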

If we fail to upgrade for critical or unavoidable reasons, then yes, we
could look for an alternative, but that basically means staying on 1.2.x
for a long time at least (say, until Spark 4.0.0?).

I know this has become a sensitive topic, but to be completely honest with
myself, I think we should give it a try.
