Here's another thread to start considering, and I know it's been raised
before.
What version(s) of Hive should Spark 3 support?

If we at least know it won't include Hive 0.x, could we go ahead and remove
those tests from master? That might significantly reduce the run time and
flakiness.

It seems that maintaining even the Hive 1.x fork is untenable going
forward, right? Does that also imply this support almost certainly won't be
maintained in 3.0?

Per the note below, it seems like it might even be hard to support both
Hive 3 and Hadoop 2 at the same time?

And while we're at it, what are the pros and cons of supporting only Hadoop
3 in Spark 3? Is the difference in the client / HDFS APIs even that big? Or
what about focusing only on Hadoop 2.9.x support plus 3.x support?
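
To make the Hadoop question concrete, here's roughly what carrying both
lines means at build time. This is just a sketch; the profile names reflect
my reading of current master and may differ by branch, so treat them as
assumptions rather than gospel:

  # Build against the Hadoop 2.x line (long-standing profile):
  ./build/mvn -Phadoop-2.7 -Phive -Phive-thriftserver -DskipTests package

  # Build against the Hadoop 3.x line (newer profile on master):
  ./build/mvn -Phadoop-3.1 -Phive -Phive-thriftserver -DskipTests package

Supporting only Hadoop 3 would basically mean collapsing the first variant
(and its slice of the test matrix) out of the build.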

Lots of questions; for now I'm just interested in informal reactions, not a
binding decision.

On Thu, Oct 25, 2018 at 11:49 PM Dagang Wei <notificati...@github.com>
wrote:

> Do we really want to switch to Hive 2.3? From this page
> https://hive.apache.org/downloads.html, Hive 2.3 works with Hadoop 2.x
> (Hive 3.x works with Hadoop 3.x).
