I haven't seen anyone actively 'unwilling' -- I hope not. See the
discussion at https://issues.apache.org/jira/browse/SPARK-2420 where I
sketch what a downgrade would mean. I think it just hasn't gotten a
close look yet.

Contrary to what I thought earlier, the conflict does in fact cause
problems in theory, and you've shown it causes a problem in practice.
Not to mention it already causes issues for Hive-on-Spark.
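
For anyone hitting this in their own build in the meantime, one
workaround is to force a single Guava version across the whole
dependency tree. A minimal sketch, assuming an sbt build (hypothetical,
not an official fix; which version to pin depends on what your cluster
actually ships -- 11.0.1 matches Hadoop 2.2 / Hive 0.12):

  // build.sbt -- force every module to resolve one Guava version so
  // Hadoop/Hive (guava 11.0.1) and Spark (guava 14.0.1+) stop pulling
  // conflicting copies onto the classpath.
  dependencyOverrides += "com.google.guava" % "guava" % "11.0.1"

Shading Guava inside the assembly jar would be the more robust fix,
but that's a bigger change to the build.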

On Mon, Jul 21, 2014 at 6:27 PM, Andrew Lee <alee...@hotmail.com> wrote:
> Hive and Hadoop are using an older version of the guava libraries (11.0.1),
> whereas Spark Hive is using guava 14.0.1+.
> The community isn't willing to downgrade to 11.0.1, which is the current
> version for Hadoop 2.2 and Hive 0.12.
