Avoiding having to maintain a separate Hive fork was one of the initial goals of Spark SQL. (We had to do that once for Shark.) The org.spark-project.hive:hive-0.13.1a artifact only cleans up some third-party dependencies to avoid dependency hell in Spark; at the source level it is exactly the same as Hive 0.13.1.
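
(If you want to depend on those re-published artifacts directly, a rough sketch of the sbt coordinates looks like the following; the module names are assumed from Spark's own build, and the exact set you need may differ.)

// Sketch only: sbt dependencies on the re-published Hive 0.13.1a artifacts.
// Module names assumed from Spark's build; adjust to what you actually use.
libraryDependencies ++= Seq(
  "org.spark-project.hive" % "hive-metastore" % "0.13.1a",
  "org.spark-project.hive" % "hive-exec"      % "0.13.1a"
)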

On the other hand, we're planning to add a Hive metastore adapter layer to Spark SQL, so that in the future it can talk to any Hive metastore version greater than or equal to 0.13.1, while always sticking to the most recent Hive version internally to provide the latest Hive features. This will probably happen in Spark 1.4 or 1.5.
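
For context, the DDLTask failure quoted below comes from Hive 0.13.x's Parquet support rejecting TIMESTAMP columns. A minimal sketch that reproduces it through Spark 1.3's HiveContext (hypothetical table, column, and location names) would be:

// Sketch only (hypothetical names): a Hive Parquet table with a TIMESTAMP
// column, created through Spark 1.3's HiveContext. With the bundled Hive
// 0.13.1a this fails in DDLTask with "Parquet does not support timestamp",
// since TIMESTAMP in Parquet tables only landed in Hive 0.14 (HIVE-6384).
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setAppName("hive-parquet-timestamp"))
val hiveContext = new HiveContext(sc)

hiveContext.sql(
  """CREATE EXTERNAL TABLE events (
    |  id BIGINT,
    |  event_time TIMESTAMP
    |) STORED AS PARQUET
    |LOCATION '/path/to/parquet'""".stripMargin)

Until we can pick up a Hive version that includes the fix, a common workaround is to declare such columns as STRING or BIGINT in the Hive DDL.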

Cheng

On 4/3/15 7:59 PM, Rex Xiong wrote:
Hi,

I got this error when creating a Hive table from a Parquet file:
DDLTask: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.UnsupportedOperationException: Parquet does not support timestamp. See HIVE-6384

I checked HIVE-6384; it's fixed in 0.14.
The Hive in the Spark build is a customized version, 0.13.1a (GroupId: org.spark-project.hive). Is it possible to get its source code and apply the patch from HIVE-6384?

Thanks

