The reason for org.spark-project.hive is that Spark relies on hive-exec, but the Hive project does not publish that artifact on its own; it only publishes an uber jar that bundles all of its dependencies. Maybe that has been improved since. If so, you would need to point at the new hive-exec artifact and perhaps sort out its dependencies manually in your build.
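
For example, if the stock hive-exec 0.13.1 is now usable on its own, overriding the dependency in an SBT build might look roughly like this. This is only a sketch: the choice of SBT, and the particular exclusions below, are guesses at the likely conflicts rather than a tested configuration, and anything Hive shades directly into the uber jar can't be removed this way anyway.

    // Untested sketch: depend on stock Hive 0.13.1 instead of org.spark-project.hive.
    libraryDependencies += "org.apache.hive" % "hive-exec" % "0.13.1" excludeAll(
      // Let Spark supply its own Hadoop and protobuf versions; adjust as conflicts appear.
      ExclusionRule(organization = "org.apache.hadoop"),
      ExclusionRule(organization = "com.google.protobuf")
    )

You'd still want to verify at runtime that the metastore and serde classes resolve consistently.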
On Mon, Jul 28, 2014 at 4:01 PM, Ted Yu <yuzhih...@gmail.com> wrote:
> I found 0.13.1 artifacts in Maven:
> http://search.maven.org/#artifactdetails%7Corg.apache.hive%7Chive-metastore%7C0.13.1%7Cjar
>
> However, Spark uses a groupId of org.spark-project.hive, not org.apache.hive.
>
> Can someone tell me how it is supposed to work?
>
> Cheers
>
>
> On Mon, Jul 28, 2014 at 7:44 AM, Steve Nunez <snu...@hortonworks.com> wrote:
>
>> I saw a note earlier, perhaps on the user list, that at least one person is
>> using Hive 0.13. Anyone got a working build configuration for this version
>> of Hive?
>>
>> Regards,
>> - Steve