Hi all,
Sorry, the referenced link is not about a private/own branch of Hive. We are
using Hortonworks HDP 2.3 with the Hive packaged in HDP 2.3, and installed a
standalone Spark cluster (1.5.2).
But Hive on Spark still cannot run.
Could anyone help with this? Thanks a lot!
Regards,
Sai
On Wed, De
The referenced link seems to be w.r.t. Hive on Spark which is still in its
own branch of Hive.
FYI
On Tue, Dec 1, 2015 at 11:23 PM, 张炜 wrote:
Hello Ted and all,
We are using Hive 1.2.1 and Spark 1.5.1
I also noticed that there are other users reporting this problem.
http://apache-spark-user-list.1001560.n3.nabble.com/Issue-with-spark-on-hive-td25372.html#a25486
Thanks a lot for your help!
Regards,
Sai
On Wed, Dec 2, 2015 at 11:11 AM, Ted Yu wrote:
Can you tell us the versions of Spark and Hive you use?
Thanks
On Tue, Dec 1, 2015 at 7:08 PM, 张炜 wrote:
Dear all,
We have a requirement to update and delete records in Hive. These
operations are available in Hive now.
But when using HiveContext in Spark, it always raises a "not supported"
error.
Is there any way to support update/delete operations using Spark?
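For reference, here is a minimal sketch of what works on the Hive side (the table name and columns are illustrative). In Hive 1.2, UPDATE/DELETE require an ACID (transactional) table: stored as ORC, bucketed, and marked transactional, with the transaction manager enabled. These same statements are what fail when issued through Spark's HiveContext:

```sql
-- Session settings required for Hive ACID (a sketch; compactor settings
-- on the metastore side are also needed for a full setup):
SET hive.support.concurrency=true;
SET hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;

-- ACID tables in Hive 1.2 must be bucketed ORC tables marked transactional:
CREATE TABLE demo (id INT, val STRING)
CLUSTERED BY (id) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional'='true');

-- These run in the Hive CLI, but error out via Spark's HiveContext:
UPDATE demo SET val = 'x' WHERE id = 1;
DELETE FROM demo WHERE id = 1;
```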
Regards,
Sai