Re: Reply: Answers to recent questions on Hive on Spark

2015-11-27 Thread Xuefu Zhang
Hi Wenli, the Hive on Spark team believes that Hive on Spark is production ready. In fact, CDH already provides support for selected customers in 5.4, which is based on Hive 1.1.0. CDH will release Hive on Spark as GA in 5.7, which is coming soon. Thanks, Xuefu On Fri, Nov 27, 2015 at 7:28 PM, Wangwe…

Re: Answers to recent questions on Hive on Spark

2015-11-27 Thread Xuefu Zhang
Okay, I think I know what problem you have now. To run Hive on Spark, spark-assembly.jar is needed, and it is also recommended that you have a Spark installation (identified by spark.home) on the same host where HS2 is running. You only need spark-assembly.jar in HS2's /lib directory. Other than thos…
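The setup described above can be sketched as follows. The paths and the jar filename pattern are illustrative assumptions, not details from the original message; only spark-assembly.jar, spark.home, and the HS2 lib directory come from the thread.

```shell
# Sketch of the Hive on Spark setup described above.
# Paths below are assumptions for illustration.

# 1. Make spark-assembly.jar visible to HiveServer2 by placing it
#    in HS2's lib directory.
cp "$SPARK_HOME"/lib/spark-assembly-*.jar "$HIVE_HOME"/lib/

# 2. Point Hive at the Spark installation on the HS2 host
#    (in hive-site.xml, or per session from the Hive CLI/Beeline):
#       set spark.home=/path/to/spark;

# 3. Select Spark as the execution engine for the session:
#       set hive.execution.engine=spark;
```

The session-level `set` commands are convenient for testing; for a permanent setup, the same properties would normally go into hive-site.xml on the HS2 host.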

Reply: Answers to recent questions on Hive on Spark

2015-11-27 Thread Wangwenli
Hi Xuefu, thanks for the information. One simple question: is there any plan for when Hive on Spark can be used in a production environment? Regards, Wenli From: Xuefu Zhang [mailto:xzh...@cloudera.com] Sent: November 28, 2015 2:12 To: user@hive.apache.org; d...@hive.apache.org Subject: Answers to recent questio…

hive:SEVERE: org.apache.calcite.runtime.CalciteException: Failed to encode '%测试%' in character set 'ISO-8859-1'

2015-11-27 Thread San Luo
( select acm.mdistrict_id as district_id, 'd' as time_granularity, 20151127 as time_item, COUNT(pj.ProjectID) as project_count from PIAO_3_PROJECTINFO pj inner join PIAO_3_SITEINFO si on si.SiteID = pj.SiteID inner join area_city_management acm on acm.city_id = si.CityId where pj.SellEndTime >…
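The CalciteException in this thread's subject arises because Calcite's default character set is ISO-8859-1, which cannot encode the Chinese characters in the literal '%测试%' (测试 means "test"). A commonly suggested workaround is to override Calcite's default charset via its "saffron" system properties; the exact property names and values below are an assumption and should be verified against the Calcite version bundled with your Hive release.

```shell
# Hedged workaround sketch: pass JVM system properties so Calcite
# uses a Unicode charset instead of its ISO-8859-1 default.
# The property names are assumptions; verify them for your Calcite version.
export HADOOP_OPTS="$HADOOP_OPTS \
  -Dsaffron.default.charset=UTF-16LE \
  -Dsaffron.default.nationalcharset=UTF-16LE \
  -Dsaffron.default.collation.name=UTF-16LE\$en_US"
```

These options would need to be in the environment of the process that runs the query compiler (e.g. HiveServer2) before it starts, since Calcite reads them at class-initialization time.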

RE: Answers to recent questions on Hive on Spark

2015-11-27 Thread Mich Talebzadeh
Hi, Thanks for the heads-up and comments. It sounds like when it comes to using Spark as the execution engine for Hive, we are in no man's land, so to speak. I have opened questions in both the Hive and Spark user forums, without much luck, for the reasons that you alluded to. OK, just to clarify the pr…

Answers to recent questions on Hive on Spark

2015-11-27 Thread Xuefu Zhang
Hi there, There seems to be increasing interest in Hive on Spark from Hive users. I understand that a few questions and problems have been reported, and I can see some frustration at times. It's impossible for the Hive on Spark team to respond to every inquiry, even though we wish we could. Howeve…