Re: query through Hive JDBC causes Chinese characters to become unrecognizable

2013-12-06 Thread Szehon Ho
I took a closer look. I tried the new JDBC Driver (org.apache.hive.jdbc.HiveDriver) against Hive-Server2, and it displays Japanese characters properly without any special configurations. Can you take a look at HIVE-3245 for details, and see if that

Question about invocation of the terminate method in UDAF

2013-12-06 Thread Yongcheng Li
Hi, Just want to confirm my understanding of how the terminate method works in a UDAF. [1] The terminate method of a UDAF is invoked only once (never twice) for the group of data that needs to be aggregated. [2] When the terminate method is invoked, all the data of the group that needs to be aggregated hav
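The lifecycle behind this question can be illustrated with a plain-Java analogue of a UDAF evaluator. This is a sketch for illustration only, not Hive's actual `org.apache.hadoop.hive.ql.exec.UDAF` API: per-row accumulation happens in `iterate`, partial results from different tasks are combined in `merge`, and `terminate` runs once per group to emit the final value.

```java
import java.util.List;

// Simplified analogue of a Hive UDAF evaluator lifecycle (hypothetical
// class, not the real Hive API): iterate() consumes input rows,
// terminatePartial() hands off a map-side partial, merge() combines
// partials on the reduce side, and terminate() is invoked exactly once
// per group to produce the final aggregate.
class SumAggregator {
    private long partial = 0;

    void iterate(long value) { partial += value; }             // per input row
    long terminatePartial() { return partial; }                // end of map side
    void merge(long otherPartial) { partial += otherPartial; } // reduce side
    long terminate() { return partial; }                       // once per group
}

public class UdafLifecycleDemo {
    public static void main(String[] args) {
        // Two "mappers" each process part of one group.
        SumAggregator m1 = new SumAggregator();
        for (long v : List.of(1L, 2L, 3L)) m1.iterate(v);
        SumAggregator m2 = new SumAggregator();
        for (long v : List.of(4L, 5L)) m2.iterate(v);

        // The "reducer" merges the partials, then terminate() runs once.
        SumAggregator reducer = new SumAggregator();
        reducer.merge(m1.terminatePartial());
        reducer.merge(m2.terminatePartial());
        System.out.println(reducer.terminate()); // prints 15
    }
}
```

In this model, `terminate` is indeed called once per group after all rows and partials for that group have been folded in, which matches the understanding the question describes.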

MIN/MAX issue with timestamps and RCFILE/ORC tables

2013-12-06 Thread David Engel
Hi, Because of the known, and believed fixed, issue with MIN/MAX (HIVE-4931), we're using a recent (2013-12-02), locally built version of Hive 0.13.0-SNAPSHOT. Unfortunately, we're still seeing issues using MIN/MAX on timestamp types when using RCFILE and ORC formatted tables. I could not find a

Re: query through Hive JDBC causes Chinese characters to become unrecognizable

2013-12-06 Thread Szehon Ho
Looks like the issue is tracked in HIVE-3245. I think we need to support adding an encoding parameter as part of the JDBC URL, similar to MySQL JDBC's useUnicode/characterEncoding flags. I can take a look at it if nobody else has. For now, I think you can manually encode the result value from JDBC. T
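The "manually encode the result value" workaround mentioned here can be sketched in plain Java. This assumes the common mojibake scenario where UTF-8 bytes were decoded by the driver with a single-byte charset such as ISO-8859-1; it is a workaround sketch under that assumption, not a documented Hive JDBC feature.

```java
import java.nio.charset.StandardCharsets;

public class ReencodeDemo {
    // If the JDBC layer decoded UTF-8 bytes as ISO-8859-1 (a common cause
    // of "unrecognizable characters"), round-tripping the string through
    // the wrong charset recovers the original bytes, which can then be
    // decoded correctly as UTF-8.
    static String fixMojibake(String fromDriver) {
        byte[] raw = fromDriver.getBytes(StandardCharsets.ISO_8859_1);
        return new String(raw, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Simulate the mojibake: UTF-8 bytes of Chinese text misread
        // as ISO-8859-1, as a driver without encoding support might do.
        String garbled = new String("中文".getBytes(StandardCharsets.UTF_8),
                                    StandardCharsets.ISO_8859_1);
        System.out.println(fixMojibake(garbled)); // prints 中文
    }
}
```

Note this only helps when the corruption really is a charset mismatch of this exact shape; if the bytes were lossily converted (e.g. replaced with `?`), the original text cannot be recovered client-side.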

Why from_utc_timestamp works for some bigint, but not others

2013-12-06 Thread java8964
Hi, I am using Hive 0.9.0, and I'm not sure why from_utc_timestamp gives me an error for the following value but works for others. The following example shows 2 bigints as 2 epoch values at the milliseconds level. They are only 11 seconds apart. One works fine in Hive 0.9.0 with the from_utc_timestamp UD
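The specific bigints are truncated from the message, so the root cause can't be pinned down here, but one useful step is to check outside Hive which instant each epoch-milliseconds value actually represents. A minimal Java sketch (the sample values below are hypothetical, chosen only to mirror the "11 seconds apart" detail):

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class EpochCheck {
    // Interpret a bigint as epoch milliseconds and render it in UTC,
    // independently of Hive, to see which timestamp it represents.
    static String millisToUtc(long epochMillis) {
        return DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSS")
                .withZone(ZoneOffset.UTC)
                .format(Instant.ofEpochMilli(epochMillis));
    }

    public static void main(String[] args) {
        // Two hypothetical epoch-millisecond values 11 seconds apart,
        // standing in for the truncated values from the question.
        System.out.println(millisToUtc(1386300000000L));
        System.out.println(millisToUtc(1386300011000L));
    }
}
```

If both values render as sensible, nearby timestamps here but only one fails in Hive, that points at the UDF's handling of the bigint rather than at the data itself.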

Size of a Hive Map column in characters!

2013-12-06 Thread Sunderlin, Mark
The size(map) function is defined as follows: size(Map) — returns the number of elements in the map type. What if I want the total size of the map for that row? This doesn't work: select length(MAP); How can I get the total size of a map column in either bytes or characters? --- Mark E. Sunder
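The counting the question is after — total characters across all keys and values, rather than the element count that size() returns — can be illustrated in plain Java. This is an illustrative sketch assuming string keys and values, not a built-in Hive function.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class MapCharSize {
    // Total number of characters across all keys and values of a map:
    // the "size in characters" the question asks about, as opposed to
    // size(), which only counts the number of entries.
    static int totalChars(Map<String, String> m) {
        int total = 0;
        for (Map.Entry<String, String> e : m.entrySet()) {
            total += e.getKey().length() + e.getValue().length();
        }
        return total;
    }

    public static void main(String[] args) {
        Map<String, String> row = new LinkedHashMap<>();
        row.put("color", "red");   // 5 + 3 = 8 chars
        row.put("size", "large");  // 4 + 5 = 9 chars
        System.out.println(totalChars(row)); // prints 17
    }
}
```

For a byte count rather than a character count, each string's UTF-8 byte length (`s.getBytes(StandardCharsets.UTF_8).length`) would be summed instead, since multi-byte characters make the two measures differ.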