1) How should I compress the files to use LZO compression?
a) Write your own MapReduce code
b) Use Pig scripts
c) Create temp tables and load the data into a compression-backed table (see the sketch below)
2) How do I know whether the LZO compression utility (command?) is installed on
the Hadoop cluster?
Check the Hadoop conf files and c
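For (1)(c), a rough sketch of the compression-backed-table route, assuming the hadoop-lzo libraries are already installed; the table names are placeholders and property names vary slightly across Hadoop versions:

# write the existing table's rows back out as LZO-compressed files in a new table
hive -e "
  SET hive.exec.compress.output=true;
  SET mapred.output.compression.codec=com.hadoop.compression.lzo.LzopCodec;
  CREATE TABLE my_table_lzo LIKE my_table;
  INSERT OVERWRITE TABLE my_table_lzo SELECT * FROM my_table;
"
# note: .lzo files are only splittable after running the hadoop-lzo indexer over them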
Hi Juan
I was able to reproduce this issue with a different dataset. I posted a patch
for this bug here https://issues.apache.org/jira/browse/HIVE-5991. Can you use
this patch and see if it resolves the issue?
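In case it is useful, trying a JIRA patch usually amounts to downloading the .patch attachment into a Hive source checkout of the matching branch, applying it, and rebuilding; the -p level depends on how the patch was generated:

# from the root of a Hive source checkout on the branch the patch targets
patch -p0 < HIVE-5991.patch    # or: git apply HIVE-5991.patch
# then rebuild and swap the affected jars into your Hive installation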
Thanks
Prasanth Jayachandran
On Nov 27, 2013, at 11:01 AM, Prasanth Jayachandran wrote:
Yep, it looks like it is not a JDBC driver issue but HiveServer2 itself. Setting
the platform's LANG to UTF-8 might be a requirement for HiveServer2 in this
scenario, as I am not aware of any Hive-specific properties to control
character encodings. Not sure if anyone else has any insight?
Thanks,
Szehon
Hi,
I have a large set of text files. I have created a Hive table pointing to each
of these text files. I am looking to compress the files to save storage.
1) How should I compress the files to use LZO compression?
2) How do I know whether the LZO compression utility (command?) is installed on the
Hadoop cluster?
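For (2), a quick way to check from a shell on a cluster node (paths are illustrative and vary by distribution; hadoop-lzo ships separately from Hadoop because of its GPL license):

# is the lzop command-line utility present?
which lzop
# are the hadoop-lzo jar and its native library installed? (typical locations)
ls /usr/lib/hadoop/lib/hadoop-lzo*.jar /usr/lib/hadoop/lib/native/libgplcompression* 2>/dev/null
# are the LZO codecs registered with Hadoop?
grep -A 2 io.compression.codecs /etc/hadoop/conf/core-site.xml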
When is the serialize method of a Hive SerDe invoked?
I recently created a couple of Hive SerDes and wrote unit tests for the
serialize and deserialize methods, and I've been able to test the
deserialize method in a real Hive environment, but I can't figure out a
scenario where serialize is called.
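serialize is on the write path: Hive calls it when rows are written into a table whose row format uses the SerDe, e.g. via INSERT ... SELECT, while plain SELECTs only exercise deserialize. A minimal way to see it run, with placeholder class and table names:

# a table whose row format is handled by the custom SerDe
hive -e "CREATE TABLE serde_out (c1 STRING, c2 INT)
         ROW FORMAT SERDE 'com.example.MyCustomSerDe'
         STORED AS TEXTFILE;"
# writing rows into it forces Hive to call MyCustomSerDe.serialize() for each row
hive -e "INSERT OVERWRITE TABLE serde_out SELECT c1, c2 FROM some_existing_table;"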
I have just started to experiment with HBase in our cluster. I have a
secure HBase cluster set up and would like to create an external Hive table
around several of our HBase tables. This worked fine until we enabled secure
(Kerberos) client access to HBase. Now we cannot query any of our Hive
external tables.
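For context, the table mapping itself is the standard HBase storage-handler DDL (the names below are placeholders); with Kerberos enabled, the querying user also needs a valid ticket, and the secure hbase-site.xml (authentication set to kerberos plus the HBase principals) has to be visible on Hive's classpath:

# obtain a Kerberos ticket for the user running the query (principal is a placeholder)
kinit someuser@EXAMPLE.COM
hive -e "
  CREATE EXTERNAL TABLE hbase_events (rowkey STRING, payload STRING)
  STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
  WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:payload')
  TBLPROPERTIES ('hbase.table.name' = 'events');
"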
Hi, I added the following line to the /etc/init.d/hive-server2 script, and now
the characters display correctly, but I hope this can be handled without any
change to the Hive files:
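# force a UTF-8 locale so the HiveServer2 JVM starts with a UTF-8 default encoding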
export LANG="zh_CN.UTF-8"
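Maybe a less intrusive spot for the same export would be hive-env.sh (still a Hive-side config file, but one intended for environment overrides), since bin/hive sources it before the server JVM starts:

# in $HIVE_CONF_DIR/hive-env.sh (e.g. /etc/hive/conf/hive-env.sh)
export LANG="zh_CN.UTF-8"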
On Sat, Dec 7, 2013 at 1:07 PM, Szehon Ho wrote:
> I took a closer look. I tried the new JDBC Dri