Hi Roberto,
Yes, your JsonSerde was the one I got working last night. My next question is,
do you know of any way to have the Hive table schema statement (e.g. user_id
bigint, details struct<...>) generated or imported if I have the Thrift
definitions?
Since Thrift and Hive are both of Facebook origin, I figured there might be
some integration.
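Ideally it would work something like Hive's built-in Thrift SerDe, which infers the columns from a compiled Thrift class instead of an explicit column list. A sketch of what I mean, where the generated class name com.example.UserDetails is hypothetical:

```sql
-- Columns are derived by reflection from the Thrift-generated class,
-- so no column list is written out by hand. Class name is hypothetical;
-- the class must be on the Hive classpath.
CREATE TABLE user_details_thrift
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.thrift.ThriftDeserializer'
WITH SERDEPROPERTIES (
  'serialization.class' = 'com.example.UserDetails',
  'serialization.format' = 'org.apache.thrift.protocol.TBinaryProtocol')
STORED AS SEQUENCEFILE;
```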
I think the struct<...> syntax for a struct is correct (you can see in
ObjectInspectorUtils.getStandardStructTypeName() how it is built).
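For reference, that method joins the field names and types into the standard struct<name:type,...> form, so a DDL column using it looks like the following (the field names here are made up for illustration):

```sql
-- Illustrative only: struct fields follow the name:type pattern that
-- getStandardStructTypeName() produces, e.g. struct<post_id:bigint,body:string>
CREATE TABLE struct_demo (
  user_id BIGINT,
  details STRUCT<post_id:BIGINT, body:STRING>
);
```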
If you're using this serde:
http://code.google.com/p/hive-json-serde/source/browse/trunk/src/org/apache/hadoop/hive/contrib/serde2/JsonSerde.java,
it just doesn't support structs.
Hello,
Thank you for your answers, this solves the issue.
I have set mapred.max.split.size to 102400 in hive-site.xml, and jobs
are now using an appropriate number of mappers.
I have played a little with different configurations, and
CombineHiveInputFormat gives better performance than HiveInputFormat.
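For anyone else tuning this, the same settings can also be applied per session rather than in hive-site.xml; the split-size value below is just the one from my tests:

```sql
-- Select the combining input format and cap the split size (in bytes).
SET hive.input.format=org.apache.hadoop.hive.ql.io.CombineHiveInputFormat;
SET mapred.max.split.size=102400;
```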
Hi All,
I can't find any reasonable documentation about how to use SerDes.
I have a JSONSerde which I use to create an external table
create external table scratch.json_serde_test (user_id bigint, details
struct<...>) row format serde
'org.apache.hadoop.hive.contrib.serde2.JsonSerde'
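What I'd then like to do is query into the nested struct with dot notation, along these lines (field names made up, since I haven't settled on the struct definition):

```sql
-- Dot notation reaches into the struct column; field names are hypothetical.
SELECT user_id, details.post_id
FROM scratch.json_serde_test;
```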