Could any Hive/Parquet user or dev take a look at this?

Regards,
Mohammad

On Monday, October 5, 2015 3:41 PM, Mohammad Islam <misla...@yahoo.com> wrote:



Hi,
Do Parquet tables support automatic casting to wider data types? For example,
suppose I have a Parquet table where some of the data files store a field as
"int" while other files store the same field as "long".

The table schema declares that field as "bigint".
Can Hive read the files that were written with type "int"?
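
For illustration, here is a minimal sketch of one way to end up in this state
(the table and column names are made up, and the "int" files could just as
well have been produced outside Hive):

  -- Column starts out as int; these files are written with Parquet type int32.
  CREATE TABLE events (id INT) STORED AS PARQUET;
  INSERT INTO events VALUES (1), (2);

  -- The column is later widened in the metastore; new files use int64,
  -- but the earlier files still hold int32 values.
  ALTER TABLE events CHANGE COLUMN id id BIGINT;
  INSERT INTO events VALUES (3000000000L);

  -- This query now has to read the int32-backed files against a bigint schema.
  SELECT id FROM events;

Presumably the reader hands back an IntWritable for the old files while the
table's bigint schema expects a LongWritable, which matches the exception below.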

I got this exception:

Failed with exception java.io.IOException:org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.ClassCastException: org.apache.hadoop.io.IntWritable cannot be cast
to org.apache.hadoop.io.LongWritable

Regards,
Mohammad
