Hi,
We're currently experiencing an issue with queries against a table backed
by ORC. Nothing special about them - any query against that table causes it.
We're using HDP 2.2.4.x, so Hive 0.14.0.2.2.4.x.
The error we're seeing in the logs is:
Caused by: java.lang.RuntimeException: Error creating a batch
at
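If it helps narrow things down, one check (just a sketch on my part -
'orc_table' is a placeholder for the real table name, and the guess that the
vectorized ORC read path is involved is mine) is to rerun a trivial query
with vectorization disabled:

SET hive.vectorized.execution.enabled=false;
SELECT COUNT(*) FROM orc_table;

If that succeeds, the failure looks confined to the vectorized reader.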
Hi,
Expanding on Elliot's answer for a partitioned table, e.g.:
CREATE EXTERNAL TABLE orc_table (
)
PARTITIONED BY (col1 type, col2 type)
STORED AS ORC
LOCATION '/hdfs/folder/containing/orc/files';
ALTER TABLE orc_table ADD PARTITION (col1 = 'val1', col2 = 'val2') LOCATION
'/hdfs/folder/containing/
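A fully spelled-out version of the same pattern (the columns, partition
values and paths below are made up for illustration) would be:

CREATE EXTERNAL TABLE orc_table (
  id BIGINT,
  name STRING
)
PARTITIONED BY (col1 STRING, col2 STRING)
STORED AS ORC
LOCATION '/hdfs/folder/containing/orc/files';

ALTER TABLE orc_table ADD PARTITION (col1 = 'val1', col2 = 'val2')
LOCATION '/hdfs/folder/containing/orc/files/col1=val1/col2=val2';

Each ADD PARTITION has to point at the directory that actually holds that
partition's ORC files; the table-level LOCATION on its own is not enough for
Hive to find them. If the directories already follow the col1=val1/col2=val2
naming convention, MSCK REPAIR TABLE orc_table will register them all in one
go.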
Hi All,
I'm having a problem in Hive 0.13.0 using INSERT OVERWRITE with dynamic
partitioning, selecting from an ORC table to another ORC table (I don't
think the target table being ORC is significant). The insert is generating
a java.lang.ArrayIndexOutOfBoundsException. Has anyone seen this error before?
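For reference, the general shape of the statement (the table and column
names here are placeholders, not the real ones) is:

SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

INSERT OVERWRITE TABLE target_orc PARTITION (dt)
SELECT col_a, col_b, dt
FROM source_orc;

with the dynamic partition column (dt) listed last in the SELECT, as dynamic
partitioning requires.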
Hi,
I'm trying to write ORC files outside of Hive and I'm currently
looking at uniontype.
I've identified multiple ways to do this. I could use a Writable or
Java StandardStructObjectInspector and pass in a StandardUnion.
However, the OrcInputFormat uses OrcStruct as the value type and I'd
like t
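The StandardUnion route I mean, sketched out (the output path, field name and
values are placeholders, and this is my reading of the Hive 0.13/0.14-era
org.apache.hadoop.hive.ql.io.orc API rather than a confirmed recipe):

import java.util.Arrays;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hive.ql.io.orc.OrcFile;
import org.apache.hadoop.hive.ql.io.orc.Writer;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.StandardUnionObjectInspector.StandardUnion;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

public class UnionOrcWriter {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();

    // union<int,string>, described with the standard Java object inspectors
    ObjectInspector unionOI = ObjectInspectorFactory.getStandardUnionObjectInspector(
        Arrays.<ObjectInspector>asList(
            PrimitiveObjectInspectorFactory.javaIntObjectInspector,
            PrimitiveObjectInspectorFactory.javaStringObjectInspector));

    // a one-column row type: struct<u:uniontype<int,string>>
    ObjectInspector rowOI = ObjectInspectorFactory.getStandardStructObjectInspector(
        Arrays.asList("u"),
        Arrays.asList(unionOI));

    Writer writer = OrcFile.createWriter(new Path("/tmp/union_example.orc"),
        OrcFile.writerOptions(conf).inspector(rowOI));

    // the byte tag selects the union branch: 0 -> int, 1 -> string
    writer.addRow(Arrays.<Object>asList(new StandardUnion((byte) 0, 42)));
    writer.addRow(Arrays.<Object>asList(new StandardUnion((byte) 1, "hello")));
    writer.close();
  }
}

Reading that file back through OrcInputFormat still hands you OrcStruct
values, though, which is exactly the asymmetry above.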
Hi,
I'm attempting to write a RecordReader and RecordWriter for ORC that
support reading and writing String for HiveChar and HiveVarchar, and
BigDecimal for HiveDecimal, to simplify usage a little - I need Serializable
types.
To do this I've gone the ObjectInspector route, e.g.:
... extends JavaHi
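Underneath the inspectors, the type mapping I need is small. Here is a
sketch of just the conversions (the class and method names below are mine,
not Hive's) that the custom inspectors, or the RecordReader/RecordWriter
themselves, would apply around getPrimitiveJavaObject():

import java.math.BigDecimal;

import org.apache.hadoop.hive.common.type.HiveChar;
import org.apache.hadoop.hive.common.type.HiveDecimal;
import org.apache.hadoop.hive.common.type.HiveVarchar;

/** Hypothetical helper: maps Hive's char/varchar/decimal types to
    Serializable Java types and back. */
public final class HiveTypeBridge {

  private HiveTypeBridge() {}

  // read side: Hive types -> plain Serializable types
  public static String fromChar(HiveChar c)           { return c == null ? null : c.getValue(); }
  public static String fromVarchar(HiveVarchar v)     { return v == null ? null : v.getValue(); }
  public static BigDecimal fromDecimal(HiveDecimal d) { return d == null ? null : d.bigDecimalValue(); }

  // write side: plain Serializable types -> Hive types
  // (maxLength comes from the char/varchar column type)
  public static HiveChar toChar(String s, int maxLength)       { return s == null ? null : new HiveChar(s, maxLength); }
  public static HiveVarchar toVarchar(String s, int maxLength) { return s == null ? null : new HiveVarchar(s, maxLength); }
  public static HiveDecimal toDecimal(BigDecimal b)            { return b == null ? null : HiveDecimal.create(b); }
}

The extended inspectors then only have to route through conversions like
these, so callers never see HiveChar, HiveVarchar or HiveDecimal directly.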