Thank you everyone for your help! Owen, we're on an old version of Hive
(1.1.0-cdh5.9.2).
On Thu, Jul 18, 2019 at 9:38 AM Owen O'Malley wrote:
ORC files expect UTF-8, which is a superset of ASCII, in string, char, and
varchar columns. The only place I know of that will cause trouble if you put
non-UTF-8 data in strings is the statistics: the API for getting the min/max
will convert to Java strings.
But back to your original point, the schema
The table perhaps has data in it that is beyond ASCII.
An easier way is to add an additional column, update it with the data, and
drop the older one after validating the records in the STRING-typed column.
Regards
Dev
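A minimal sketch of that add-validate-drop approach, assuming a hypothetical
ORC table t (id INT, c VARCHAR(64)); adding a trailing column is generally
safe with ORC, it is the type change that trips up old readers:

  -- add a replacement column and rewrite the data into it
  ALTER TABLE t ADD COLUMNS (c_str STRING);
  INSERT OVERWRITE TABLE t SELECT id, c, CAST(c AS STRING) FROM t;

  -- validate the new column before dropping anything
  SELECT count(*) FROM t WHERE c IS NOT NULL AND c_str IS NULL;

  -- dropping the old column means rewriting the column list; REPLACE COLUMNS
  -- is metadata-only and positional on old Hive/ORC, so re-verify reads after:
  -- ALTER TABLE t REPLACE COLUMNS (id INT, c_str STRING);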
On Thu, Jul 18, 2019, 4:44 AM William Shen wrote:
Which version of Hive are you on? The recent versions (Hive >= 2.3) should
support schema evolution in the ORC reader.
.. Owen
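For reference, on those newer versions the ALTER should work as-is, because
the ORC reader reconciles the file schema with the table schema at read time
(a sketch, assuming a hypothetical table t with a varchar column c;
hive.exec.schema.evolution defaults to true there):

  SET hive.exec.schema.evolution=true;
  ALTER TABLE t CHANGE COLUMN c c STRING;  -- metadata-only change
  SELECT c FROM t;  -- reader converts the varchar file data to string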
On Wed, Jul 17, 2019 at 11:07 PM Jörn Franke wrote:
You have to create a new table with this column as string and do an
INSERT ... SELECT from the old table.
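A rough sketch of that approach, with hypothetical names (t is the existing
table with column c, plus an id column for illustration):

  CREATE TABLE t_new (id INT, c STRING) STORED AS ORC;
  INSERT INTO TABLE t_new SELECT id, CAST(c AS STRING) FROM t;
  -- after validating t_new, swap the tables
  ALTER TABLE t RENAME TO t_old;
  ALTER TABLE t_new RENAME TO t;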
On 18.07.2019 at 01:14, William Shen wrote:
To add, the storage information is as follows:
# Storage Information
SerDe Library: org.apache.hadoop.hive.ql.io.orc.OrcSerde
InputFormat: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
OutputFormat: org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat
Compressed: No
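(That block is presumably the # Storage Information section printed by
DESCRIBE FORMATTED, i.e. from running: DESCRIBE FORMATTED table;)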
On Wed, Jul 17, 2019, William Shen wrote:
Hi all,
I assumed that it would be compatible to convert a column's type from varchar
to string; however, after running ALTER TABLE table CHANGE col col STRING, I
encounter the following error when querying the column from Hive:
Failed with exception
java.io.IOException:org.apache.hadoop.hive.ql.metadata
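For what it's worth, a hedged sketch of the sequence that reproduces this on
old Hive versions (names hypothetical; on 1.x the ALTER is metadata-only, so
the ORC files still carry the varchar schema):

  CREATE TABLE t (c VARCHAR(64)) STORED AS ORC;
  INSERT INTO TABLE t VALUES ('abc');
  ALTER TABLE t CHANGE c c STRING;  -- metadata-only; files are unchanged
  SELECT c FROM t;  -- old readers fail to reconcile varchar vs. string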