It looks like you've hit a file with no delimiter: 2147483648 bytes is 2^31, 
just past the maximum size of a Java array or string. Also, I don't think you 
can terminate fields with a line feed, since that's the hard-coded row 
delimiter.
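If each JSON document really does sit on its own '\r\n'-terminated line, one way forward is to drop the FIELDS clause and let the default '\n' row delimiter split records (table name and location taken from the original post). A sketch, not a tested fix:

```sql
-- Sketch only: relies on the default '\n' row delimiter, so each
-- line becomes one row. With '\r\n' line endings, the trailing '\r'
-- stays in the json column and may need to be trimmed afterwards.
DROP TABLE IF EXISTS test;
CREATE EXTERNAL TABLE test (
     json string
)
STORED AS TEXTFILE
LOCATION 'hdfs:/userdata/test';

-- Alternatively, some Hive-on-MapReduce setups honor the Hadoop
-- TextInputFormat property below to change the record delimiter.
-- I'm not certain Hive on Spark respects it, so treat this as an
-- experiment rather than a known fix.
SET textinputformat.record.delimiter='\r\n';
```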

Thanks
Shawn

From: xuanhuang <18351886...@163.com>
Reply-To: "user@hive.apache.org" <user@hive.apache.org>
Date: Saturday, November 30, 2019 at 1:40 AM
To: "user@hive.apache.org" <user@hive.apache.org>
Subject: hive error: "Too many bytes before delimiter: 2147483648"


Hello all,
I encountered a problem while using Hive and would appreciate your help. The 
specific situation is as follows.
Platform: hive on spark
Error: java.io.IOException: Too many bytes before delimiter: 2147483648
Description: Small test files in the same format load without problems, but 
running against the full data set (about 140 GB) raises the error above.
Table creation statement:
DROP TABLE IF EXISTS test;
CREATE EXTERNAL TABLE test (
     json string
     )
     ROW FORMAT DELIMITED
     FIELDS TERMINATED BY '\r\n'
     STORED AS TEXTFILE
     LOCATION 'hdfs:/userdata/test';

The big data platform I use is TDH Community Edition 5.0.2.
Any help is much appreciated!
