Just an update:

This error only happened in the CDH4 environment (even though I added hive-exec-0.12.0.jar to the classpath).

If I use CDH5, it works fine.

John

From: John Zeng [mailto:john.z...@dataguise.com]
Sent: Sunday, May 11, 2014 5:54 PM
To: user@hive.apache.org
Subject: java.lang.NoSuchFieldError: HIVE_ORC_FILE_MEMORY_POOL when inserting 
data to ORC table

Resending, since this mailing list had issues posting messages over the last few days.

From: John Zeng
Sent: Friday, May 9, 2014 6:18 PM
To: user@hive.apache.org<mailto:user@hive.apache.org>
Subject: java.lang.NoSuchFieldError: HIVE_ORC_FILE_MEMORY_POOL when inserting 
data to ORC table

Hi, All,

I created an ORC table as follows:

add jar /home/dguser/hive-0.12.0/lib/hive-exec-0.12.0.jar;

CREATE TABLE orc_UserDataTest2(
PassportNumbers1 STRING,
PassportNumbers2 STRING,
TaxID STRING,
CM11 STRING,
CM13 STRING,
CM15 STRING,
Name STRING,
EmailAddress STRING )
ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat';
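As an aside, on Hive 0.11 or later the same table can be declared with the shorter STORED AS ORC form, which selects the ORC serde and input/output formats automatically; a sketch using the same columns:

```sql
-- Equivalent shorthand (Hive 0.11+); no explicit serde or format classes needed.
CREATE TABLE orc_UserDataTest2 (
  PassportNumbers1 STRING,
  PassportNumbers2 STRING,
  TaxID STRING,
  CM11 STRING,
  CM13 STRING,
  CM15 STRING,
  Name STRING,
  EmailAddress STRING
) STORED AS ORC;
```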

The table creation is successful and I can see a new folder under the warehouse directory:

Logging initialized using configuration in jar:file:/home/opt/cloudera/parcels/CDH-4.2.0-1.cdh4.2.0.p0.10/lib/hive/lib/hive-common-0.10.0-cdh4.2.0.jar!/hive-log4j.properties
Hive history file=/tmp/dguser/hive_job_log_dguser_201405091727_1340301870.txt
Added /home/dguser/hive-0.12.0/lib/hive-exec-0.12.0.jar to class path
Added resource: /home/dguser/hive-0.12.0/lib/hive-exec-0.12.0.jar
OK
Time taken: 1.953 seconds

But when I inserted data into it, I got the following fatal error in the map task:

2014-05-09 17:37:48,447 FATAL ExecMapper: java.lang.NoSuchFieldError: HIVE_ORC_FILE_MEMORY_POOL
  at org.apache.hadoop.hive.ql.io.orc.MemoryManager.<init>(MemoryManager.java:83)
  at org.apache.hadoop.hive.ql.io.orc.OrcFile.getMemoryManager(OrcFile.java:302)
  at org.apache.hadoop.hive.ql.io.orc.OrcFile.access$000(OrcFile.java:32)
  at org.apache.hadoop.hive.ql.io.orc.OrcFile$WriterOptions.<init>(OrcFile.java:145)
  at org.apache.hadoop.hive.ql.io.orc.OrcFile.writerOptions(OrcFile.java:241)
  at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat.getHiveRecordWriter(OrcOutputFormat.java:115)
  at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:250)
  at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:237)
  at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:496)
  at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:543)
  at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:474)
  at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:800)
  at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
  at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:474)
  at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:800)
  at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:83)
  at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:474)
  at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:800)
  at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:546)
  at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:143)
  at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:418)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:333)
  at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:396)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
  at org.apache.hadoop.mapred.Child.main(Child.java:262)


I inserted the data simply by copying it from another table (which has 100 rows):

add jar /home/dguser/hive-0.12.0/lib/hive-exec-0.12.0.jar;

insert overwrite table orc_UserDataTest2
select * from UserDataTest2;
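For context, a NoSuchFieldError like this usually means the ORC writer classes from hive-exec-0.12.0.jar resolved HiveConf from an older jar at runtime; the CDH4 log above shows hive-common-0.10.0-cdh4.2.0.jar on the classpath, and that Hive 0.10 HiveConf evidently predates the HIVE_ORC_FILE_MEMORY_POOL field. A rough diagnostic sketch (the jar paths are the ones quoted in this thread and exist only on the original machine):

```python
# Sketch: check whether a jar's HiveConf class mentions the field that the
# 0.12 ORC MemoryManager needs. Field names appear verbatim in the compiled
# class's constant pool, so a byte search is enough for a quick check.
import zipfile

HIVECONF_CLASS = "org/apache/hadoop/hive/conf/HiveConf.class"

def jar_has_orc_memory_pool_field(jar_path):
    """True if jar_path bundles a HiveConf class naming HIVE_ORC_FILE_MEMORY_POOL."""
    try:
        with zipfile.ZipFile(jar_path) as jar:
            data = jar.read(HIVECONF_CLASS)
    except (OSError, KeyError, zipfile.BadZipFile):
        return False  # jar missing/unreadable, or HiveConf not packaged in it
    return b"HIVE_ORC_FILE_MEMORY_POOL" in data

if __name__ == "__main__":
    # Paths from this thread; adjust for your own cluster layout.
    for jar in (
        "/opt/cloudera/parcels/CDH-4.2.0-1.cdh4.2.0.p0.10/lib/hive/lib/hive-common-0.10.0-cdh4.2.0.jar",
        "/home/dguser/hive-0.12.0/lib/hive-exec-0.12.0.jar",
    ):
        print(jar, "->", jar_has_orc_memory_pool_field(jar))
```

If the HiveConf that wins at runtime lacks the field, adding hive-exec-0.12.0.jar alone cannot fix it; running on a Hive release that has the field (as CDH5 does) resolves the mismatch, which matches the behavior reported above.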

Any idea what is causing this error?

Thanks

John
