Hi All,

I am trying to test the transaction feature of Hive 0.13, in particular compaction. While inserting into the bucketed table, I get a NullPointerException at org.apache.hadoop.hive.ql.lockmgr.DbTxnManager.heartbeat(DbTxnManager.java:244). I have listed below, step by step, what I am doing. Please guide me as to where I am going wrong.

hive> CREATE EXTERNAL TABLE EMP (ID INT, NAME STRING, COUNTRY STRING, VAR STRING)
    > ROW FORMAT DELIMITED
    > FIELDS TERMINATED BY '\t'
    > LINES TERMINATED BY '\n'
    > LOCATION '/tmp/employees';
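
(For context, the tab-separated data file was already sitting under /tmp/employees before the table was created; it was put there with something along these lines, where the file name employees.tsv is only illustrative:)

$ hadoop fs -mkdir -p /tmp/employees
$ hadoop fs -put employees.tsv /tmp/employees/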

hive> SELECT * FROM EMP;
OK
1       A       US      x
2       B       UK      y
3       C       AUS     1
4       D       UK      2
5       E       US      1
6       F       UK      2
7       G       AUS     x
8       H       IND     y
9       I       US      x
10      J       UK      y

hive> CREATE EXTERNAL TABLE BUCKET_EMP (ID INT, NAME STRING, VAR STRING)
    > PARTITIONED BY (COUNTRY STRING)
    > CLUSTERED BY(VAR) INTO 3 BUCKETS
    > ROW FORMAT DELIMITED
    > FIELDS TERMINATED BY '\t'
    > LINES TERMINATED BY '\n'
    > STORED AS ORC
    > LOCATION '/tmp/bucket_emp';
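
(A side question: the Hive transactions wiki seems to say ACID tables must be bucketed, stored as ORC, and managed rather than EXTERNAL, and the newer docs also mention TBLPROPERTIES ('transactional'='true'), which I am not sure applies to 0.13. If that is part of my problem, I assume the DDL would look roughly like this instead; I dropped the ROW FORMAT clause since ORC ignores delimiters:)

hive> CREATE TABLE BUCKET_EMP (ID INT, NAME STRING, VAR STRING)
    > PARTITIONED BY (COUNTRY STRING)
    > CLUSTERED BY (VAR) INTO 3 BUCKETS
    > STORED AS ORC
    > TBLPROPERTIES ('transactional' = 'true');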

hive> SET hive.exec.dynamic.partition = true;
hive> SET hive.exec.dynamic.partition.mode = nonstrict;
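(One thing I am unsure about: my understanding is that inserts into a bucketed table also need bucketing enforcement switched on, roughly as below; please correct me if that is unrelated:)

hive> SET hive.enforce.bucketing = true;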
hive> INSERT OVERWRITE TABLE BUCKET_EMP
    > PARTITION(COUNTRY)
    > SELECT ID, NAME , VAR, COUNTRY
    > FROM EMP;

hive> SELECT * FROM BUCKET_EMP;
OK
7       G       x       AUS
3       C       1       AUS
8       H       y       IND
10      J       y       UK
2       B       y       UK
6       F       2       UK
4       D       2       UK
9       I       x       US
1       A       x       US
5       E       1       US
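
(To confirm the bucketing, I also expected to be able to see three bucket files per country partition on HDFS, with something like:)

$ hadoop fs -ls -R /tmp/bucket_emp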

hive> SET hive.txn.manager = org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
hive> SET hive.compactor.initiator.on = true;
hive> SET hive.compactor.worker.threads = 3;
hive> SET hive.compactor.check.interval = 300;
hive> SET hive.compactor.delta.num.threshold = 1;
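
(My plan was to watch the compactor with SHOW COMPACTIONS, as below. I also understand from the wiki that hive.compactor.initiator.on and hive.compactor.worker.threads are really meant to be set in the metastore's hive-site.xml rather than in a CLI session, so please tell me if setting them there is already wrong:)

hive> SHOW COMPACTIONS;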

hive> INSERT OVERWRITE TABLE BUCKET_EMP
    > PARTITION(COUNTRY)
    > SELECT ID, NAME,
    > CASE WHEN VAR = '1' THEN 'X' WHEN VAR = '2' THEN 'Y' END AS VAR, COUNTRY
    > FROM EMP;

java.lang.NullPointerException
        at org.apache.hadoop.hive.ql.lockmgr.DbTxnManager.heartbeat(DbTxnManager.java:244)
        at org.apache.hadoop.hive.ql.exec.Heartbeater.heartbeat(Heartbeater.java:79)
        at org.apache.hadoop.hive.ql.exec.mr.HadoopJobExecHelper.progress(HadoopJobExecHelper.java:242)
        at org.apache.hadoop.hive.ql.exec.mr.HadoopJobExecHelper.progress(HadoopJobExecHelper.java:547)
        at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:426)
        at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1508)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1275)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1093)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Ended Job = job_1411574868628_0015 with exception 'java.lang.NullPointerException(null)'

Thanks
Supriya
