Hello,

I am trying to query a table created for Kafka streaming with the Kafka storage handler.

ADD JAR /home/lib/kafka-handler-3.1.3000.7.1.1.0-565.jar;
ADD JAR /home/lib/hive-exec-3.1.2.jar;

select count(*) from brand_affinity_atc_struct;


It throws the following error:

java.io.IOException: java.util.concurrent.ExecutionException: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;JJ)V
      at org.apache.hadoop.hive.kafka.KafkaInputFormat.computeSplits(KafkaInputFormat.java:143)
      at org.apache.hadoop.hive.kafka.KafkaInputFormat.getSplits(KafkaInputFormat.java:69)
      at org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:442)
      at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:561)
      at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.initialize(HiveSplitGenerator.java:196)
      at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:278)
      at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:269)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:422)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1893)
      at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:269)
      at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:253)
      at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;JJ)V
      at java.util.concurrent.FutureTask.report(FutureTask.java:122)
      at java.util.concurrent.FutureTask.get(FutureTask.java:206)
      at org.apache.hadoop.hive.kafka.KafkaInputFormat.computeSplits(KafkaInputFormat.java:138)
      ... 15 more
Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;JJ)V
      at org.apache.hadoop.hive.kafka.KafkaInputSplit.<init>(KafkaInputSplit.java:54)
      at org.apache.hadoop.hive.kafka.KafkaInputFormat.lambda$buildFullScanFromKafka$4(KafkaInputFormat.java:115)
      at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
      at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
      at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
      at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
      at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
      at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
      at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
      at org.apache.hadoop.hive.kafka.KafkaInputFormat.buildFullScanFromKafka(KafkaInputFormat.java:117)
      at org.apache.hadoop.hive.kafka.KafkaInputFormat.lambda$computeSplits$5(KafkaInputFormat.java:135)
      ... 4 more
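
If I am reading the descriptor right, the missing method (ZLjava/lang/String;JJ)V is Preconditions.checkArgument(boolean, String, long, long). As far as I can tell that overload only exists in Guava 20 and later, so I suspect an older guava jar is winning on the classpath. This is a small probe I plan to run on the same classpath to see which jar Preconditions actually resolves from (the API calls are standard JDK; GuavaProbe is just a scratch name):

import com.google.common.base.Preconditions;

public class GuavaProbe {
    public static void main(String[] args) {
        // Prints the jar that com.google.common.base.Preconditions
        // was actually loaded from at runtime.
        System.out.println(Preconditions.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}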

This is the table definition:

ADD JAR /home/lib/kafka-handler-3.1.3000.7.1.1.0-565.jar;
ADD JAR /home/lib/hive-exec-3.1.2.jar;
CREATE EXTERNAL TABLE brand_affinity_atc_struct (
  `vertices` STRUCT<entities:ARRAY<STRUCT<entityId:STRING, type:STRING,
    entityAttributes:STRUCT<wpid:STRING, itemId:STRING, atc_cnt:INT, imprsn_cnt:INT>>>>)
PARTITIONED BY (itemId STRING)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED BY 'org.apache.hadoop.hive.kafka.KafkaStorageHandler'
TBLPROPERTIES (
  "kafka.topic" = "errors-prod",
  "kafka.bootstrap.servers" = "localhost:9092",
  "kafka.serde.class" = "org.apache.hadoop.hive.serde2.JsonSerDe");


What could be the cause?

--
Thanks,
Guru.
