Could you send the output of "SHOW CREATE TABLE" for the two tables involved?
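
In the meantime, a guess based on the trace: it fails inside CommonJoinTaskDispatcher while cloning the plan, which is the map-join auto-conversion path, and the class it can't deserialize is the varchar object inspector. Two things you could try (untested against your tables, so treat this as a sketch): disable map-join auto-conversion, or cast the join keys so no varchar inspector ends up in the cloned plan:

```sql
-- Workaround 1 (assumes the failure is in map-join conversion, as the trace suggests):
SET hive.auto.convert.join=false;

-- Workaround 2 (assumes txn_id is VARCHAR; casting to STRING avoids the
-- WritableHiveVarcharObjectInspector during plan serialization):
SELECT count(B.txn_id) AS CNT
FROM txn_hdr_combined AS B
JOIN upc_table AS C
  ON CAST(B.txn_id AS STRING) = CAST(C.txn_id AS STRING);
```

If either of those makes the error go away, that would narrow it down to the known Kryo/varchar serialization problem rather than anything in your data.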


On 19 June 2014 00:25, Clay McDonald <stuart.mcdon...@bateswhite.com> wrote:

>  I’m trying to run the following hive join query and get the following
> error. Any suggestions?
>
> hive> select count(B.txn_id) AS CNT FROM txn_hdr_combined AS B JOIN upc_table AS C ON B.txn_id = C.txn_id;
>
> com.esotericsoftware.kryo.KryoException: Class cannot be created (missing no-arg constructor): org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveVarcharObjectInspector
>
> Serialization trace:
> objectInspector (org.apache.hadoop.hive.ql.exec.ColumnInfo)
> signature (org.apache.hadoop.hive.ql.exec.RowSchema)
> rowSchema (org.apache.hadoop.hive.ql.exec.ReduceSinkOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
> aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)
> mapWork (org.apache.hadoop.hive.ql.plan.MapredWork)
>
>         at com.esotericsoftware.kryo.Kryo.newInstantiator(Kryo.java:1097)
>         at com.esotericsoftware.kryo.Kryo.newInstance(Kryo.java:1109)
>         at com.esotericsoftware.kryo.serializers.FieldSerializer.create(FieldSerializer.java:526)
>         at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:502)
>         at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
>         at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>         at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
>         at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
>         at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:112)
>         at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
>         at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
>         at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>         at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
>         at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
>         at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>         at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
>         at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
>         at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:112)
>         at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
>         at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
>         at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>         at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
>         at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
>         at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:139)
>         at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
>         at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
>         at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>         at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
>         at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
>         at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>         at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
>         at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:672)
>         at org.apache.hadoop.hive.ql.exec.Utilities.deserializeObjectByKryo(Utilities.java:810)
>         at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:718)
>         at org.apache.hadoop.hive.ql.exec.Utilities.clonePlan(Utilities.java:748)
>         at org.apache.hadoop.hive.ql.optimizer.physical.CommonJoinTaskDispatcher.processCurrentTask(CommonJoinTaskDispatcher.java:503)
>         at org.apache.hadoop.hive.ql.optimizer.physical.AbstractJoinTaskDispatcher.dispatch(AbstractJoinTaskDispatcher.java:182)
>         at org.apache.hadoop.hive.ql.lib.TaskGraphWalker.dispatch(TaskGraphWalker.java:111)
>         at org.apache.hadoop.hive.ql.lib.TaskGraphWalker.walk(TaskGraphWalker.java:194)
>         at org.apache.hadoop.hive.ql.lib.TaskGraphWalker.startWalking(TaskGraphWalker.java:139)
>         at org.apache.hadoop.hive.ql.optimizer.physical.CommonJoinResolver.resolve(CommonJoinResolver.java:79)
>         at org.apache.hadoop.hive.ql.optimizer.physical.PhysicalOptimizer.optimize(PhysicalOptimizer.java:90)
>         at org.apache.hadoop.hive.ql.parse.MapReduceCompiler.compile(MapReduceCompiler.java:300)
>         at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:8410)
>         at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:284)
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:441)
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:342)
>         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1000)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
>         at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
>         at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
>
> FAILED: SemanticException Generate Map Join Task Error: Class cannot be created (missing no-arg constructor): org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveVarcharObjectInspector
>
> Serialization trace:
> objectInspector (org.apache.hadoop.hive.ql.exec.ColumnInfo)
> signature (org.apache.hadoop.hive.ql.exec.RowSchema)
> rowSchema (org.apache.hadoop.hive.ql.exec.ReduceSinkOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
> aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)
> mapWork (org.apache.hadoop.hive.ql.plan.MapredWork)
>
> hive>
>
>
>
> *Clay McDonald*
> Database Administrator
>
> Bates White, LLC
> 1300 Eye St, NW, Suite 600 East
> Washington, DC 20005
> Main: 202.408.6110
> Cell: 202.560.4101
> Direct: 202.747.5962
> Email: clay.mcdon...@bateswhite.com
>
> ****************************************************
> This electronic message transmission contains information from Bates
> White, LLC, which may be confidential or privileged. The information is
> intended to be for the use of the individual or entity named above. If you
> are not the intended recipient, be aware that any disclosure, copying,
> distribution, or use of the contents of this information is prohibited.
>
> If you have received this electronic transmission in error, please notify
> me by telephone at 202.747.5962 or by electronic mail at
> clay.mcdon...@bateswhite.com immediately.
>
> *****************************************************
>



-- 
André Araújo
Big Data Consultant/Solutions Architect
The Pythian Group - Australia - www.pythian.com

Office (calls from within Australia): 1300 366 021 x1270
Office (international): +61 2 8016 7000  x270 *OR* +1 613 565 8696   x1270
Mobile: +61 410 323 559
Fax: +61 2 9805 0544
IM: pythianaraujo @ AIM/MSN/Y! or ara...@pythian.com @ GTalk

“Success is not about standing at the top, it's the steps you leave behind.”
— Iker Pou (rock climber)
