Thanks Kulkarni - I think I have the right data types, but here are the DDL definitions and the query.

Query:

select registration
from studenttab10k s right outer join votertab10k v
    on (s.name = v.name);

Table definitions:

create external table IF NOT EXISTS votertab10k(
            name string,
            age int,
            registration string,
            contributions float)
        row format delimited
        fields terminated by '\t'
        stored as textfile
        location '/user/hive/tests/data/votertab10k';

create external table IF NOT EXISTS studenttab10k(
            name string,
            age int,
            gpa double)
        row format delimited
        fields terminated by '\t'
        stored as textfile
        location '/user/hive/tests/data/studenttab10k';
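
In case it helps, here is the kind of spot-check I can run next (sketch only, reusing the table names from the DDL above) - describe on both tables plus a few raw rows to confirm the tab-delimited fields line up with the declared columns:

    describe studenttab10k;
    describe votertab10k;

    -- a few raw rows from each table
    select * from studenttab10k limit 5;
    select * from votertab10k limit 5;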

Thanks,
Kanna


From: kulkarni.swar...@gmail.com
Sent: Tuesday, July 10, 2012 1:25 PM
To: user@hive.apache.org
Cc: d...@hive.apache.org
Subject: Re: Casting exception while converting from "LazyDouble" to "LazyString"

Hi Kanna,

This might just mean that in your query you have a STRING type for a field that is actually a DOUBLE.
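
For example, if a column that is declared as STRING actually holds DOUBLE data (or the other way around), an explicit cast usually works around the serializer mismatch - a rough sketch only, reusing the table and column names from your DDL, with gpa picked purely for illustration:

    -- hypothetical workaround: force the doubtful column to the type the
    -- query expects instead of relying on the declared schema
    select cast(s.gpa as string)
    from studenttab10k s
    right outer join votertab10k v
        on (s.name = v.name);

Running describe on both tables would also confirm which types the metastore actually has for each column.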
On Tue, Jul 10, 2012 at 3:05 PM, Kanna Karanam <kanna...@microsoft.com> wrote:
Has anyone seen this error before? Am I missing anything here?

2012-07-10 11:11:02,203 INFO org.apache.hadoop.mapred.TaskInProgress: Error from attempt_201207091248_0107_m_000000_0: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"name":"zach johnson","age":77,"gpa":3.27}
                at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:161)
                at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
                at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
                at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
                at org.apache.hadoop.mapred.Child$4.run(Child.java:271)
                at java.security.AccessController.doPrivileged(Native Method)
                at javax.security.auth.Subject.doAs(Subject.java:396)
                at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1124)
                at org.apache.hadoop.mapred.Child.main(Child.java:265)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"name":"zach johnson","age":77,"gpa":3.27}
                at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:550)
                at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:143)
                ... 8 more
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.lazy.LazyDouble cannot be cast to org.apache.hadoop.hive.serde2.lazy.LazyString
                at org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyStringObjectInspector.getPrimitiveWritableObject(LazyStringObjectInspector.java:47)
                at org.apache.hadoop.hive.serde2.lazybinary.LazyBinarySerDe.serialize(LazyBinarySerDe.java:351)
                at org.apache.hadoop.hive.serde2.lazybinary.LazyBinarySerDe.serializeStruct(LazyBinarySerDe.java:255)
                at org.apache.hadoop.hive.serde2.lazybinary.LazyBinarySerDe.serialize(LazyBinarySerDe.java:202)
                at org.apache.hadoop.hive.ql.exec.ReduceSinkOperator.processOp(ReduceSinkOperator.java:236)
                at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
                at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
                at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:83)
                at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
                at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
                at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:531)


Thanks,
Kanna



--
Swarnim
