Thanks a lot. I will look into it.

> I don't have the entire code so it's hard for me to say but does it help if
> you change:
>  private TreeMap<Long, Session> sessionMap =...
> to be
>  private Map<Long, Session> sessionMap =...
>

Actually, I did change the source as you suggested to avoid the
conversion exception.
But I still need some TreeMap-specific methods, so I have to check
the sessionMap's runtime type and convert it to a TreeMap before using it.
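
For reference, a minimal standalone sketch of that check-and-convert pattern (SessionMapDemo and the Long value type are placeholders, not the real VisitSessions/Session classes): the field is declared as the Map interface so a reflective setter can assign any Map implementation, and a small accessor hands back a TreeMap, copying only when needed.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;

public class SessionMapDemo {
    // Declared against the Map interface, so Hive's reflective
    // ObjectInspector can assign whatever Map it builds (a HashMap,
    // per the stack trace) without an IllegalArgumentException.
    private Map<Long, Long> sessionMap = new TreeMap<Long, Long>();

    // Return a sorted view: reuse the field if it is already a
    // TreeMap, otherwise copy its entries into a fresh one.
    TreeMap<Long, Long> sortedSessions() {
        if (sessionMap instanceof TreeMap) {
            return (TreeMap<Long, Long>) sessionMap;
        }
        return new TreeMap<Long, Long>(sessionMap);
    }

    public static void main(String[] args) {
        SessionMapDemo demo = new SessionMapDemo();
        demo.sessionMap = new HashMap<Long, Long>(); // simulates the reflective set
        demo.sessionMap.put(2L, 20L);
        demo.sessionMap.put(1L, 10L);
        System.out.println(demo.sortedSessions().firstKey()); // prints 1
    }
}
```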

Thanks a lot.
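
On the JUnit question in the quoted mail below: since an old-style evaluator is just a plain Java class, a test can instantiate it and drive the iterate/terminatePartial/merge/terminate lifecycle directly, no Hive runtime required. A minimal sketch with a simplified averaging evaluator (AvgEvaluator is hypothetical, and plain String/float stand in for Hadoop's Text/FloatWritable so it runs anywhere):

```java
public class AvgEvaluatorTest {

    // Simplified stand-in that mirrors the UDAFEvaluator lifecycle
    // with JDK-only types.
    static class AvgEvaluator {
        private long count;
        private float sum;

        public void init() { count = 0; sum = 0f; }

        public boolean iterate(String value) {
            sum += Float.parseFloat(value);
            count++;
            return true;
        }

        public AvgEvaluator terminatePartial() { return this; }

        public boolean merge(AvgEvaluator other) {
            sum += other.sum;
            count += other.count;
            return true;
        }

        public float terminate() { return count == 0 ? 0f : sum / count; }
    }

    public static void main(String[] args) {
        // Drive the same lifecycle Hive would: two partial
        // aggregations, then a merge, then the final result.
        AvgEvaluator a = new AvgEvaluator();
        a.init();
        a.iterate("2.0");
        a.iterate("4.0");

        AvgEvaluator b = new AvgEvaluator();
        b.init();
        b.iterate("6.0");

        a.merge(b.terminatePartial());
        System.out.println(a.terminate()); // (2+4+6)/3 = 4.0
    }
}
```

The same shape works inside a JUnit test method with assertEquals in place of the println; for the real evaluator you would just wrap inputs in Text before calling iterate.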




On Thu, Nov 15, 2012 at 3:01 PM, Mark Grover
<grover.markgro...@gmail.com> wrote:
> Hi Cheng,
> It's reflection that's causing the problem. You seem to be using the old
> class (UDAF) to implement your aggregate. While that may still be fine, just so
> that you know, there is a newer, better-performing way to implement UDAFs (more
> on that at https://cwiki.apache.org/Hive/genericudafcasestudy.html) that
> doesn't require the use of reflection. I would recommend writing a
> GenericUDAF instead of the old method if you are creating a new UDAF.
>
> I don't have the entire code so it's hard for me to say but does it help if
> you change:
>  private TreeMap<Long, Session> sessionMap =...
> to be
>  private Map<Long, Session> sessionMap =...
>
> That would be good programming practice anyway.
>
> Mark
>
> On Mon, Nov 12, 2012 at 8:01 PM, Cheng Su <scarcer...@gmail.com> wrote:
>>
>> Hi all.
>>
>> I'm writing a Hive UDAF to calculate page views per session. The Java source
>> is below:
>>
>> ----
>> public class CalculateAvgPVPerSession extends UDAF {
>>
>>         /**
>>          * @author Cheng Su
>>          *
>>          */
>>         public static class CountSessionUDAFEvaluator implements
>> UDAFEvaluator {
>>
>>                 private VisitSessions visitSessions = new VisitSessions();
>>
>>                 /* (non-Javadoc)
>>                  * @see
>> org.apache.hadoop.hive.ql.exec.UDAFEvaluator#init()
>>                  */
>>                 @Override
>>                 public void init() {
>>                         // do nothing
>>                 }
>>
>>                 public boolean iterate(Text value) {
>>                         visitSessions.append(value.toString());
>>                         return true;
>>                 }
>>
>>                 public VisitSessions terminatePartial() {
>>                         return visitSessions;
>>                 }
>>
>>
>>                 public boolean merge(VisitSessions other) {
>>                         visitSessions.merge(other);
>>                         return true;
>>                 }
>>
>>
>>                 public FloatWritable terminate() {
>>                         return new
>> FloatWritable(visitSessions.getAveragePVPerSession());
>>                 }
>>         }
>>
>> }
>> ----
>>
>> VisitSessions is a class which contains a private field of type
>> java.util.TreeMap. Its source is below:
>>
>> ----
>>
>> public class VisitSessions {
>>
>>         private static final DateFormat dateFormat = new
>> SimpleDateFormat("yyyyMMddHHmmss");
>>
>>         private final long interval;
>>
>>         private static final class Session {
>>                 private long start;
>>                 private long end;
>>
>>                 long getPeriod() {
>>                         return end - start;
>>                 }
>>         }
>>
>>         private long pageView = 0L;
>>
>>         private TreeMap<Long, Session> sessionMap = Maps.newTreeMap();
>>
>>         // ... do sth ...
>>
>>        public void merge(VisitSessions other) {
>>                 for (final Entry<Long, Session> otherSessionEntry :
>> other.sessionMap.entrySet()) {
>>                         mergeOne(otherSessionEntry.getValue());
>>                 }
>>                 pageView += other.pageView;
>>         }
>>
>>
>> }
>>
>> ----
>> When I use this UDAF, I get this exception :
>> ----
>> java.lang.RuntimeException:
>> org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error
>> while processing row (tag=0)
>>
>> {"key":{"_col0":0,"_col1":2011,"_col2":10},"value":{"_col0":{"interval":1800000,"pageview":8957,"sessionmap":{1319818373000:{"start":1319818373000,"end":1319818731000},1319821763000:{"start":1319821763000,"end":1319824141000},1319858388000:{"start":1319858388000,"end":1319865262000}}},"_col1":{"interval":1800000,"pageview":8957,"sessionmap":{1319818373000:{"start":1319818373000,"end":1319818731000},1319821763000:{"start":1319821763000,"end":1319824141000},1319858388000:{"start":1319858388000,"end":1319865262000}}}},"alias":0}
>>         at
>> org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:268)
>>         at
>> org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:519)
>>         at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:420)
>>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Unknown Source)
>>         at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
>>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
>> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive
>> Runtime Error while processing row (tag=0)
>>
>> {"key":{"_col0":0,"_col1":2011,"_col2":10},"value":{"_col0":{"interval":1800000,"pageview":8957,"sessionmap":{1319818373000:{"start":1319818373000,"end":1319818731000},1319821763000:{"start":1319821763000,"end":1319824141000},1319858388000:{"start":1319858388000,"end":1319865262000}}},"_col1":{"interval":1800000,"pageview":8957,"sessionmap":{1319818373000:{"start":1319818373000,"end":1319818731000},1319821763000:{"start":1319821763000,"end":1319824141000},1319858388000:{"start":1319858388000,"end":1319865262000}}}},"alias":0}
>>         at
>> org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:256)
>>         ... 7 more
>> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException:
>> java.lang.RuntimeException: cannot set field private java.util.TreeMap
>> com.lietou.datawarehouse.hive.helper.VisitSessions.sessionMap of class
>> com.lietou.datawarehouse.hive.helper.VisitSessions
>> com.lietou.datawarehouse.hive.helper.VisitSessions@5f78dc08
>>         at
>> org.apache.hadoop.hive.ql.exec.GroupByOperator.processOp(GroupByOperator.java:737)
>>         at
>> org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
>>         at
>> org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:247)
>>         ... 7 more
>> Caused by: java.lang.RuntimeException: cannot set field private
>> java.util.TreeMap
>> com.lietou.datawarehouse.hive.helper.VisitSessions.sessionMap of class
>> com.lietou.datawarehouse.hive.helper.VisitSessions
>> com.lietou.datawarehouse.hive.helper.VisitSessions@5f78dc08
>>         at
>> org.apache.hadoop.hive.serde2.objectinspector.ReflectionStructObjectInspector.setStructFieldData(ReflectionStructObjectInspector.java:180)
>>         at
>> org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters$StructConverter.convert(ObjectInspectorConverters.java:240)
>>         at
>> org.apache.hadoop.hive.ql.udf.generic.GenericUDFUtils$ConversionHelper.convertIfNecessary(GenericUDFUtils.java:345)
>>         at
>> org.apache.hadoop.hive.ql.udf.generic.GenericUDAFBridge$GenericUDAFBridgeEvaluator.merge(GenericUDAFBridge.java:176)
>>         at
>> org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator.aggregate(GenericUDAFEvaluator.java:142)
>>         at
>> org.apache.hadoop.hive.ql.exec.GroupByOperator.updateAggregations(GroupByOperator.java:600)
>>         at
>> org.apache.hadoop.hive.ql.exec.GroupByOperator.processAggr(GroupByOperator.java:824)
>>         at
>> org.apache.hadoop.hive.ql.exec.GroupByOperator.processOp(GroupByOperator.java:724)
>>         ... 9 more
>> Caused by: java.lang.IllegalArgumentException: Can not set
>> java.util.TreeMap field
>> com.lietou.datawarehouse.hive.helper.VisitSessions.sessionMap to
>> java.util.HashMap
>>         at
>> sun.reflect.UnsafeFieldAccessorImpl.throwSetIllegalArgumentException(Unknown
>> Source)
>>         at
>> sun.reflect.UnsafeFieldAccessorImpl.throwSetIllegalArgumentException(Unknown
>> Source)
>>         at sun.reflect.UnsafeObjectFieldAccessorImpl.set(Unknown Source)
>>         at java.lang.reflect.Field.set(Unknown Source)
>>         at
>> org.apache.hadoop.hive.serde2.objectinspector.ReflectionStructObjectInspector.setStructFieldData(ReflectionStructObjectInspector.java:178)
>>         ... 16 more
>> ----
>>
>> It happens in the reduce phase. Why does Hive try to set a HashMap into
>> the TreeMap field?
>>
>> Also, how can I run a JUnit test of a UDAF, so that I can debug it?
>>
>> --
>>
>> Regards,
>> Cheng Su
>
>



-- 

Regards,
Cheng Su
