The regular Hive server uses Hive 1.x dependencies, while LLAP uses Hive 2.x dependencies. It seems your UDF was built against Hive 1.x; to make it work with LLAP, you need to rebuild it against the 2.x dependencies.
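As a minimal sketch of that rebuild, assuming an sbt build (the `2.3.5` version below is just an example of a Hive 2.x release -- match it to whatever version your LLAP cluster actually runs):

```scala
// build.sbt -- compile against Hive 2.x, but do not bundle it:
// "Provided" keeps hive-exec out of the UDF jar so the cluster's
// own Hive classes are used at runtime.
libraryDependencies += "org.apache.hive" % "hive-exec" % "2.3.5" % Provided
```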
Sent from my iPhone

> On Jul 4, 2019, at 8:16 PM, Bernard Quizon <bernard.qui...@cheetahdigital.com> wrote:
>
> Hi.
>
> Just an update: it is working when I use the default HiveServer JDBC URL.
> The error occurs when I use LLAP.
>
> Regards,
> Bernard
>
>> On Fri, Jul 5, 2019 at 10:40 AM Bernard Quizon <bernard.qui...@cheetahdigital.com> wrote:
>>
>> Hi.
>>
>> So I created a GenericUDF that returns a map; it works fine in simple SELECT statements.
>> For example:
>>
>> SELECT member_id, map_merge(src_map, dest_map, array('key1'))
>> FROM test_table LIMIT 100;
>>
>> But it returns an error when I use it in JOINs, for example:
>>
>> SELECT cust100.map_merge(e.map_1, t.map_1, array('key1'))
>> FROM test_table t
>> INNER JOIN ext_test_table e
>> ON t.id = e.id
>>
>> Please see the stack trace below:
>>
>> Serialization trace:
>> genericUDF (org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc)
>> colExprMap (org.apache.hadoop.hive.ql.plan.SelectDesc)
>> conf (org.apache.hadoop.hive.ql.exec.vector.VectorSelectOperator)
>> childOperators (org.apache.hadoop.hive.ql.exec.vector.mapjoin.VectorMapJoinInnerStringOperator)
>> childOperators (org.apache.hadoop.hive.ql.exec.vector.VectorSelectOperator)
>> childOperators (org.apache.hadoop.hive.ql.exec.vector.VectorFilterOperator)
>> childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
>> aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)
>> at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:156)
>> at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:670)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClass(SerializationUtilities.java:185)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:118)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:180)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:161)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:39)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:180)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:134)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:40)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:180)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:134)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:40)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:180)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:134)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:40)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:180)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:134)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:40)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:180)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:161)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:39)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>> at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>> at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:686)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:210)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializeObjectByKryo(SerializationUtilities.java:707)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:613)
>> at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:590)
>> at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:468)
>> ... 21 more
>> Caused by: java.lang.ClassNotFoundException: com.test.hiveudf.MapMergeUdf
>> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>> at java.lang.Class.forName0(Native Method)
>> at java.lang.Class.forName(Class.java:348)
>> at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
>> ... 84 more
>>
>> I also applied the suggestions from HIVE-7711: I added a `DoNothingSerializer` and used the `DefaultSerializer` annotation to set it on my MapMergeUdf.
>> Something like:
>>
>> @DefaultSerializer(value = classOf[DoNothingSerializer])
>> class MapMergeUdf extends GenericUDF {
>>
>> Would you guys know how to resolve this issue?
>>
>> Thanks,
>> Bernard
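For reference, the `DoNothingSerializer` mentioned above (per HIVE-7711) would look roughly like this in Scala. This is only a sketch, not a tested implementation: it assumes Hive's shaded Kryo package (`org.apache.hive.com.esotericsoftware.kryo`, as seen in the stack trace), requires a no-arg constructor on the UDF, and elides the UDF body itself, so it will not compile as-is.

```scala
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF
import org.apache.hive.com.esotericsoftware.kryo.{DefaultSerializer, Kryo, Serializer}
import org.apache.hive.com.esotericsoftware.kryo.io.{Input, Output}

// Writes no state on serialization and rebuilds the UDF reflectively on
// deserialization, so Kryo never walks the UDF's internal fields.
class DoNothingSerializer extends Serializer[MapMergeUdf] {
  override def write(kryo: Kryo, output: Output, udf: MapMergeUdf): Unit = ()
  override def read(kryo: Kryo, input: Input, cls: Class[MapMergeUdf]): MapMergeUdf =
    cls.getDeclaredConstructor().newInstance()
}

@DefaultSerializer(value = classOf[DoNothingSerializer])
class MapMergeUdf extends GenericUDF {
  // ... initialize/evaluate/getDisplayString elided ...
}
```

Note that a custom serializer only changes how the plan is (de)serialized; the `ClassNotFoundException` still requires the UDF jar (built against the matching Hive dependencies, as noted in the reply above) to be visible to the executing daemons.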