Hi Furcy,

Thanks for sharing. I modified my code to mark the map variables "transient" but still got the same error. Here is the code:
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDTF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

// Jackson imports assumed to be the copy bundled inside hive-exec
// (a standalone Jackson ObjectMapper/TypeReference works the same way).
import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.type.TypeReference;

public class fun_name extends GenericUDTF {

    private PrimitiveObjectInspector stringOI = null;
    transient Map<String, Map<String, String>> mapObject;
    transient Map<String, String> eventDetails;

    @Override
    public void close() throws HiveException {
    }

    @Override
    public StructObjectInspector initialize(ObjectInspector[] args) throws UDFArgumentException {
        if (args.length != 1) {
            throw new UDFArgumentException("fun_name() takes exactly one argument");
        }
        // Reject anything that is not a primitive string. Note this needs ||,
        // not &&, so a non-primitive argument is rejected before the cast.
        if (args[0].getCategory() != ObjectInspector.Category.PRIMITIVE
                || ((PrimitiveObjectInspector) args[0]).getPrimitiveCategory()
                        != PrimitiveObjectInspector.PrimitiveCategory.STRING) {
            throw new UDFArgumentException("fun_name() takes a string as a parameter");
        }
        stringOI = (PrimitiveObjectInspector) args[0];

        // Output schema: two string columns.
        ArrayList<String> fieldNames = new ArrayList<String>(2);
        ArrayList<ObjectInspector> fieldOIs = new ArrayList<ObjectInspector>(2);
        fieldNames.add("field1");
        fieldNames.add("field2");
        fieldOIs.add(PrimitiveObjectInspectorFactory.javaStringObjectInspector);
        fieldOIs.add(PrimitiveObjectInspectorFactory.javaStringObjectInspector);
        return ObjectInspectorFactory.getStandardStructObjectInspector(fieldNames, fieldOIs);
    }

    @Override
    public void process(Object[] record) throws HiveException {
        ObjectMapper mapper = new ObjectMapper();
        try {
            final String document = (String) stringOI.getPrimitiveJavaObject(record[0]);
            mapObject = mapper.readValue(document,
                    new TypeReference<Map<String, Map<String, String>>>() {});
            for (String someId : mapObject.keySet()) {
                eventDetails = mapObject.get(someId);
                List<String> forwardObj = new ArrayList<String>();
                forwardObj.add(someId);
                forwardObj.add(eventDetails.get("aFieldName"));
                forward(forwardObj);
            }
        } catch (IOException e) {
            // readValue can fail on malformed JSON; the snippet as posted had a
            // try with no catch, which would not compile.
            throw new HiveException(e);
        }
    }
}

To compile and build the jar:

javac -Xlint:deprecation fun_name.java
jar cvf from_json.jar from_json.class 'fun_name$1.class'

Adding the jar and creating the function in Hive succeeded:

add jar fun_name.jar;
create temporary function fun_name as 'fun_name';

But when I run a query, I get this error:

Vertex failed, vertexName=Map 1, vertexId=vertex_1407434664593_24842_1_00,
diagnostics=[Vertex Input: impression_test initializer failed.,
org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: fun_name
Serialization trace:
genericUDTF (org.apache.hadoop.hive.ql.plan.UDTFDesc)
conf (org.apache.hadoop.hive.ql.exec.UDTFOperator)
childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)]
DAG failed due to vertex failure. failedVertices:1 killedVertices:0
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask

Could it be something else?

On Tue, Sep 9, 2014 at 1:31 AM, Furcy Pin <furcy....@flaminem.com> wrote:

> Hi,
>
> I think I encountered this kind of serialization problem when writing UDFs.
> Usually, marking every field of the UDF as *transient* does the trick.
>
> I guess the error means that Kryo tries to serialize the UDF class and
> everything inside it; by marking the fields as transient you ensure that
> it will not, and that they will be instantiated in the default constructor
> or during the call to initialize().
>
> Please keep me informed whether it works or not.
>
> Regards,
>
> Furcy
>
>
> 2014-09-09 1:44 GMT+02:00 Echo Li <echo...@gmail.com>:
>
>> I wrote a UDTF in Hive 0.13. The function parses a column which is a JSON
>> string and returns a table.
>> The function compiles successfully after adding
>> hive-exec-0.13.0.2.1.2.1-471.jar to the classpath. However, when the jar
>> is added to Hive, a function is created from it, and I then run a query
>> using that function, I get this error:
>>
>> org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find
>> class: class_name
>>
>> I went through all the steps in a lower version of Hive (0.10) and
>> everything works fine. I searched around and it seems this is caused by
>> the "Kryo" serde, so my question is: is there a fix, and where can I
>> find it?
>>
>> Thank you.
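
For reference, a minimal self-contained sketch of the transient-field pattern Furcy describes above, assuming the Hive 0.13 GenericUDTF API; the class and field names here are illustrative, not the original poster's. Note that the advice is to mark every field transient, the ObjectInspector included, and to rebuild each one in initialize(), which runs again on each task after the plan is deserialized:

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDTF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

// Illustrative sketch: every field is transient, so Kryo skips them when it
// serializes the operator plan, and each field is rebuilt in initialize().
public class TransientFieldsUDTF extends GenericUDTF {

    private transient PrimitiveObjectInspector inputOI; // rebuilt per task
    private transient Map<String, String> scratch;      // never serialized

    @Override
    public StructObjectInspector initialize(ObjectInspector[] args) throws UDFArgumentException {
        if (args.length != 1 || args[0].getCategory() != ObjectInspector.Category.PRIMITIVE) {
            throw new UDFArgumentException("expects one primitive argument");
        }
        inputOI = (PrimitiveObjectInspector) args[0];
        scratch = new HashMap<String, String>(); // re-created here, not shipped in the plan

        // Output schema: one string column.
        List<String> names = new ArrayList<String>();
        List<ObjectInspector> ois = new ArrayList<ObjectInspector>();
        names.add("value");
        ois.add(PrimitiveObjectInspectorFactory.javaStringObjectInspector);
        return ObjectInspectorFactory.getStandardStructObjectInspector(names, ois);
    }

    @Override
    public void process(Object[] record) throws HiveException {
        // Echo one string column back out; a real UDTF would do its work here.
        Object value = inputOI.getPrimitiveJavaObject(record[0]);
        forward(new Object[] { String.valueOf(value) });
    }

    @Override
    public void close() throws HiveException {
    }
}

One caveat: transient fields only prevent failures from serializing field contents. A KryoException of the form "Unable to find class" is raised while deserializing the plan, so the UDTF class itself still has to be resolvable on the task side.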