Hi, I have a TensorFlow 2 model that I need to use with Spark 2.4, but I cannot load it in Spark (Java or Scala).
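For context, a trimmed-down, standalone version of what I am trying to do looks roughly like this (just a sketch: the object name and the signature printout are illustrative, only the SavedModelBundle.load call is what actually fails in the spark-shell session below):

  import org.{tensorflow => tf}

  object LoadXDeepFM {
    def main(args: Array[String]): Unit = {
      // Load the TF2 SavedModel exported from Python; "serve" is the default serving tag.
      val bundle = tf.SavedModelBundle.load("/home/hadoop/xDeepFM", "serve")
      try {
        // load() itself already parses the MetaGraphDef; printing the signatures is
        // only here to confirm the graph was read.
        println(bundle.metaGraphDef().getSignatureDefMap.keySet())
      } finally {
        bundle.close()
      }
    }
  }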
The spark-shell session fails like this:

  scala> import org.{tensorflow => tf}
  import org.{tensorflow=>tf}

  scala> val bundle = tf.SavedModelBundle.load("/home/hadoop/xDeepFM", "serve")
  2021-04-23 07:32:56.223881: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:32] Reading SavedModel from: /home/hadoop/xDeepFM
  2021-04-23 07:32:56.266424: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:55] Reading meta graph with tags { serve }
  2021-04-23 07:32:56.266468: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:93] Reading SavedModel debug info (if present) from: /home/hadoop/xDeepFM
  2021-04-23 07:32:56.346757: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:206] Restoring SavedModel bundle.
  2021-04-23 07:32:56.873838: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:190] Running initialization op on SavedModel bundle at path: /home/hadoop/xDeepFM
  2021-04-23 07:32:56.928656: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:277] SavedModel load for tags { serve }; Status: success: OK. Took 704788 microseconds.
  java.lang.NoSuchMethodError: com.google.protobuf.Parser.parseFrom(Ljava/nio/ByteBuffer;)Ljava/lang/Object;
    at org.tensorflow.proto.framework.MetaGraphDef.parseFrom(MetaGraphDef.java:3067)
    at org.tensorflow.SavedModelBundle.load(SavedModelBundle.java:422)
    at org.tensorflow.SavedModelBundle.access$000(SavedModelBundle.java:59)
    at org.tensorflow.SavedModelBundle$Loader.load(SavedModelBundle.java:68)
    at org.tensorflow.SavedModelBundle.load(SavedModelBundle.java:242)
    ... 49 elided

I have been stuck on this for several days, and *the same model loads correctly in Flink and PySpark.* Any advice is welcome.

References:
https://stackoverflow.com/questions/67276124/com-google-protobuf-parser-parsefrom-method-cant-use-in-spark
https://github.com/tensorflow/java/issues/298

Thanks!
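PS: since the NoSuchMethodError is on com.google.protobuf.Parser, I suspect a protobuf version clash between the jars Spark 2.4 ships and the ones the org.tensorflow.proto classes are generated against. A quick classloader check like the one below (just a sketch to run in spark-shell; it is a standard Java lookup, not something from the references above) should show which jars those classes are actually resolved from:

  // Which jar provides the protobuf Parser class that Spark picked up?
  classOf[com.google.protobuf.Parser[_]].getProtectionDomain.getCodeSource.getLocation
  // And which jar provides the generated TF proto classes?
  classOf[org.tensorflow.proto.framework.MetaGraphDef].getProtectionDomain.getCodeSource.getLocation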