davidradl commented on code in PR #171:
URL: https://github.com/apache/flink-connector-kafka/pull/171#discussion_r2096097472


##########
flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/lineage/TypeDatasetFacet.java:
##########
@@ -1,11 +1,26 @@
 package org.apache.flink.connector.kafka.lineage;
 
 import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.api.common.serialization.SerializationSchema;
 import org.apache.flink.api.common.typeinfo.TypeInformation;
 import org.apache.flink.streaming.api.lineage.LineageDatasetFacet;
 
+import java.util.Optional;
+
 /** Facet definition to contain type information of source and sink. */
 @PublicEvolving
 public interface TypeDatasetFacet extends LineageDatasetFacet {
     TypeInformation getTypeInformation();
+
+    /**
+     * Sometimes a sink implementing {@link TypeDatasetFacetProvider} is not able to extract the
+     * type. This happens for AvroSerializationSchema due to type erasure. In this case, it makes
+     * sense to expose the SerializationSchema to the lineage consumer so that it can use it to
+     * extract the type information.
+     *
+     * @return the serialization schema, if available
+     */
+    default Optional<SerializationSchema> getSerializationSchema() {

Review Comment:
   Shouldn't this interface live in Flink? I assume a File connector using the Avro format could hit this issue as well.
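
   For context, the type-erasure limitation the Javadoc refers to can be demonstrated with plain Java (a minimal sketch, not Flink-specific): at runtime a generic type parameter is erased, so a consumer holding only a `SerializationSchema<T>` reference cannot recover `T` reflectively, which is why exposing the schema object itself lets the lineage consumer inspect concrete subclasses instead.

   ```java
   import java.util.ArrayList;

   public class ErasureDemo {
       public static void main(String[] args) {
           ArrayList<String> strings = new ArrayList<>();
           ArrayList<Integer> ints = new ArrayList<>();
           // Type parameters are erased at runtime: both instances share
           // the exact same runtime Class object, so the element type
           // cannot be recovered from the instance alone.
           System.out.println(strings.getClass() == ints.getClass()); // prints "true"
       }
   }
   ```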



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
