mas-chen commented on code in PR #1:
URL: https://github.com/apache/flink-connector-kafka/pull/1#discussion_r1049336717


##########
flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/SQLClientKafkaITCase.java:
##########
@@ -82,15 +86,27 @@ private static Configuration getConfiguration() {
         return flinkConfig;
     }
 
-    @Rule

Review Comment:
   I actually propose to get rid of this test, since 
SQLClientSchemaRegistryITCase already covers the SQL Kafka table source and sink.
   
   This test only exercises, end to end, the json format -> avro format -> csv 
format via the file sink. However, now that we are using containerized Flink, 
the output file is written inside the container, and it is difficult to extract 
it from the taskmanager containers to assert on its contents.
   
   We could easily fix this test if we could use 
LocalStandaloneFlinkResourceFactory, but that class lives in 
flink-end-to-end-test-common.
   
   On top of that, I had to create an extra module just to produce the jar of a 
custom SQL function (see the flink-sql-toolbox module).
   
   Alternatively, I could rewrite and simplify the test to cover only json 
format -> avro format and assert on the Kafka output contents, though that is 
essentially what SQLClientSchemaRegistryITCase already does.
   
   @zentol @MartijnVisser WDYT?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org