I am trying to read from Kafka using the PyFlink Table API on EMR 5.30 with Flink 1.10 (and I have reproduced the error with 1.11). I am getting the following error using either `flink run -py` or pyflink-shell.sh (the error message below was generated in the pyflink shell):
```
>>> Kafka()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/tmp/8772bc6d-f275-4ecf-bdcc-d2c34d910473pyflink.zip/pyflink/table/descriptors.py", line 705, in __init__
TypeError: 'JavaPackage' object is not callable
>>> Csv()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/tmp/8772bc6d-f275-4ecf-bdcc-d2c34d910473pyflink.zip/pyflink/table/descriptors.py", line 398, in __init__
TypeError: 'JavaPackage' object is not callable
>>> Json()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/tmp/8772bc6d-f275-4ecf-bdcc-d2c34d910473pyflink.zip/pyflink/table/descriptors.py", line 553, in __init__
TypeError: 'JavaPackage' object is not callable
```

I assume this is telling me that I don't have the Flink Kafka connector jar, but I have not been able to figure out how to provide the right jars. I have tried `flink run -j /path/to/flink-connector-kafka-base_2.11-1.10.0.jar`, `flink run --classpath /path/to/dependency/dir`, and `flink run -Dyarn.provided.lib.dirs=hdfs:///flink/kafka/dependencies`. Is there a way to provide the jar dependencies when submitting a Python job (or does this error indicate something else)?

Thanks,
Jesse
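
P.S. For completeness, the table definition I am trying to build looks roughly like the sketch below. The topic name, broker address, and field names are placeholders; the failure happens as soon as the `Kafka()` / `Json()` descriptors are constructed.

```python
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import StreamTableEnvironment, EnvironmentSettings, DataTypes
from pyflink.table.descriptors import Kafka, Json, Schema

env = StreamExecutionEnvironment.get_execution_environment()
t_env = StreamTableEnvironment.create(
    env,
    environment_settings=EnvironmentSettings.new_instance()
        .in_streaming_mode().use_blink_planner().build())

# Register a Kafka-backed source table. Constructing Kafka() / Json()
# is where "'JavaPackage' object is not callable" is raised for me.
t_env.connect(
        Kafka()
        .version("universal")
        .topic("my-topic")                              # placeholder topic
        .property("bootstrap.servers", "broker:9092")   # placeholder broker
        .start_from_latest()) \
    .with_format(Json()) \
    .with_schema(
        Schema()
        .field("id", DataTypes.STRING())
        .field("ts", DataTypes.TIMESTAMP(3))) \
    .in_append_mode() \
    .create_temporary_table("kafka_source")
```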