Hi Manas,

Since Flink 1.9, the entire PyFlink architecture has been redesigned, so the
method described in that link won't work. Instead, you can use the more
convenient DDL[1] or descriptor API[2] to read Kafka data. You can also refer
to the common questions about PyFlink[3].
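
To illustrate the DDL approach, here is a minimal sketch (the topic, group
id, broker address, and field names below are placeholders, and the property
keys follow the style of the Flink 1.10 Kafka connector documentation):

from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import StreamTableEnvironment

env = StreamExecutionEnvironment.get_execution_environment()
t_env = StreamTableEnvironment.create(env)

# Register a Kafka source table via DDL. The Kafka connector JAR still has
# to be available to the cluster, e.g. in Flink's lib/ directory.
t_env.sql_update("""
    CREATE TABLE kafka_source (
        id BIGINT,
        msg STRING
    ) WITH (
        'connector.type' = 'kafka',
        'connector.version' = 'universal',
        'connector.topic' = 'my_topic',
        'connector.properties.bootstrap.servers' = 'localhost:9092',
        'connector.properties.group.id' = 'my_group',
        'connector.startup-mode' = 'latest-offset',
        'format.type' = 'json'
    )
""")

# The registered table can then be queried like any other table. Writing to
# Kafka works the same way: declare a sink table with CREATE TABLE, issue an
# INSERT INTO via sql_update, and call t_env.execute("job_name").
result = t_env.sql_query("SELECT id, msg FROM kafka_source")

The descriptor API in [2] builds the same table programmatically via
t_env.connect(Kafka()...).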

[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.10/dev/table/sql/create.html#run-a-create-statement
[2]
https://ci.apache.org/projects/flink/flink-docs-release-1.10/dev/table/connect.html#kafka-connector
[3]
https://ci.apache.org/projects/flink/flink-docs-release-1.10/dev/table/python/common_questions.html

Best,
Xingbo

Manas Kale <manaskal...@gmail.com> wrote on Mon, Jun 29, 2020 at 8:10 PM:

> Hi,
> I want to consume from and write to Kafka using Flink's Python API.
>
> The only way I found to do this was through this question on SO
> <https://stackoverflow.com/questions/52744277/apache-flink-kafka-connector-in-python-streaming-api-cannot-load-user-class>,
> where the user essentially copies the Flink Kafka connector JARs into the
> Flink runtime's lib/ directory.
>
>    - Is this the recommended method to do this? If not, what is?
>    - Is there any official documentation for using Kafka with PyFlink? Is
>    this officially supported?
>    - How does the method described in the link work? Does the Flink
>    runtime load and expose all JARs in lib/ to the Python script? Can I write
>    custom operators in Java and use them through Python?
>
> Thanks,
> Manas
>
