Hi Dian,
Thank you for your reply.
I was able to solve this issue by setting an additional environment variable,
FLINK_OPT_DIR = "C:\flink1_13_3\opt", in Windows.
Now it works fine.
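For anyone hitting the same problem, the fix can be sketched in Python before the job is launched. This is a minimal sketch only; the path is the one from this thread and must match your own Flink installation.

```python
import os

# Sketch of the fix described above, assuming Flink 1.13.3 is installed
# under C:\flink1_13_3 (path taken from this thread): point FLINK_OPT_DIR
# at the Flink "opt" directory before PyFlink is started.
os.environ["FLINK_OPT_DIR"] = r"C:\flink1_13_3\opt"
print(os.environ["FLINK_OPT_DIR"])
```

Setting the variable system-wide in Windows (as done here) works just as well; the point is that it must be visible to the process that starts PyFlink.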
Regards,
Christian
From: Dian Fu
Sent: Thursday, 28 October 2021, 10:21
To: Schmid Christian
Hi,
When I execute a PyFlink job locally in a mini cluster, everything works
fine:
(env)
user@bla /cygdrive/c/flink1_13_3/examples/python/table/batch
$ python word_count.py
Results directory: C:\cygwin64\tmp/result
But when I try to execute the PyFlink job in a remote cluster, the job
exe
Hi all,
According to the official documentation (Table API / JDBC SQL Connector
v.1.14.0) "the JDBC connector allows reading data from and writing data into
any relational databases with a JDBC driver".
At the moment we are using SQL Server in conjunction with Flink and Java, which
works perfectly
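For context, a JDBC table definition along the lines the documentation describes might look like the sketch below. All connection values (URL, database, table name, credentials) are hypothetical placeholders, not values from this thread, and actually running such a DDL requires flink-connector-jdbc plus a SQL Server JDBC driver (e.g. mssql-jdbc) on the classpath.

```python
# Hedged sketch only: a Flink SQL DDL for a JDBC-backed table, as the
# JDBC SQL Connector documentation describes. Every connection value
# below is a hypothetical placeholder.
ddl = """
CREATE TABLE orders (
    id INT,
    amount DOUBLE
) WITH (
    'connector' = 'jdbc',
    'url' = 'jdbc:sqlserver://localhost:1433;databaseName=example_db',
    'table-name' = 'orders',
    'username' = 'example_user',
    'password' = 'example_password'
)
"""
print(ddl)
```

Such a DDL would normally be passed to a `TableEnvironment.execute_sql()` call (PyFlink) or its Java equivalent.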