Hi Luan,
Your error occurs because the Python interpreter versions used on the
client side and the cluster side are inconsistent. Pickle protocol 5 was
introduced in Python 3.8, and interpreters before 3.8 default to an older
protocol. If the two versions do not match, an error will be reported as shown in yo
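The skew can be inspected directly from each interpreter. A minimal sketch (the protocol numbers come from the pickle documentation, not from this thread):

```python
import pickle
import sys

# The default pickle protocol depends on the interpreter version:
# protocol 5 was introduced in Python 3.8, and older interpreters
# cannot read payloads written with it.
print(sys.version.split()[0])
print("default protocol:", pickle.DEFAULT_PROTOCOL)
print("highest protocol:", pickle.HIGHEST_PROTOCOL)

# Pinning an explicit protocol both sides understand avoids the
# mismatch; protocol 4, for example, is readable on Python 3.4+.
payload = pickle.dumps({"answer": 42}, protocol=4)
print(pickle.loads(payload))
```

Running this on both the client and the cluster Python shows immediately whether the two sides agree.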
Actually, I had built/compiled
- pyarrow==2.0.0 (tests skipped)
- apache-beam==2.27.0 (tests skipped)
on Python 3.9, and tested with an example Python job (bin/flink run
-pyclientexec python3.7 -pyexec python3.9 -py
examples/python/table/word_count.py),
but got the following exception:
Caused by: java.util.co
Hi Martijn and Luan,
As of now, the main reason PyFlink has not declared support for Python
3.9 is that apache-beam, and the versions of numpy and pyarrow that
apache-beam depends on, do not provide corresponding whl
packages for Python 3.9. Users need a source installation, but
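This is also why PyFlink's packaging rejects unsupported interpreters up front. A minimal sketch of that kind of version guard (the `SUPPORTED_MINORS` set and `check_python_version` name are illustrative, not PyFlink's actual code; PyFlink releases of that era documented support for Python 3.6-3.8):

```python
import sys

# Illustrative list: PyFlink releases at the time documented support
# for Python 3.6, 3.7 and 3.8 only.
SUPPORTED_MINORS = {(3, 6), (3, 7), (3, 8)}

def check_python_version(version_info=None):
    """Raise if the interpreter is outside the supported range."""
    version_info = version_info or sys.version_info
    if tuple(version_info[:2]) not in SUPPORTED_MINORS:
        raise RuntimeError(
            "Python %d.%d is not supported; this PyFlink release "
            "requires 3.6, 3.7 or 3.8" % (version_info[0], version_info[1])
        )

check_python_version((3, 7, 0))  # passes silently; (3, 9, 0) would raise
```

Until the dependency wheels exist for 3.9, lifting such a guard alone would only move the failure to runtime, as the stack trace above shows.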
Hi Luan,
According to the documentation, Python 3.9 is indeed not currently
supported. I briefly checked the Jira tickets and couldn't find one
about adding support for this, so I've created
https://issues.apache.org/jira/browse/FLINK-27058 for that.
@dian0511...@gmail.com @hxbks...@gmail.co
Hi,
Currently I need to run PyFlink UDFs on Python 3.9, which is not
supported right now.
I tried building
- pyarrow==2.0.0
- apache-beam==2.27.0
on Python 3.9 and tested Python jobs, but failed.
Were there any discussions or git branches about Python 3.9 before? (I didn't
find any in this dev list)
so I can c