Hey, you are right. The issue was a Flink/PyFlink version mismatch: it turned
out Flink 1.12 was installed on the cluster, and downgrading PyFlink from
1.12.3 to 1.12 fixed the issue.
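
In case anyone else hits the same error, a quick way to spot the mismatch is
to print the client-side PyFlink version and compare it with the Flink
distribution on the cluster (e.g. what bin/flink --version reports on a
cluster node). This is just a rough sketch; it assumes the installed package
exposes its version as pyflink.version.__version__:

    # Sketch: print the locally installed PyFlink version so it can be
    # compared with the Flink version deployed on the YARN cluster.
    from pyflink import version

    print("PyFlink version:", version.__version__)
    # If the two differ, pin the client to the cluster's release, e.g.
    # pip install "apache-flink==<cluster version>"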

Thank you for your help.

On Fri, 8 Oct 2021, 04:04 Dian Fu, <dian0511...@gmail.com> wrote:

> Hi Kamil,
>
> I have checked that this method exists in 1.12.3:
> https://github.com/apache/flink/blob/release-1.12.3/flink-python/src/main/java/org/apache/flink/python/util/PythonConfigUtil.java#L137
>
> Could you double check whether the Flink version is 1.12.3 (not just the
> PyFlink version)?
>
> Regards,
> Dian
>
>
>
> On Tue, Oct 5, 2021 at 11:34 PM Nicolaus Weidner <
> nicolaus.weid...@ververica.com> wrote:
>
>> Hi Kamil,
>>
>> On Tue, Oct 5, 2021 at 9:03 AM Kamil ty <kamilt...@gmail.com> wrote:
>>
>>> Hello,
>>>
>>> I'm trying to run a pyflink job in cluster mode (with yarn). My job
>>> contains source and sink definitions using Table API which are converted to
>>> a datastream and back. Unfortunately I'm getting an unusual exception at:
>>> *table = t_env.from_data_stream(ds, 'user_id, first_name, last_name).*
>>>
>>
>> Just to make sure: Is the missing quotation mark just a typo in your
>> mail, or your code (right before the closing bracket)?
>> *table = t_env.from_data_stream(ds, 'user_id, first_name, last_name')*
>>
>> Best regards,
>> Nico
>>
>
