[ 
https://issues.apache.org/jira/browse/FLINK-27830?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

TongMeng updated FLINK-27830:
-----------------------------
    Description: 
I use the command
{code:java}
./flink run --python /home/ubuntu/pyflink/main.py 
/home/ubuntu/pyflink/xmltest.xml /home/ubuntu/pyflink/task.xml --pyFliles 
/home/ubuntu/pyflink/logtest.py /home/ubuntu/pyflink/KafkaSource.py 
/home/ubuntu/pyflink/KafkaSink.py /home/ubuntu/pyflink/pyFlinkState.py 
/home/ubuntu/pyflink/projectInit.py /home/ubuntu/pyflink/taskInit.py 
/home/ubuntu/pyflink/UDF1.py {code}
to submit my PyFlink job.
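For reference, my reading of the CLI documentation is that the flag is spelled {{--pyFiles}} and takes a comma-separated list, with the program arguments placed after the options; a rough sketch of that form (same files as above, the comma-separated layout is my assumption about what the CLI expects):
{code:bash}
./flink run \
  --python /home/ubuntu/pyflink/main.py \
  --pyFiles /home/ubuntu/pyflink/logtest.py,/home/ubuntu/pyflink/KafkaSource.py,/home/ubuntu/pyflink/KafkaSink.py,/home/ubuntu/pyflink/pyFlinkState.py,/home/ubuntu/pyflink/projectInit.py,/home/ubuntu/pyflink/taskInit.py,/home/ubuntu/pyflink/UDF1.py \
  /home/ubuntu/pyflink/xmltest.xml /home/ubuntu/pyflink/task.xml
{code}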

The error happens at:
{code:java}
st_env.create_statement_set().add_insert_sql(f"insert into algorithmsink select 
{taskInfo}(line, calculationStatusMap, gathertime, storetime, rawData, 
terminal, deviceID, recievetime, car, sendtime, baseInfoMap, workStatusMap, 
`timestamp`) from mysource").execute().wait()
{code}
The attached error.txt contains the exceptions. It seems that something is
wrong with Apache Beam.
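For context, a minimal sketch of how main.py builds this statement (the UDF body, its signature, and the registered name below are placeholders; the real UDF lives in UDF1.py, and the {{mysource}} / {{algorithmsink}} tables are created in KafkaSource.py / KafkaSink.py):
{code:python}
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import StreamTableEnvironment, DataTypes
from pyflink.table.udf import udf

env = StreamExecutionEnvironment.get_execution_environment()
st_env = StreamTableEnvironment.create(env)

# Placeholder UDF; the real implementation is defined in UDF1.py.
@udf(result_type=DataTypes.STRING())
def task_udf(line):
    return line

# taskInfo holds the registered function name that is spliced into the INSERT statement.
taskInfo = "task_udf"
st_env.create_temporary_function(taskInfo, task_udf)

# 'mysource' and 'algorithmsink' are assumed to be registered already (Kafka source/sink DDL).
st_env.create_statement_set() \
    .add_insert_sql(f"insert into algorithmsink select {taskInfo}(line) from mysource") \
    .execute() \
    .wait()
{code}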

When I run my job with the python command (in standalone mode instead of
submitting it to the Flink cluster), it works well.
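Concretely, the local run that works is along these lines (I assume the same program arguments as in the {{flink run}} command above):
{code:bash}
python /home/ubuntu/pyflink/main.py /home/ubuntu/pyflink/xmltest.xml /home/ubuntu/pyflink/task.xml
{code}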


> My Pyflink job could not submit to Flink cluster
> ------------------------------------------------
>
>                 Key: FLINK-27830
>                 URL: https://issues.apache.org/jira/browse/FLINK-27830
>             Project: Flink
>          Issue Type: Bug
>          Components: API / Python
>    Affects Versions: 1.13.0
>            Reporter: TongMeng
>            Priority: Major
>         Attachments: error.txt
>
>
> I use the command
> {code:java}
> ./flink run --python /home/ubuntu/pyflink/main.py 
> /home/ubuntu/pyflink/xmltest.xml /home/ubuntu/pyflink/task.xml --pyFliles 
> /home/ubuntu/pyflink/logtest.py /home/ubuntu/pyflink/KafkaSource.py 
> /home/ubuntu/pyflink/KafkaSink.py /home/ubuntu/pyflink/pyFlinkState.py 
> /home/ubuntu/pyflink/projectInit.py /home/ubuntu/pyflink/taskInit.py 
> /home/ubuntu/pyflink/UDF1.py {code}
> to submit my PyFlink job.
> The error happens at:
> {code:java}
> st_env.create_statement_set().add_insert_sql(f"insert into algorithmsink 
> select {taskInfo}(line, calculationStatusMap, gathertime, storetime, rawData, 
> terminal, deviceID, recievetime, car, sendtime, baseInfoMap, workStatusMap, 
> `timestamp`) from mysource").execute().wait()
> {code}
> The attached error.txt contains the exceptions. It seems that something is
> wrong with Apache Beam.
> When I run my job with the python command (in standalone mode instead of
> submitting it to the Flink cluster), it works well.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)