Hi everyone,

I need some help with deploying multiple jobs from a single main function in 
Application mode using the Flink Kubernetes Operator. As per the documentation [1], 
it should be possible to use multiple `executeAsync()` calls to deploy multiple 
jobs from the same file. This is indeed the case when running the job locally via 
the CLI with something like `/bin/flink run -pym main -pyfs /project/` (I'm using 
PyFlink, by the way), and I can see multiple jobs running in the UI. However, when 
I deploy the same job using the Flink Kubernetes Operator, only the first job gets 
submitted. The second job is never submitted, although the code leading up to its 
`executeAsync()` call does get executed.
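For reference, my main module is structured roughly like this (a minimal sketch with illustrative pipeline names, not my real code; each pipeline builds its own environment and is submitted with `execute_async()`):

```python
# Sketch of a main module submitting two jobs from one entry point.
# Pipeline contents are placeholders; the point is the two execute_async() calls.
from pyflink.datastream import StreamExecutionEnvironment


def build_first_pipeline():
    env = StreamExecutionEnvironment.get_execution_environment()
    env.from_collection([1, 2, 3]).print()
    return env


def build_second_pipeline():
    env = StreamExecutionEnvironment.get_execution_environment()
    env.from_collection(["a", "b", "c"]).print()
    return env


if __name__ == "__main__":
    # Submit the first job without blocking on its completion.
    build_first_pipeline().execute_async("job-one")
    # This code is reached in both deployment modes, but the second job
    # only shows up in the UI when running via the local CLI.
    build_second_pipeline().execute_async("job-two")
```
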

This is a minimal representation of the deployment manifest that I tried to run:

apiVersion: flink.apache.org/v1beta1
kind: FlinkDeployment
metadata:
    name: flink-job
spec:
    image: flinkimage
    imagePullPolicy: IfNotPresent
    flinkVersion: v1_17
    flinkConfiguration:
        taskmanager.numberOfTaskSlots: "1"
        state.savepoints.dir: hdfs://...
        state.checkpoints.dir: hdfs://...
    serviceAccount: flink
    jobManager:
        resource:
            memory: "1024m"
            cpu: 0.5
    taskManager:
        resource:
            memory: "1024m"
            cpu: 0.5
    job:
        jarURI: local:///opt/flink/opt/flink-python_2.12-1.17.0.jar
        entryClass: "org.apache.flink.client.python.PythonDriver"
        args: ["python", "-pym", "main", "-pyfs", "/project/"]
        parallelism: 1
        upgradeMode: savepoint
        state: running

Any help would be greatly appreciated. I'm using Flink v1.17 and Flink 
Kubernetes Operator v1.7.0.

[1] https://nightlies.apache.org/flink/flink-docs-release-1.17/docs/deployment/overview/#application-mode


Thanks,
Sunny
