Hi Manas,

I think the documentation assumes that you first start a session cluster and then submit jobs to it from outside the Docker images, using a local Flink distribution as the client. If your jobs are included in the Docker image, you can instead exec into the master (JobManager) container and submit them from within, using the Flink distribution that is already part of the image.
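A minimal sketch of what that could look like, assuming the cluster was started as in the guide with the JobManager container named "jobmanager", and your job artifacts were baked into the custom image, e.g. under /opt/flink/usrlib (the container name and the file names here are placeholders, so adjust them to your setup):

$ docker exec -it jobmanager /bin/bash   # open a shell inside the JobManager container
$ /opt/flink/bin/flink run /opt/flink/usrlib/my-java-job.jar             # submit one of the Java jobs
$ /opt/flink/bin/flink run --python /opt/flink/usrlib/my_pyflink_job.py  # submit one of the PyFlink jobs

Note that the flink CLI is only a client that talks to the cluster's REST endpoint, so it makes no difference whether it runs from a local distribution or from inside the container; the local distribution shown in the guide is a convenience, not a requirement.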
Cheers,
Till

On Tue, Feb 16, 2021 at 1:00 PM Manas Kale <manaskal...@gmail.com> wrote:

> Hi,
>
> I have a project that is a set of 6 jobs out of which 4 are written in
> Java and 2 are written in pyFlink. I want to dockerize these so that all 6
> can be run in a single Flink session cluster.
>
> I have been able to successfully set up the JobManager and TaskManager
> containers as per [1] after creating a custom Docker image that has Python.
> For the last step, the guide asks us to submit the job using a local
> distribution of Flink:
>
> $ ./bin/flink run ./examples/streaming/TopSpeedWindowing.jar
>
> I am probably missing something here because I have the following
> questions:
> - Why do I need to use a local distribution to submit a job?
> - Why can't I use the Flink distribution that already exists within the
>   images?
> - How do I submit a job using the Docker image's distribution?
>
> [1]
> https://ci.apache.org/projects/flink/flink-docs-stable/deployment/resource-providers/standalone/docker.html#starting-a-session-cluster-on-docker