Supporting multiple jobs in the same application would be a big challenge
for us.
For example,
1. If some jobs have finished, we should not run them again when the
JobManager fails over. This means we need to store the finished jobs in the
HA services.
2. Some jobs are only run under specific conditions
Hi,
We are using HA mode, so it looks like multiple jobs are not an option for us.
That makes sense! Thanks for your help, everyone!
Thanks,
Qihua
On Wed, Jun 23, 2021 at 7:28 PM Yang Wang wrote:
Robert is right. We can only support single-job submission in Application
mode when HA mode is enabled.
This is a known limitation of current application mode implementation.
Best,
Yang
On Thu, Jun 24, 2021 at 3:54 AM, Robert Metzger wrote:
Thanks a lot for checking again. I just started Flink in Application mode
with a jar that contains two "executeAsync" submissions, and indeed two
jobs are running.
I think the problem in your case is that you are using High Availability (I
guess, because there are log statements from ZooKeeper).
Hi Robert,
But I saw that the Flink docs say Application Mode can run multiple jobs? Or
did I misunderstand it?
https://ci.apache.org/projects/flink/flink-docs-release-1.11/ops/deployment/
"Compared to the Per-Job mode, the Application Mode allows the
submission of applications consisting of multiple jobs."
Hi Qihua,
Application Mode is meant for executing one job at a time, not multiple
jobs on the same JobManager.
If you want to do that, you need to use session mode, which allows managing
multiple jobs on the same JobManager.
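As a rough sketch of the two submission styles being contrasted here (the jar
names and paths below are placeholders, and the commands assume a local Flink
distribution):

```shell
# Session mode: one long-running cluster; submit as many jobs as you like.
./bin/start-cluster.sh
./bin/flink run ./job-a.jar
./bin/flink run ./job-b.jar    # second job on the same JobManager

# Application mode: the cluster exists to run one application's main() method.
./bin/flink run-application -t yarn-application ./my-app.jar
```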
On Tue, Jun 22, 2021 at 10:43 PM Qihua Yang wrote:
Hi Arvid,
Do you know if I can start multiple jobs for a single Flink application?
Thanks,
Qihua
On Thu, Jun 17, 2021 at 12:11 PM Qihua Yang wrote:
Hi,
I am using application mode.
Thanks,
Qihua
On Thu, Jun 17, 2021 at 12:09 PM Arvid Heise wrote:
Hi Qihua,
Which execution mode are you using?
On Thu, Jun 17, 2021 at 6:48 PM Qihua Yang wrote:
Hi,
Thank you for your reply. What I want is a Flink app with multiple jobs, each
job managing one stream. Currently our Flink app has only one job that
manages multiple streams.
I did try env.executeAsync(), but it still doesn't work. From the log, when
the second executeAsync() was called, it shows "Job
Hi,
env.execute("Job 1"); is a blocking call. You either have to use
executeAsync() or a separate thread to submit the second job. (If Job 1
finishes, then sequential execution would also work.)
However, I think what you actually want to do is to use the same env with 2
topologies a
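The point about blocking submission can be sketched in plain Java futures (an
analogy only, not the Flink API; `SubmitDemo` and `submitTwoAsync` are
hypothetical names): an async submission returns immediately, so both jobs
get submitted even though neither has finished, whereas blocking on the first
unbounded streaming job would mean the second submission is never reached.

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.CountDownLatch;

public class SubmitDemo {
    /** Submits two simulated streaming jobs asynchronously and returns
     *  how many were submitted before either of them finished. */
    static int submitTwoAsync() throws Exception {
        List<String> submitted = new CopyOnWriteArrayList<>();
        CountDownLatch stop = new CountDownLatch(1);
        // Analogous to env.executeAsync(): submission returns immediately,
        // while the "job" keeps running in the background.
        CompletableFuture<Void> job1 = submitJob("Job 1", submitted, stop);
        CompletableFuture<Void> job2 = submitJob("Job 2", submitted, stop);
        int count = submitted.size(); // both submitted while still running
        // Had we blocked here on job1 (analogous to env.execute()), an
        // unbounded streaming job would never return control to this thread.
        stop.countDown();             // let the simulated jobs finish
        CompletableFuture.allOf(job1, job2).join();
        return count;
    }

    static CompletableFuture<Void> submitJob(String name, List<String> log,
                                             CountDownLatch stop) {
        log.add(name); // record the submission
        return CompletableFuture.runAsync(() -> {
            try { stop.await(); } catch (InterruptedException ignored) { }
        });
    }

    public static void main(String[] args) throws Exception {
        System.out.println(submitTwoAsync()); // prints 2
    }
}
```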
Hi,
Does anyone know how to run multiple jobs in the same Flink application?
I did a simple test. The first job started and I did see its log message,
but the second job never started, even though I saw its log message.
public void testJobs() throws Exception {
    StreamExecutionEnvironment env =
        StreamExecutionEnvironment.getExecutionEnvironment();