I didn't say that the main should return the ExecutionEnvironment.
You can define and execute as many programs in a main function as you like.
The program can be defined somewhere else, e.g., in a method that
receives an ExecutionEnvironment and attaches a program to it:

public void buildMyProgram(ExecutionEnvironment env) {
  DataSet<String> lines = env.readTextFile(...);
  // do something
  lines.writeAsText(...);
}

That method could be invoked from main():

public static void main(String[] args) {
  ExecutionEnvironment env = ...

  if (...) {
    buildMyProgram(env);
  } else {
    buildSomeOtherProg(env);
  }

  env.execute();

  // run some more programs
}
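If you want something closer to Hadoop's ProgramDriver, you can build a small dispatcher yourself. Here is a minimal sketch in plain Java (the class and method names are hypothetical, not a Flink or Hadoop API): register each job under a name, then pick one at runtime via args[0]. Each registered Runnable would create its own ExecutionEnvironment, define its program, and call execute().

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical ProgramDriver-style dispatcher: maps job names to runnable jobs
// and selects one at runtime by its name (args[0]).
public class JobDriver {

  private final Map<String, Runnable> jobs = new LinkedHashMap<>();

  // Register a job under a name.
  public void addJob(String name, Runnable job) {
    jobs.put(name, job);
  }

  // Run the job named in args[0]; return 0 on success, -1 if unknown.
  public int run(String[] args) {
    if (args.length == 0 || !jobs.containsKey(args[0])) {
      System.err.println("Valid job names: " + jobs.keySet());
      return -1;
    }
    jobs.get(args[0]).run();
    return 0;
  }

  public static void main(String[] args) {
    JobDriver driver = new JobDriver();
    // In a real setup, each Runnable would build a Flink program on its own
    // ExecutionEnvironment and call env.execute(). Stubs used here.
    driver.addJob("wordcount", () -> System.out.println("running wordcount"));
    driver.addJob("pagerank", () -> System.out.println("running pagerank"));
    System.exit(driver.run(args));
  }
}
```

Package that single class as the JAR's main class, and the job to run is chosen at invocation time, e.g. `flink run myjobs.jar wordcount`.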

2015-05-08 12:56 GMT+02:00 Flavio Pompermaier <pomperma...@okkam.it>:

> Hi Fabian,
> thanks for the response.
> So my mains should be converted into a method returning the
> ExecutionEnvironment.
> However, I think it would be very nice to have a syntax like that of the
> Hadoop ProgramDriver to define jobs to invoke from a single root
> class.
> Do you think it could be useful?
>
> On Fri, May 8, 2015 at 12:42 PM, Fabian Hueske <fhue...@gmail.com> wrote:
>
>> You can easily have multiple Flink programs in a single JAR file.
>> A program is defined using an ExecutionEnvironment and executed when you
>> call ExecutionEnvironment.execute().
>> Where and how you do that does not matter.
>>
>> You can for example implement a main function such as:
>>
>> public static void main(String... args) {
>>
>>   if (today == Monday) {
>>     ExecutionEnvironment env = ...
>>     // define Monday prog
>>     env.execute();
>>   } else {
>>     ExecutionEnvironment env = ...
>>     // define other prog
>>     env.execute();
>>   }
>> }
>>
>> 2015-05-08 11:41 GMT+02:00 Flavio Pompermaier <pomperma...@okkam.it>:
>>
>>> Hi to all,
>>> is there any way to keep multiple jobs in a jar and then choose at
>>> runtime the one to execute (like what ProgramDriver does in Hadoop)?
>>>
>>> Best,
>>> Flavio
>>>
>>>
>>
>
