Thank you all for the support! It would be a really nice feature if the web client could show me the list of Flink jobs contained in my jar. It should be sufficient to mark them with a special annotation and inspect the classes within the jar.
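Just to make the idea concrete, here is a rough sketch of what I have in mind. The @FlinkJob annotation and the scanning code are purely hypothetical, nothing like this exists in Flink today; it only shows how annotated job classes in an uploaded jar could be discovered by reflection:

    import java.io.File;
    import java.lang.annotation.ElementType;
    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;
    import java.lang.annotation.Target;
    import java.net.URL;
    import java.net.URLClassLoader;
    import java.util.Enumeration;
    import java.util.jar.JarEntry;
    import java.util.jar.JarFile;

    // Hypothetical marker annotation; not part of Flink.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.TYPE)
    @interface FlinkJob {
        String name();
    }

    public class JobLister {
        // Prints the classes in a jar that carry the (hypothetical) @FlinkJob annotation.
        public static void main(String[] args) throws Exception {
            String jarPath = args[0];
            URL jarUrl = new File(jarPath).toURI().toURL();
            try (JarFile jar = new JarFile(jarPath);
                 URLClassLoader loader = new URLClassLoader(new URL[]{jarUrl})) {
                Enumeration<JarEntry> entries = jar.entries();
                while (entries.hasMoreElements()) {
                    String entry = entries.nextElement().getName();
                    if (!entry.endsWith(".class")) {
                        continue;
                    }
                    String className = entry
                            .substring(0, entry.length() - ".class".length())
                            .replace('/', '.');
                    Class<?> clazz;
                    try {
                        clazz = Class.forName(className, false, loader);
                    } catch (Throwable t) {
                        continue; // skip classes that cannot be loaded
                    }
                    FlinkJob job = clazz.getAnnotation(FlinkJob.class);
                    if (job != null) {
                        System.out.println(job.name() + " -> " + clazz.getName());
                    }
                }
            }
        }
    }

The web client could run something along these lines against an uploaded jar and offer the discovered job names in a drop-down.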
On Fri, May 8, 2015 at 3:03 PM, Malte Schwarzer <[email protected]> wrote:

> Hi Flavio,
>
> you can also put each job in a single class and use the -c parameter to
> execute the jobs separately:
>
> /bin/flink run -c com.myflinkjobs.JobA /path/to/jar/multiplejobs.jar
> /bin/flink run -c com.myflinkjobs.JobB /path/to/jar/multiplejobs.jar
> ...
>
> Cheers
> Malte
>
> From: Robert Metzger <[email protected]>
> Reply-To: <[email protected]>
> Date: Friday, May 8, 2015 14:57
> To: "[email protected]" <[email protected]>
> Subject: Re: Package multiple jobs in a single jar
>
> Hi Flavio,
>
> the pom from our quickstart is a good reference:
> https://github.com/apache/flink/blob/master/flink-quickstart/flink-quickstart-java/src/main/resources/archetype-resources/pom.xml
>
> On Fri, May 8, 2015 at 2:53 PM, Flavio Pompermaier <[email protected]> wrote:
>
>> Ok, got it.
>> And is there a reference pom.xml for shading my application into one
>> fat jar? Which Flink dependencies can I exclude?
>>
>> On Fri, May 8, 2015 at 1:05 PM, Fabian Hueske <[email protected]> wrote:
>>
>>> I didn't say that the main should return the ExecutionEnvironment.
>>> You can define and execute as many programs in a main function as you
>>> like.
>>> The program can be defined somewhere else, e.g. in a function that
>>> receives an ExecutionEnvironment and attaches a program to it, such as:
>>>
>>> public void buildMyProgram(ExecutionEnvironment env) {
>>>     DataSet<String> lines = env.readTextFile(...);
>>>     // do something
>>>     lines.writeAsText(...);
>>> }
>>>
>>> That method could then be invoked from main():
>>>
>>> public static void main(String[] args) throws Exception {
>>>     ExecutionEnvironment env = ...
>>>
>>>     if (...) {
>>>         buildMyProgram(env);
>>>     } else {
>>>         buildSomeOtherProg(env);
>>>     }
>>>
>>>     env.execute();
>>>
>>>     // run some more programs
>>> }
>>>
>>> 2015-05-08 12:56 GMT+02:00 Flavio Pompermaier <[email protected]>:
>>>
>>>> Hi Fabian,
>>>> thanks for the response.
>>>> So my mains should be converted into a method returning the
>>>> ExecutionEnvironment.
>>>> However, I think it would be very nice to have a syntax like the one of
>>>> the Hadoop ProgramDriver to define the jobs to invoke from a single
>>>> root class.
>>>> Do you think it could be useful?
>>>>
>>>> On Fri, May 8, 2015 at 12:42 PM, Fabian Hueske <[email protected]> wrote:
>>>>
>>>>> You can easily have multiple Flink programs in a single JAR file.
>>>>> A program is defined using an ExecutionEnvironment and executed when
>>>>> you call ExecutionEnvironment.execute().
>>>>> Where and how you do that does not matter.
>>>>>
>>>>> You can for example implement a main function such as:
>>>>>
>>>>> public static void main(String... args) {
>>>>>
>>>>>     if (today == Monday) {
>>>>>         ExecutionEnvironment env = ...
>>>>>         // define Monday prog
>>>>>         env.execute();
>>>>>     }
>>>>>     else {
>>>>>         ExecutionEnvironment env = ...
>>>>>         // define other prog
>>>>>         env.execute();
>>>>>     }
>>>>> }
>>>>>
>>>>> 2015-05-08 11:41 GMT+02:00 Flavio Pompermaier <[email protected]>:
>>>>>
>>>>>> Hi to all,
>>>>>> is there any way to keep multiple jobs in a jar and then choose at
>>>>>> runtime the one to execute (like what ProgramDriver does in Hadoop)?
>>>>>>
>>>>>> Best,
>>>>>> Flavio
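For completeness, a minimal sketch of the ProgramDriver-style dispatcher discussed above, following the main-method pattern Fabian described. The class name, job names and paths are made up for illustration; nothing like this ships with Flink, and the job to run is simply chosen by the first program argument:

    import java.util.HashMap;
    import java.util.Map;
    import java.util.function.Consumer;

    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;

    // ProgramDriver-like entry point: pick the job by name, e.g.
    //   bin/flink run multiplejobs.jar jobA
    public class JobDriver {

        private static final Map<String, Consumer<ExecutionEnvironment>> JOBS = new HashMap<>();
        static {
            JOBS.put("jobA", JobDriver::buildJobA);
            JOBS.put("jobB", JobDriver::buildJobB);
        }

        public static void main(String[] args) throws Exception {
            if (args.length == 0 || !JOBS.containsKey(args[0])) {
                System.err.println("Usage: <jobName>, one of " + JOBS.keySet());
                System.exit(1);
            }
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
            JOBS.get(args[0]).accept(env); // attach the chosen program to the environment
            env.execute(args[0]);
        }

        // The actual programs: made-up examples, just to show the pattern.
        private static void buildJobA(ExecutionEnvironment env) {
            DataSet<String> lines = env.readTextFile("hdfs:///input/a");
            lines.writeAsText("hdfs:///output/a");
        }

        private static void buildJobB(ExecutionEnvironment env) {
            env.fromElements(1, 2, 3).writeAsText("hdfs:///output/b");
        }
    }

Compared to the -c approach, everything goes through a single entry class and the registered job names can also be listed from the same map, which is roughly what Hadoop's ProgramDriver does.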
