dynamic add sink to flink

2017-06-09 Thread qq
Hi: we use Flink as a router for our Kafka; we read from one Kafka and write to a lot of different Kafka clusters and topics, but it takes too much time to start the Flink job if we want to add another Kafka sink, so is there any way to dynamically add a sink to Flink, or just start the flink j
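One approach that avoids a restart for every new destination topic (a minimal sketch, not taken from the thread itself): let a single Kafka producer choose the target topic per record through KeyedSerializationSchema.getTargetTopic(). This only covers new topics on the same Kafka cluster; a sink to a different Kafka cluster still needs its own producer. The RoutedEvent type, the attachRoutingSink helper and the "fallback-topic" name below are made up for illustration; FlinkKafkaProducer010 and KeyedSerializationSchema are the 0.10 connector classes of that Flink generation.

import java.nio.charset.StandardCharsets;
import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010;
import org.apache.flink.streaming.util.serialization.KeyedSerializationSchema;

public class DynamicTopicRouting {

    // Hypothetical record type: the payload plus the topic it should be routed to.
    public static class RoutedEvent {
        public String targetTopic;
        public String payload;
    }

    // Attaches one producer that routes each record to the topic named in the record
    // itself, so new topics on the same Kafka cluster do not require a job restart.
    public static void attachRoutingSink(DataStream<RoutedEvent> stream, Properties kafkaProps) {
        KeyedSerializationSchema<RoutedEvent> schema = new KeyedSerializationSchema<RoutedEvent>() {
            @Override
            public byte[] serializeKey(RoutedEvent element) {
                return null; // no message key
            }

            @Override
            public byte[] serializeValue(RoutedEvent element) {
                return element.payload.getBytes(StandardCharsets.UTF_8);
            }

            @Override
            public String getTargetTopic(RoutedEvent element) {
                // Per-record topic decision; the producer falls back to the default topic on null.
                return element.targetTopic;
            }
        };

        // "fallback-topic" is only used when getTargetTopic() returns null.
        stream.addSink(new FlinkKafkaProducer010<>("fallback-topic", schema, kafkaProps));
    }
}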

Re: coGroup exception or something else in Gelly job

2017-06-09 Thread Greg Hogan
Have you looked at org.apache.flink.gelly.GraphExtension.CustomVertexValue.createInitSemiCluster(CustomVertexValue.java:51)?

> On Jun 9, 2017, at 4:53 PM, Kaepke, Marc wrote:
>
> Hi everyone,
>
> I don’t have any exceptions if I execute my Gelly job in my IDE (local) directly.
> The next s

Re: Stream sql example

2017-06-09 Thread Dawid Wysakowicz
Thanks a lot Timo, after I added the ResultTypeQueryable interface to my mapper it worked. As for the SongEvent: the reason I tried remapping it to Row is that it has an enum field on which I want to filter, so my first approach was to remap it to String in the TableSource. What do you think should be t
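A minimal sketch of the query side once the enum is exposed as a plain String column (the table name, the column names eventType/ts and the 'PLAY' literal are assumptions; it also assumes the 1.3-era TableEnvironment.getTableEnvironment() and tEnv.sql(), which later releases renamed to sqlQuery()):

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

// Once the enum is mapped to a String column, an ordinary SQL predicate can filter on it.
public class FilterOnEnumColumn {

    public static Table filterPlays(StreamExecutionEnvironment env, DataStream<Row> songRows) {
        StreamTableEnvironment tEnv = TableEnvironment.getTableEnvironment(env);

        // songRows is assumed to carry (eventType: String, ts: Long) as produced by the mapper.
        tEnv.registerDataStream("songs", songRows, "eventType, ts");

        // 1.3-era method name; later releases renamed this to sqlQuery().
        return tEnv.sql("SELECT ts FROM songs WHERE eventType = 'PLAY'");
    }
}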

coGroup exception or something else in Gelly job

2017-06-09 Thread Kaepke, Marc
Hi everyone, I don’t get any exceptions if I execute my Gelly job directly in my IDE (local). The next step is execution on a real Kubernetes cluster (1 JobManager and 3 TaskManagers on dedicated machines). The word count example runs without exceptions. My Gelly job throws the following

Re: Stream sql example

2017-06-09 Thread Timo Walther
Hi Dawid, I think the problem is that the type of the DataStream produced by the TableSource does not match the type declared in `getReturnType()`. A `MapFunction` is always a generic type (because Row cannot be analyzed). A solution would be for the mapper to implement `ResultTy
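A minimal sketch of such a mapper (the SongEvent fields and the two-column layout are assumptions made for illustration; ResultTypeQueryable and RowTypeInfo are the actual Flink types under discussion):

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.ResultTypeQueryable;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.types.Row;

// Flattens a SongEvent into a Row and declares the Row's field types explicitly,
// since Row itself carries no type information Flink could extract from the signature.
public class SongEventToRow implements MapFunction<SongEventToRow.SongEvent, Row>, ResultTypeQueryable<Row> {

    // Hypothetical event type standing in for the SongEvent from the thread.
    public enum SongEventType { PLAY, SKIP }

    public static class SongEvent {
        public SongEventType type;
        public long timestamp;
    }

    @Override
    public Row map(SongEvent event) {
        Row row = new Row(2);
        row.setField(0, event.type.name()); // enum exposed as String so SQL can filter on it
        row.setField(1, event.timestamp);
        return row;
    }

    @Override
    public TypeInformation<Row> getProducedType() {
        // Must describe exactly the Row built in map(): same arity, same field types.
        return new RowTypeInfo(BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.LONG_TYPE_INFO);
    }
}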

Re: Stream sql example

2017-06-09 Thread Dawid Wysakowicz
Sorry, forgot to add the link: https://gist.github.com/dawidwys/537d12a6f2355cba728bf93f1af87b45

Best regards! / Cheers!
Dawid Wysakowicz
*Data/Software Engineer*
Skype: dawid_wys | Twitter: @OneMoreCoder

2017-06-09 20:19 GMT+02:00 Dawid Wysakowicz :
> Hi,
> I tri

Stream sql example

2017-06-09 Thread Dawid Wysakowicz
Hi, I tried writing a simple SQL query with a custom StreamTableSource and it fails with this error:

> org.apache.flink.table.codegen.CodeGenException: Arity of result type does not match number of expressions.
>     at org.apache.flink.table.codegen.CodeGenerator.generateResultExpression(CodeGenerator
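For anyone hitting the same message: the exception says that the TypeInformation declared for the result has a different number of fields than the rows or expressions actually produced. A tiny, self-contained illustration of the two arities (all names made up; not the code from the gist):

import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.types.Row;

// The RowTypeInfo a source declares and the Rows the stream actually carries
// must have the same number of fields, otherwise code generation fails as above.
public class ArityExample {
    public static void main(String[] args) {
        TypeInformation<Row> declared =
                new RowTypeInfo(BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.LONG_TYPE_INFO); // arity 2

        Row produced = new Row(3); // arity 3: one field more than declared
        produced.setField(0, "song");
        produced.setField(1, 42L);
        produced.setField(2, "field not covered by the declared type");

        System.out.println("declared arity: " + declared.getArity()
                + ", produced arity: " + produced.getArity());
    }
}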

Re: Problem with WebUI

2017-06-09 Thread Dawid Wysakowicz
I had a look into the YARN logs and found this exception:

> 2017-06-09 17:10:20,922 ERROR org.apache.flink.runtime.webmonitor.files.StaticFileServerHandler - Caught exception
> java.lang.AbstractMethodError
>     at io.netty.util.ReferenceCountUtil.touch(ReferenceCountUtil.java:73)
>     at io.netty.

Re: In-transit Data Encryption in EMR

2017-06-09 Thread vinay patil
Hi Guys, can anyone please provide a solution to my queries?

On Jun 8, 2017 11:30 PM, "Vinay Patil" wrote:
> Hi Guys,
>
> I am able to set up SSL correctly; however, the following command does not work correctly and results in the error I had mailed earlier:
>
> flink run -m yarn-cluster -yt d

Problem with WebUI

2017-06-09 Thread Dawid Wysakowicz
Hi, I am trying to run a Flink job on YARN. When I submit the job with the following command: bin/flink run -m yarn-cluster -yn 2 -c ${className} ${jarName} the job itself seems to run correctly, but I cannot access the WebUI and get a 500 response code. When I run a yarn-session, the WebUI i

Re: Error running Flink job in Yarn-cluster mode

2017-06-09 Thread Biplob Biswas
One more thing I noticed is that the streaming word count from the Flink package works when I run it, but when I used the same GitHub code, packaged it, and uploaded the fat jar with the word count example to the cluster, I got the same error. I am wondering: how can making my uber jar produce such e

Re: Error running Flink job in Yarn-cluster mode

2017-06-09 Thread Biplob Biswas
Hi Nico, I tried running my job with 3 and even 2 YARN containers and the result is the same. Then I tried running the example word count (both streaming and batch) and they seem to find the task and job managers and run successfully. ./bin/flink run -m yarn-cluster -yn 3 -yt ./lib ./examples/stream