Re: How many events can Flink process each second

2019-10-23 Thread Andres Angel
Hello A.V. It depends on the underlying resources you are planning for your jobs; memory and processing will play a principal role in this answer. Keep in mind you can break your job down into a number of parallel tasks per environment, or even for a specific task within your p…

Flink KPL based on a custom class object

2019-10-01 Thread Andres Angel
Hello folks, I need to create a Flink producer for Kinesis capable of sinking a payload based on a custom class object I have built. The official documentation comes with this basic example, assuming that we are sinking a string object: FlinkKinesisProducer kinesis = new FlinkKinesisProducer<>(new Si…
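The usual answer to this kind of question is to hand FlinkKinesisProducer a SerializationSchema for the custom class instead of the SimpleStringSchema from the docs. Below is a minimal plain-Java sketch: the nested SerializationSchema interface is a stand-in mirroring the shape of Flink's single-method interface, and Event is a hypothetical payload class, both named here only for illustration.

```java
import java.nio.charset.StandardCharsets;

public class EventSchemaSketch {
    // Stand-in mirroring Flink's SerializationSchema<T>: one serialize(T) -> byte[] method.
    interface SerializationSchema<T> {
        byte[] serialize(T element);
    }

    // Hypothetical custom payload class (the "custom class object" from the thread).
    static class Event {
        final String id;
        final double value;
        Event(String id, double value) { this.id = id; this.value = value; }
    }

    // Serializes the custom object to bytes, instead of assuming a String payload.
    static class EventSchema implements SerializationSchema<Event> {
        @Override
        public byte[] serialize(Event e) {
            return (e.id + "," + e.value).getBytes(StandardCharsets.UTF_8);
        }
    }

    // Helper to show what actually lands in the Kinesis record.
    static String roundTrip(Event e) {
        return new String(new EventSchema().serialize(e), StandardCharsets.UTF_8);
    }
}
```

In a real job the same EventSchema (implementing the actual Flink interface) would be passed to the FlinkKinesisProducer constructor in place of SimpleStringSchema.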

Update tables in env after they have been registered

2019-08-14 Thread Andres Angel
Hello everyone, my use case assumes that we execute a job where we load a small amount of data from Redis and turn it into a DS to register as tables. But it's possible that after completing this step the data might change, and we may need to read the data again to keep the tables' content up to date. Her…

Re: FlatMap returning Row<> based on ArrayList elements()

2019-08-07 Thread Andres Angel
…rstood your meaning. > Best, > Victor > *From:* Andres Angel *Date:* Wednesday, August 7, 2019 at 9:55 PM *To:* Haibo Sun *Cc:* user *Subject:* Re: FlatMap returning Row<> based on ArrayList elements() > Hello…

Re: FlatMap returning Row<> based on ArrayList elements()

2019-08-07 Thread Andres Angel
…is? Thanks so much, AU. On Wed, Aug 7, 2019 at 3:57 AM Haibo Sun wrote: > Hi Andres Angel, > I guess people don't understand your problem (including me). I don't know > if the following sample code is what you want; if not, can you describe th…

FlatMap returning Row<> based on ArrayList elements()

2019-07-29 Thread Andres Angel
Hello everyone, I need to parse, inside an anonymous function, some input data to turn it into several Row elements. Originally I would have done something like Row.of(1,2,3,4), but these elements can change on the fly as part of my function. This is why I have decided to store them in a list and righ…

Assign a Row.of(List elements) exception

2019-07-25 Thread Andres Angel
Hello everyone, I have a list with a bunch of elements and I need to create a Row.of() based on all of the elements. I tried to apply a lambda function for this purpose, as: mylist.forEach(n -> out.collect(Row.of(n))); but I got the exception below: org.apache.flink.streaming.runtime.tasks.ExceptionInChai…
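A likely cause here is that forEach collects one single-field Row per list element rather than one Row holding them all. Since Row.of(Object...) is a varargs method, passing list.toArray() spreads the whole list into a single Row. This plain-Java sketch uses a varargs stand-in for Row.of to show the difference (rowOf here is illustrative, not the Flink API):

```java
import java.util.List;

public class RowOfSketch {
    // Stand-in for Flink's Row.of(Object... values): varargs collected into one record.
    static Object[] rowOf(Object... values) {
        return values;
    }

    // Passing list.toArray() to a varargs method spreads the elements,
    // producing ONE row with N fields rather than N single-field rows.
    static int arity(List<?> fields) {
        return rowOf(fields.toArray()).length;
    }
}
```

With the real Flink class the same idea reads `out.collect(Row.of(mylist.toArray()));` emitted once, instead of one collect per element inside forEach.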

Re: LEFT JOIN issue SQL API

2019-07-25 Thread Andres Angel
…> 1. left record arrives, no matched right record, so +(left, null) will > be generated. > 2. right record arrives, the previous result should be retracted, so > -(left, null) and +(left, right) will be generated. > Andres Angel wrote on Thu, Jul 25, 2019 at 8:15 AM: > >> Hello guys…
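The retraction behaviour described in that reply can be pictured as a small changelog; this plain-Java sketch only materialises that +/- sequence for illustration and is not Flink API:

```java
import java.util.ArrayList;
import java.util.List;

public class RetractionSketch {
    // Simulates the changelog a streaming LEFT JOIN emits, per the reply:
    // an early +(left, null) is retracted once the matching right row arrives.
    static List<String> leftJoinChangelog(boolean rightArrivesLater) {
        List<String> log = new ArrayList<>();
        log.add("+(left, null)");      // left arrives first, no match yet
        if (rightArrivesLater) {
            log.add("-(left, null)");  // retract the provisional null-padded result
            log.add("+(left, right)"); // emit the joined result
        }
        return log;
    }
}
```

This is why a retract stream (rather than an append stream) is needed downstream of such a join: consumers must handle the minus records.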

LEFT JOIN issue SQL API

2019-07-24 Thread Andres Angel
Hello guys, I have registered some table environments and now I'm trying to perform a query on them using a LEFT JOIN, like the example below: Table fullenrichment = tenv.sqlQuery( "SELECT pp.a,pp.b,pp.c,pp.d,pp.a " + " FROM t1 pp LEFT JOIN t2 ent" + …

sqlQuery split string

2019-07-24 Thread Andres Angel
Hello everyone, following the currently available functions (https://ci.apache.org/projects/flink/flink-docs-release-1.8/dev/table/functions.html), how could I split a column string by a character? Example column content: col = a,b,c. Query: SELECT col FROM tenv. Expected return: cola, colb, colc t…
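Flink 1.8's built-in SQL functions don't offer a direct string-split, so the common answer on this list is a scalar UDF. The plain-Java stand-in below shows the eval-method shape such a UDF takes; in Flink you would extend ScalarFunction and register it on the table environment. The class name SplitAt and the registration name are illustrative assumptions, not part of the thread.

```java
import java.util.regex.Pattern;

public class SplitUdfSketch {
    // Shape of a Flink scalar UDF: a public eval method.
    // In Flink this class would extend
    // org.apache.flink.table.functions.ScalarFunction and be registered via
    // tenv.registerFunction("SPLIT_AT", new SplitAt()).
    public static class SplitAt {
        public String eval(String col, String delimiter, int index) {
            // Pattern.quote so the delimiter is treated literally, not as a regex.
            String[] parts = col.split(Pattern.quote(delimiter));
            return index < parts.length ? parts[index] : null;
        }
    }
}
```

Once registered, a query such as `SELECT SPLIT_AT(col, ',', 0), SPLIT_AT(col, ',', 1), SPLIT_AT(col, ',', 2) FROM tenv` would yield the three sub-columns asked about.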

Re: Accessing variables built within a map/flatmap function within a DS

2019-07-24 Thread Andres Angel
…a variable outside and modify it in the anonymous class. > Andres Angel wrote on Wed, Jul 24, 2019 at 8:44 PM: > >> Hello everyone, >> I was wondering if there is a way to read the content of a variable >> built within a map/flatmap function outside of the DS method. >> …

Accessing variables built within a map/flatmap function within a DS

2019-07-24 Thread Andres Angel
Hello everyone, I was wondering if there is a way to read the content of a variable built within a map/flatmap function outside of the DS method. Example: DataStream dsString = env.fromElements("1,a,1.1|2,b,2.2,-2", "3,c|4,d,4.4"); DataStream dsTuple = dsString.flatMap(new FlatMapFunction()…
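In Java, an anonymous class or lambda can only capture effectively-final locals, so the workaround suggested in the reply is a mutable holder declared outside the function. The sketch below shows that pattern with an AtomicLong; note the caveat in the comments that this only works in a single JVM, since a real Flink job serializes the function to remote workers (where accumulators, metrics, or state are the proper channels).

```java
import java.util.concurrent.atomic.AtomicLong;
import java.util.function.Consumer;

public class CaptureSketch {
    // Anonymous classes/lambdas can only capture effectively-final locals,
    // so a mutable holder (here AtomicLong) is the usual workaround.
    // CAVEAT: in a distributed Flink job this only works locally; the
    // function instance is serialized to workers, so use accumulators,
    // metrics, or keyed state to surface values from a real pipeline.
    static long countTokens(String[] lines) {
        final AtomicLong counter = new AtomicLong();
        // Stands in for the anonymous FlatMapFunction in the thread.
        Consumer<String> flatMapLike = line -> counter.addAndGet(line.split(",").length);
        for (String line : lines) flatMapLike.accept(line);
        return counter.get(); // readable outside the "function"
    }
}
```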

Re: Create Tuple Array dynamically from a DS

2019-07-23 Thread Andres Angel
…DS but a > DS, because Tuple can't be directly used. It seems that you want to > turn this new DS into a table, but if different records have different > numbers of columns this is not good practice, as the schema of each record > is not the same (but as a workaround, you can fill…

Re: Create Tuple Array dynamically from a DS

2019-07-23 Thread Andres Angel
Sorry, I can't quite get your question... Do you mean how to split the > string into fields? > There is a `split` method in Java. You can give it a regexp and it will > return an array containing all the split fields. > Andres Angel wrote on Wed, Jul 24, 2019 at 10:28 AM: > …

Re: Create Tuple Array dynamically from a DS

2019-07-23 Thread Andres Angel
…with user-defined functions when you need to > use them. > Another method is that you can store them in arrays. > Also, if the types of the first 3 fields are the same for the first and > second payloads, you can use a Tuple4<> and set the last element as null for > the…

Create and register a new DS within a map function of a DS

2019-07-23 Thread Andres Angel
Hello everyone, I need to read an element from my DS and, according to the content, create a new DS on the fly and register it as a new environment table. I'm using the map function on my input DS; however, when I try to use the variable env (the environment, in my case StreamExecutionEnvironment), I ca…

Re: Transform from Table to DS

2019-07-23 Thread Andres Angel
…be that this Exception has something to do with your import. > If you are coding in a Java environment then you should import > StreamTableEnvironment.java, not StreamTableEnvironment.scala. > Andres Angel wrote on Wed, Jul 24, 2019 at 12:01 AM: > >> Hello guys, I'm working in a Java e…

Create Tuple Array dynamically from a DS

2019-07-23 Thread Andres Angel
Hello everyone, I need to create the size of my Tuple that feeds a DS dynamically; let me explain it better. Let's assume the first payload I read has the format "field1,field2,field3"; this might require a Tuple3<>, but my payload later can be "field1,field2,field3,field4", and then my Tuple migh…
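Because tuple arity in Flink is fixed at compile time (Tuple3, Tuple4, ...), the advice in the replies is to carry variable-width payloads as arrays (or a Row), optionally padding to a fixed width with nulls so every record shares one schema. A plain-Java sketch of both ideas (method names are illustrative):

```java
import java.util.Arrays;

public class DynamicFieldsSketch {
    // Variable-width payloads parsed into an array instead of a fixed-arity Tuple.
    static String[] parse(String payload) {
        return payload.split(",");
    }

    // Padding workaround from the thread: null-pad to a fixed width so every
    // record has the same schema (e.g. always 4 fields, like a Tuple4 with nulls).
    static String[] padTo(String payload, int width) {
        return Arrays.copyOf(parse(payload), width);
    }
}
```

The padded form is what lets records of 3 and 4 fields coexist in one DS and later register as a single table.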

Transform from Table to DS

2019-07-23 Thread Andres Angel
Hello guys, I'm working in a Java environment and I have sample code such as: Table schemafit = tenv.sqlQuery("Here is my query"); I need to turn this into a DS to print and for any other transformation, so I'm doing something like: DataStream resultSet = tenv.toAppendStream(schemafit, Row.class); resultSet.pr…

Use batch and stream environment in a single pipeline

2019-07-22 Thread Andres Angel
Hello everyone, I need to create a table from a stream environment and, thinking of a pure SQL approach, I was wondering if I can create a few of the enrichment tables in a batch environment and only the streaming payload as a streaming table environment. I tried to create a batch table environment with…