Re: DataSet/DataStream of scala type class interface

2020-04-13 Thread Salva Alcántara
FYI, I have posted the same question (a bit more polished) in https://stackoverflow.com/questions/61193662/dataset-datastream-of-type-class-interface. Also, you can find the code in this repo: https://github.com/salvalcantara/flink-events-and-polymorphism

DataSet/DataStream of scala type class interface

2020-04-13 Thread Salva Alcántara
I am just experimenting with the usage of Scala type classes within Flink. I have defined the following type class interface:

```scala
trait LikeEvent[T] {
  def timestamp(payload: T): Int
}
```

Now, I want to consider a `DataSet` of `LikeEvent[_]` like this:

```scala
// existing classes t
```
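The snippet is truncated in the archive, so here is a minimal sketch of where this goes. The payload types and instance names below are hypothetical (mine, not the author's); the crux is that an existential element type like `LikeEvent[_]` defeats Flink's `TypeInformation` derivation, and one common workaround is to pair each payload with its type class instance:

```scala
// Hypothetical payload types standing in for the "existing classes"
// mentioned above (truncated in the archive).
case class CommentLike(commentId: Int, ts: Int)
case class PostLike(postId: Int, ts: Int)

// Instances for the LikeEvent trait defined above (Scala 2.12+ SAM syntax).
object LikeEvent {
  implicit val commentLikeEvent: LikeEvent[CommentLike] =
    (payload: CommentLike) => payload.ts
  implicit val postLikeEvent: LikeEvent[PostLike] =
    (payload: PostLike) => payload.ts
}

// A DataSet[LikeEvent[_]] has no concrete element type, which defeats
// Flink's TypeInformation derivation. Wrapping payload and instance
// together makes each element carry its own evidence instead.
final case class AnyLike[T](payload: T)(implicit val ev: LikeEvent[T]) {
  def timestamp: Int = ev.timestamp(payload)
}
```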

Re: Table API and registration of DataSet/DataStream

2017-09-14 Thread Flavio Pompermaier
I see... anyway, for me it continues to be very misleading to have different syntax for where clauses (SQL vs Scala)... Why not make them compatible? Is it that complex? On Thu, Sep 14, 2017 at 4:26 PM, Fabian Hueske wrote: > Hi Flavio, > > 1) The Java Table API does not aim to resemble SQL but th

Re: Table API and registration of DataSet/DataStream

2017-09-14 Thread Fabian Hueske
Hi Flavio, 1) The Java Table API does not aim to resemble SQL but rather the Scala Table API, which is integrated with the host language (Scala). Hence the different syntax for expressions. 2) Yes, that would be one way to do it. If that adds too much boilerplate code, you could encapsulate the code in yo
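To make the contrast concrete, here is a rough sketch in the Scala Table API of that era (the table and field names are made up, and method names shifted across releases, e.g. `sql()` later became `sqlQuery()`):

```scala
import org.apache.flink.table.api.Table
import org.apache.flink.table.api.scala._ // enables the 'field expression syntax

// assuming tEnv is a StreamTableEnvironment with a table "test" registered:
// the same predicate, once in SQL and once as a language-integrated expression
val viaSql: Table = tEnv.sql("SELECT * FROM test WHERE name IS NOT NULL")
val viaApi: Table = tEnv.scan("test").where('name.isNotNull)
```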

Re: Table API and registration of DataSet/DataStream

2017-09-14 Thread Flavio Pompermaier
Hi Fabian, basically these were my problems with the Table API. 1) Table.sql() has a different where syntax than Table.where(), and this is very annoying (IMHO). Ex: Table.sql("SELECT * FROM XXX WHERE Y IS NOT NULL") vs Table.where("Y.isNotNull"). 2) If I understood correctly, my program that id

Re: Table API and registration of DataSet/DataStream

2017-09-14 Thread Fabian Hueske
Not sure what you mean by "translate a where clause to a filter function". Isn't that exactly what Table.filter(String condition) is doing? It translates a SQL-like condition (represented as a String) into an operator that filters the Table. 2017-09-09 23:49 GMT+02:00 Flavio Pompermaier : > Yes I
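A small illustration of that point (the `table` value and the `name` field are my assumptions): `filter()` accepts the condition either as a String in Table API expression syntax or as a language-integrated Scala expression, and both yield the same filter operator:

```scala
// String variant: Table API expression syntax (not SQL WHERE syntax)
val a: Table = table.filter("name.isNotNull")
// Scala variant: the equivalent language-integrated expression
val b: Table = table.filter('name.isNotNull)
```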

Re: Table API and registration of DataSet/DataStream

2017-09-09 Thread Flavio Pompermaier
Yes, I can do that of course. What I need is basically the possibility to translate a where clause into a filter function. Is there any utility class that does that in Flink? On 9 Sep 2017 21:54, "Fabian Hueske" wrote: > Hi Flavio, > > I tried to follow your example. If I got it right, you would li

Re: Table API and registration of DataSet/DataStream

2017-09-09 Thread Fabian Hueske
Hi Flavio, I tried to follow your example. If I got it right, you would like to change the registered table by assigning a different DataStream to the original myDs variable. With registerDataStream("test", myDs, ...) you don't register the variable myDs as a table but its current value, i.e., a
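A sketch of the point being made, with assumed names (`tEnv`, `firstStream`, and `otherStream` are not from the thread): registration captures the DataStream value at call time, so re-assigning the variable afterwards has no effect on the table:

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.scala._

// firstStream and otherStream are assumed to be DataStream[(String, Int)]
var myDs: DataStream[(String, Int)] = firstStream
tEnv.registerDataStream("test", myDs, 'name, 'value)

// "test" still reads from firstStream: only the value of myDs at
// registration time matters; this re-assignment changes the local
// variable, never the registered table.
myDs = otherStream
```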

Table API and registration of DataSet/DataStream

2017-09-08 Thread Flavio Pompermaier
Hi to all, I have a doubt about the Table API. Let's say my code is something like:

StreamTableEnvironment te = ...;
RowTypeInfo rtf = new RowTypeInfo(...);
DataStream myDs = te.registerDataStream("test", myDs, columnNames);
Table table = te.sql("SELECT *, (NAME = 'John') as VALID FROM test WHERE ..."

Re: DataSet -> DataStream

2016-03-11 Thread Stephan Ewen
Hi! It should be quite straightforward to write an "OutputFormat" that wraps the "FlinkKafkaProducer". That way you can write to Kafka from a DataSet program. Stephan On Fri, Mar 11, 2016 at 1:46 PM, Prez Cannady wrote: > This is roughly the solution I have now. On the other hand, I was ho
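A rough sketch of that idea, using the plain Kafka producer client inside a custom OutputFormat rather than literally wrapping FlinkKafkaProducer (the class name, topic, and broker settings are placeholders of mine):

```scala
import java.util.Properties

import org.apache.flink.api.common.io.OutputFormat
import org.apache.flink.configuration.Configuration
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

// An OutputFormat that publishes each DataSet record to a Kafka topic.
class KafkaOutputFormat(brokers: String, topic: String)
    extends OutputFormat[String] {

  // Created in open(), since KafkaProducer is not serializable.
  @transient private var producer: KafkaProducer[String, String] = _

  override def configure(parameters: Configuration): Unit = ()

  override def open(taskNumber: Int, numTasks: Int): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", brokers)
    props.put("key.serializer",
      "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer",
      "org.apache.kafka.common.serialization.StringSerializer")
    producer = new KafkaProducer[String, String](props)
  }

  override def writeRecord(record: String): Unit =
    producer.send(new ProducerRecord[String, String](topic, record))

  override def close(): Unit = if (producer != null) producer.close()
}

// usage: dataSet.output(new KafkaOutputFormat("localhost:9092", "my-topic"))
```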

Re: DataSet -> DataStream

2016-03-11 Thread Prez Cannady
This is roughly the solution I have now. On the other hand, I was hoping for a solution that doesn’t involve checking whether a file has been updated. Prez Cannady p: 617 500 3378 e: revp...@opencorrelate.org GH: https://github.com/opencorrelate

Re: DataSet -> DataStream

2016-03-10 Thread Ashutosh Kumar
As the data is already collected, why do you want to add one more layer of Kafka? Instead, you can start processing your data. Thanks Ashutosh On Mar 11, 2016 4:19 AM, "Prez Cannady" wrote: > > I’d like to pour some data I’ve collected into a DataSet via JDBC into a > Kafka topic, but I think I need to t

Re: DataSet -> DataStream

2016-03-10 Thread Balaji Rajagopalan
You could, I suppose, write the dataset to a sink (a file) and then read the file into a data stream. On Fri, Mar 11, 2016 at 4:18 AM, Prez Cannady wrote: > > I’d like to pour some data I’ve collected into a DataSet via JDBC into a > Kafka topic, but I think I need to transform my DataSet into a DataS
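A minimal sketch of that approach (the path, job names, and placeholder data are mine): a batch job writes the DataSet out as text, and a separate streaming job reads the same path back in:

```scala
import org.apache.flink.api.scala._
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment

// batch side: write the DataSet out as text
val benv = ExecutionEnvironment.getExecutionEnvironment
val ds: DataSet[String] = benv.fromElements("a", "b", "c") // placeholder data
ds.writeAsText("/tmp/handoff")
benv.execute("write handoff file")

// streaming side: read the same path back as a DataStream
val senv = StreamExecutionEnvironment.getExecutionEnvironment
val stream = senv.readTextFile("/tmp/handoff")
// ... attach transformations/sinks, then senv.execute("read handoff file")
```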

DataSet -> DataStream

2016-03-10 Thread Prez Cannady
I’d like to pour some data I’ve collected into a DataSet (via JDBC) into a Kafka topic, but I think I need to transform my DataSet into a DataStream first. If anyone has a clue how to proceed, I’d appreciate it; or let me know if I’m completely off track. Prez Cannady p: 617 500 3378 e: re