High latency in reading Iceberg tables using Flink table api

2024-03-12 Thread Chetas Joshi
Hello all, I am using the flink-iceberg-runtime lib to read an Iceberg table into a Flink DataStream. I am using Glue as the catalog. I use the Flink Table API to build and query an Iceberg table and then use toDataStream to convert it into a DataStream. Here is the code Table table
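A minimal sketch of the pattern described above, assuming a Glue-backed Iceberg catalog; the catalog name, warehouse path, and table identifier are placeholders, not taken from the original mail:

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

// Hypothetical Glue catalog registration, following the Iceberg docs.
tEnv.executeSql(
    "CREATE CATALOG glue WITH ("
        + "'type' = 'iceberg',"
        + "'catalog-impl' = 'org.apache.iceberg.aws.glue.GlueCatalog',"
        + "'io-impl' = 'org.apache.iceberg.aws.s3.S3FileIO',"
        + "'warehouse' = 's3://my-bucket/warehouse')");

// Query through the Table API, then bridge into the DataStream API.
Table table = tEnv.sqlQuery("SELECT * FROM glue.db.my_table");
DataStream<Row> stream = tEnv.toDataStream(table);
```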

Re: Usecase advise for Apache Flink Table API

2023-08-27 Thread Giannis Polyzos
degrading the performance of SQLServer Database. >> >> Is it a good idea to implement it through Apache Flink Table API for >> real-time data joins? >> >> ie: there are 10 different sql tables with complex join queries with 10 >> different conditions. (Approx. -

Re: Usecase advise for Apache Flink Table API

2023-08-27 Thread liu ron
Database. > > Is it a good idea to implement it through Apache Flink Table API for > real-time data joins? > > ie: there are 10 different sql tables with complex join queries with 10 > different conditions. (Approx. - The accumulative 10 tables size is ~100GB > and could grow

Usecase advise for Apache Flink Table API

2023-08-27 Thread Nirmal Chhatrala
Hello! We have a use case requirement to implement complex joins and aggregation on multiple sql tables. Today, it is happening at SQLServer level which is degrading the performance of SQLServer Database. Is it a good idea to implement it through Apache Flink Table API for real-time data joins
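For reference, the kind of multi-way streaming join being asked about looks roughly like this in the Table API; the table and column names below are invented for illustration:

```java
// Assumes Orders, Customers and Products were registered as tables
// (e.g. from CDC or Kafka sources) on tableEnv beforehand.
Table joined = tableEnv.sqlQuery(
    "SELECT o.id, c.name, p.price "
        + "FROM Orders o "
        + "JOIN Customers c ON o.customer_id = c.id "
        + "JOIN Products p ON o.product_id = p.id "
        + "WHERE o.amount > 100");
```

Note that unbounded streaming joins keep both input sides in state, which matters at the ~100GB scale mentioned.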

Re: Flink Table API + Jacoco Plugin

2023-08-01 Thread Brendan Cortez
up On Tue, 18 Jul 2023 at 09:27, Brendan Cortez wrote: > Hi all! > > I'm trying to use the jacoco-maven-plugin and run unit-tests for Flink > Table API, but they fail with an error (see file error_log_flink_17.txt for > full error stacktrace in attachment): > java.lang.I

Flink Table API + Jacoco Plugin

2023-07-17 Thread Brendan Cortez
Hi all! I'm trying to use the jacoco-maven-plugin and run unit-tests for Flink Table API, but they fail with an error (see file error_log_flink_17.txt for full error stacktrace in attachment): java.lang.IllegalArgumentException: fromIndex(2) > toIndex(0) ... I'm using: - Flink:

Re: Flink Table API watermark after a select operation on a table

2023-06-25 Thread feng xiangyu
Hi Eugenio, According to docs[1], there are two ways to define the watermark in a table: 1. Defining in DDL 2. During DataStream-to-Table Conversion In your case, I think you could use CREATE TABLE DDL to create a new table from filteredPhasesDurationsTable with a watermark. See more in CREATE Statemen
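A sketch of option 1 (watermark in DDL); the column names follow the thread, while the connector and the 5-second bound are assumptions:

```java
// Option 1 from the reply: declare the watermark in CREATE TABLE DDL.
tableEnv.executeSql(
    "CREATE TABLE filtered_phases ("
        + " id_fascicolo BIGINT,"
        + " data_fine TIMESTAMP(3),"
        + " WATERMARK FOR data_fine AS data_fine - INTERVAL '5' SECOND"
        + ") WITH ('connector' = 'datagen')");
```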

Re: Flink Table API watermark after a select operation on a table

2023-06-25 Thread feng xiangyu
Hi, Eugenio AFAIK, you could define watermark on the data_fine by adding attribute in phasesDurationsSchema. For example: final Schema phasesDurationsSchema = Schema.newBuilder() .column("id_fascicolo", DataTypes.BIGINT().notNull()) .column("nrg", DataTypes.STRING()) .column("giudice", DataTypes.
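The truncated schema above, completed as a sketch of option 2 (watermark during DataStream-to-Table conversion); the elided columns and the watermark bound are assumptions:

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.Schema;

final Schema phasesDurationsSchema = Schema.newBuilder()
    .column("id_fascicolo", DataTypes.BIGINT().notNull())
    .column("nrg", DataTypes.STRING())
    .column("giudice", DataTypes.STRING())
    // ...remaining columns elided; declare the event-time attribute:
    .column("data_fine", DataTypes.TIMESTAMP(3))
    .watermark("data_fine", "data_fine - INTERVAL '5' SECOND")
    .build();
```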

Flink Table API watermark after a select operation on a table

2023-06-25 Thread Eugenio Marotti
Hi everyone, I'm using Flink for processing some streaming data. First of all I have two tables receiving events from Kafka. These tables are joined and the resulting table is converted to a DataStream where it is processed by a custom KeyedProcessFunction. The output is then converted to a tab

Lookup join or enrichment join against a changelog stream in Apache Flink Table API

2023-01-11 Thread Colin Williams
I'm interested in doing a "lookup join" or "enrichment join" against a "changelog stream" read by "upsert-kafka". I am wondering if this is possible against the table API. I found https://github.com/fhueske/flink-sql-demo#enrichment-join-against
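For context, a processing-time lookup join in Flink SQL generally takes the shape below (names invented; whether this works against an upsert-kafka changelog is exactly the open question of this thread):

```java
// Hypothetical: Orders is a stream with a declared proctime attribute,
// Customers is a lookup-capable table (e.g. JDBC).
Table enriched = tableEnv.sqlQuery(
    "SELECT o.order_id, o.amount, c.country "
        + "FROM Orders AS o "
        + "JOIN Customers FOR SYSTEM_TIME AS OF o.proctime AS c "
        + "ON o.customer_id = c.id");
```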

Re: flink-table-api-scala-bridge sources

2022-10-31 Thread Chesnay Schepler
missing on maven: https://repo1.maven.org/maven2/org/apache/flink/flink-table-api-scala-bridge_2.12/1.15.2/ If you have a look at the -sources.jar you will see it doesn't actually contain any sources. It would be very helpful to have these sources published since they contain the API a lot of user

flink-table-api-scala-bridge sources

2022-10-25 Thread Clemens Valiente
Hi everyone I noticed when going through the scala datastream/table api bridge in my IDE I cannot see the source of the code. I believe it is because the Sources are missing on maven: https://repo1.maven.org/maven2/org/apache/flink/flink-table-api-scala-bridge_2.12/1.15.2/ If you have a look at

Re: Any usage examples for flink-table-api-java-bridge?

2022-07-18 Thread Alexander Fedulov
You are welcome, glad it helped! Best, Alexander Fedulov On Mon, Jul 18, 2022 at 8:06 PM Salva Alcántara wrote: > For the record, Alexander Fedulov pointed me to an example within the > kafka connector: > > > https://github.com/apache/flink/blob/025675725336cd572aa2601be525efd4995e5b84/flink-co

RE: Any usage examples for flink-table-api-java-bridge?

2022-07-18 Thread Salva Alcántara
For the record, Alexander Fedulov pointed me to an example within the kafka connector: https://github.com/apache/flink/blob/025675725336cd572aa2601be525efd4995e5b84/flink-connectors/flink-connector-kafka/src/main/java/org/apache/flink/streaming/connectors/kafka/table/KafkaDynamicSink.java#L218 Th

RE: Any usage examples for flink-table-api-java-bridge?

2022-07-18 Thread Salva Alcántara
sinks/#project-configuration
>
> ```
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-table-api-java-bridge</artifactId>
>     <version>1.16-SNAPSHOT</version>
>     <scope>provided</scope>
> </dependency>
> ```
>
> The following is stated:
>
> > When developing the connector/format, we suggest shipping both a thin JAR
> > and an ub

Any usage examples for flink-table-api-java-bridge?

2022-07-18 Thread Salva Alcántara
The following library is mentioned here: - https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/sourcessinks/#project-configuration

```
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-java-bridge</artifactId>
    <version>1.16-SNAPSHOT</version>
    <scope>provided</scope>
</dependency>
```

The following is stated: When developing the

Re: Flink Table API and Rules

2022-02-08 Thread Roman Khachatryan
Flink > Table API. I am able to match the data that exists in my streaming data and > the specific column of the database table. Now I need to activate some rules > that users have entered in the system (another database table) and I want to > evaluate the rules. What is the bes

Flink Table API and Rules

2022-02-08 Thread Shahdat Hossain
I am trying to match streaming data against my database table using Flink Table API. I am able to match the data that exists in my streaming data and the specific column of the database table. Now I need to activate some rules that users have entered in the system (another database table) and I

Re: flink-table-api-scala-bridge missing source files

2021-12-31 Thread Timo Walther
Scala code is not included in `flink-table-api-scala-bridge_2.12-1.14.2-sources.jar` for now. You can find all the compiled code in the compiled jar (flink-table-api-scala-bridge_2.12-1.14.2.jar) for debugging. If we need to also include scala code in the sources.jar, we could try something like this[1

Re: flink-table-api-scala-bridge missing source files

2021-12-26 Thread Zhipeng Zhang
Hi Yuval, It seems that scala code is not included in `flink-table-api-scala-bridge_2.12-1.14.2-sources.jar` for now. You can find all the compiled code in the compiled jar (flink-table-api-scala-bridge_2.12-1.14.2.jar) for debugging. If we need to also include scala code in the sources.jar, we

flink-table-api-scala-bridge missing source files

2021-12-25 Thread Yuval Itzchakov
Hi, I'd like to debug something that happens inside the scala-bridge package. However, when trying to browse the classes in IntelliJ it seems like they are missing sources. When looking at the sources jar file, I see only two classes: jar -tf ~/Downloads/flink-table-api-scala-bridge_2.12-1

Re: Multiple select queries in single job on flink table API

2021-04-21 Thread tbud
"TableResult result1 = stmtSet.execute(); result1.print();" I tried this, and the result is the following: Job has been submitted with JobID 4803aa5edc31b3ddc884f922008c5c03 +++ | default_catalog.default_databas

Re: Multiple select queries in single job on flink table API

2021-04-21 Thread Arvid Heise
Hi tbud, you still have two executes; it should only be one. Can you try the following instead of using outputTable1? TableResult result1 = stmtSet.execute(); result1.print(); On Sun, Apr 18, 2021 at 12:05 AM tbud wrote: > I have tried that too For example : > > /tableEnv.createTemporaryView("

Re: Multiple select queries in single job on flink table API

2021-04-17 Thread tbud
I have tried that too. For example: tableEnv.createTemporaryView("CreditDetails", creditStream); tableEnv.executeSql( "CREATE TABLE output(loanId VARCHAR) with ('connector.type' = 'filesystem'," + "'connector.path' = 'file:///path/Downloads/1'," + "'format.type' = 'csv')"); Table creditD

Re: Multiple select queries in single job on flink table API

2021-04-16 Thread Yuval Itzchakov
Yes. Instead of calling execute on each table, create a StatementSet using your StreamTableEnvironment (tableEnv.createStatementSet) and use addInsert and finally .execute when you want to run the job. On Sat, Apr 17, 2021, 03:20 tbud wrote: > If I want to run two different select queries on
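A compact sketch of the suggested pattern; the sink tables output1/output2 and the WHERE predicates are placeholders, assumed to be registered/valid beforehand:

```java
import org.apache.flink.table.api.StatementSet;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableResult;

// Collect several INSERTs in one StatementSet and call execute() once,
// so both queries run in a single job.
StatementSet stmtSet = tableEnv.createStatementSet();
Table query1 = tableEnv.sqlQuery("SELECT loanId FROM CreditDetails WHERE amount > 100");
Table query2 = tableEnv.sqlQuery("SELECT loanId FROM CreditDetails WHERE amount <= 100");
stmtSet.addInsert("output1", query1);
stmtSet.addInsert("output2", query2);
TableResult result = stmtSet.execute();  // single job submission
```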

Multiple select queries in single job on flink table API

2021-04-16 Thread tbud
If I want to run two different select queries on a flink table created from the dataStream, the blink-planner runs them as two different jobs. Is there a way to combine them and run them as a single job? Example code: StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironmen

Re: Stream aggregation using Flink Table API (Blink plan)

2020-11-12 Thread Felipe Gutierrez
I see. Now it has different query plans. It was documented on another page so I got confused. Thanks! -- -- Felipe Gutierrez -- skype: felipe.o.gutierrez -- https://felipeogutierrez.blogspot.com On Thu, Nov 12, 2020 at 12:41 PM Jark Wu wrote: > > Hi Felipe, > > The default value of `table.optimiz

Re: Stream aggregation using Flink Table API (Blink plan)

2020-11-12 Thread Jark Wu
Hi Felipe, The default value of `table.optimizer.agg-phase-strategy` is AUTO, if mini-batch is enabled, if will use TWO-PHASE, otherwise ONE-PHASE. https://ci.apache.org/projects/flink/flink-docs-master/dev/table/config.html#table-optimizer-agg-phase-strategy On Thu, 12 Nov 2020 at 17:52, Felipe
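The settings under discussion, sketched; the mini-batch values are illustrative:

```java
import org.apache.flink.configuration.Configuration;

Configuration conf = tableEnv.getConfig().getConfiguration();
// AUTO picks TWO_PHASE only when mini-batch is enabled:
conf.setString("table.exec.mini-batch.enabled", "true");
conf.setString("table.exec.mini-batch.allow-latency", "3 s");
conf.setString("table.exec.mini-batch.size", "5000");
// ...or force the two-phase aggregation explicitly:
conf.setString("table.optimizer.agg-phase-strategy", "TWO_PHASE");
```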

Re: Stream aggregation using Flink Table API (Blink plan)

2020-11-12 Thread Felipe Gutierrez
Hi Jark, I don't get the difference of the "MiniBatch Aggregation" compared with the "Local-Global Aggregation". On the web page [1] it says that I have to enable the TWO_PHASE parameter. So I compared the query plans from both, with and without the TWO_PHASE parameter, and they are the same.

Re: Stream aggregation using Flink Table API (Blink plan)

2020-11-10 Thread Felipe Gutierrez
I see, thanks Timo -- -- Felipe Gutierrez -- skype: felipe.o.gutierrez -- https://felipeogutierrez.blogspot.com On Tue, Nov 10, 2020 at 3:22 PM Timo Walther wrote: > > Hi Felipe, > > with non-deterministic Jark meant that you never know if the mini batch > timer (every 3 s) or the mini batch thr

Re: Stream aggregation using Flink Table API (Blink plan)

2020-11-10 Thread Timo Walther
Hi Felipe, with non-deterministic Jark meant that you never know whether the mini batch timer (every 3 s) or the mini batch threshold (e.g. 3 rows) fires the execution. This depends on how fast records arrive at the operator. In general, processing time can be considered non-deterministic, because 1

Re: Stream aggregation using Flink Table API (Blink plan)

2020-11-10 Thread Felipe Gutierrez
Hi Jark, thanks for your reply. Indeed, I forgot to write DISTINCT in the query and now the query plan is splitting into two hash partition phases. What do you mean by deterministic time? Why is only the window aggregate deterministic? If I implement the ProcessingTimeCallback [1] interface is it

Re: Stream aggregation using Flink Table API (Blink plan)

2020-11-09 Thread Jark Wu
Hi Felipe, The "Split Distinct Aggregation", i.e. the "table.optimizer.distinct-agg.split.enabled" option, only works for distinct aggregations (e.g. COUNT(DISTINCT ...)). However, the query in your example is using COUNT(driverId). You can update it to COUNT(DISTINCT driverId), and it should sh
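Jark's fix, sketched against the query from the original mail:

```java
// Split distinct aggregation only kicks in for DISTINCT aggregates.
tableEnv.getConfig().getConfiguration()
    .setString("table.optimizer.distinct-agg.split.enabled", "true");
Table tableCountDistinct = tableEnv.sqlQuery(
    "SELECT startDate, COUNT(DISTINCT driverId) FROM TaxiRide GROUP BY startDate");
```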

Re: Stream aggregation using Flink Table API (Blink plan)

2020-11-09 Thread Felipe Gutierrez
I realized that I forgot the image. Now it is attached. -- -- Felipe Gutierrez -- skype: felipe.o.gutierrez -- https://felipeogutierrez.blogspot.com On Mon, Nov 9, 2020 at 1:41 PM Felipe Gutierrez wrote: > > Hi community, > > I am testing the "Split Distinct Aggregation" [1] consuming the taxi >

Stream aggregation using Flink Table API (Blink plan)

2020-11-09 Thread Felipe Gutierrez
Hi community, I am testing the "Split Distinct Aggregation" [1] consuming the taxi ride data set. My SQL query from the table environment is the one below: Table tableCountDistinct = tableEnv.sqlQuery("SELECT startDate, COUNT(driverId) FROM TaxiRide GROUP BY startDate"); and I am enabling: conf

Re: Flink Table API and not recognizing s3 plugins

2020-09-15 Thread Dan Hill
Sweet, this was the issue. I got this to work by copying the s3 jar over to plugins for the client container. Thanks for all of the help! The Table API is sweet! On Mon, Sep 14, 2020 at 11:14 PM Dan Hill wrote: > Yes, the client runs in K8. It uses a different K8 config than the Helm > chart

Re: Flink Table API and not recognizing s3 plugins

2020-09-14 Thread Dan Hill
Yes, the client runs in K8. It uses a different K8 config than the Helm chart and does not load the plugins. Does the client use the same plugin structure as the Flink job/task manager? I can try using it tomorrow. Cool, that link would work too. Thanks, Arvid! On Mon, Sep 14, 2020 at 10:59

Re: Flink Table API and not recognizing s3 plugins

2020-09-14 Thread Arvid Heise
Hi Dan, Are you running the client also in K8s? If so you need an initialization step, where you add the library to the plugins directory. Putting it into lib or into the user jar doesn't work anymore as we removed the shading in s3 in Flink 1.10. The official Flink docker image has an easy way t

Re: Flink Table API and not recognizing s3 plugins

2020-09-14 Thread Dan Hill
Thanks for the update! I'm trying a bunch of combinations on the client side to get the S3 Filesystem to be picked up correctly. Most of my attempts involved building into the job jar (which I'm guessing won't work). I then started getting issues with ClassCastExceptions. I might try a little m

Re: Flink Table API and not recognizing s3 plugins

2020-09-14 Thread Jingsong Li
Hi Dan, I think Arvid and Dawid are right, as a workaround, you can try making S3Filesystem works in the client. But for a long term solution, we can fix it. I created https://issues.apache.org/jira/browse/FLINK-19228 for tracking this. Best, Jingsong On Mon, Sep 14, 2020 at 3:57 PM Dawid Wysak

Re: Flink Table API and not recognizing s3 plugins

2020-09-14 Thread Dawid Wysakowicz
Hi Dan, As far as I checked in the code, the FileSystemSink will try to create staging directories from the client. I think it might be problematic, as your case shows. We might need to revisit that part. I am cc'ing Jingsong who worked on the FileSystemSink. As a workaround you might try putting

Re: Flink Table API and not recognizing s3 plugins

2020-09-10 Thread Dan Hill
This is running on my local minikube and is trying to hit minio. On Thu, Sep 10, 2020 at 1:10 PM Dan Hill wrote: > I'm using this Helm chart. I > start the job by building an image with the job jar and using kubectl apply > t

Re: Flink Table API and not recognizing s3 plugins

2020-09-10 Thread Arvid Heise
In general, I'd assume that JM and TM are enough. However, it seems like the query planner is doing some path sanitization for which it needs the filesystem. Since I don't know this part too well, I'm pulling in Jark and Dawid that may know more. I'm also not sure if this is intentional or a bug.

Re: Flink Table API and not recognizing s3 plugins

2020-09-10 Thread Dan Hill
I'm using this Helm chart. I start the job by building an image with the job jar and using kubectl apply to do a flink run with the jar. The log4j.properties on jobmanager and taskmanager have debug level set and are pretty embed

Re: Flink Table API and not recognizing s3 plugins

2020-09-10 Thread Dan Hill
Copying more of the log 2020-09-10 19:50:17,712 INFO org.apache.flink.client.cli.CliFrontend [] - 2020-09-10 19:50:17,718 INFO org.apache.flink.client.cli.CliFrontend [] - Starting

Re: Flink Table API and not recognizing s3 plugins

2020-09-10 Thread Dan Hill
I was able to get more info to output on jobmanager. 2020-09-10 19:50:17,722 INFO org.apache.flink.client.cli.CliFrontend [] - 2020-09-10 19:50:17,731 INFO org.apache.flink.configuration.GlobalConfig

Re: Flink Table API and not recognizing s3 plugins

2020-09-10 Thread Arvid Heise
Hi Dan, somehow enabling debug statements did not work. However, the logs help to narrow down the issue. The exception occurs neither on the jobmanager nor on the taskmanager. It occurs wherever you execute the command line interface. How do you execute the job? Do you start it from your machine? Can y

Re: Flink Table API and not recognizing s3 plugins

2020-09-10 Thread Dan Hill
I changed the levels to DEBUG. I don't see useful data in the logs. https://drive.google.com/file/d/1ua1zsr3BInY_8xdsWwA__F0uloAqy-vG/view?usp=sharing On Thu, Sep 10, 2020 at 8:45 AM Arvid Heise wrote: > Could you try 1) or 2) and enable debug logging* and share the log with us? > > *Usually b

Re: Flink Table API and not recognizing s3 plugins

2020-09-10 Thread Arvid Heise
Could you try 1) or 2) and enable debug logging* and share the log with us? *Usually by adjusting FLINK_HOME/conf/log4j.properties. On Thu, Sep 10, 2020 at 5:38 PM Dan Hill wrote: > Ah, sorry, it's a copy/paste issue with this email. I've tried both: > 1) using s3a uri with flink-s3-fs-hadoop

Re: Flink Table API and not recognizing s3 plugins

2020-09-10 Thread Dan Hill
Ah, sorry, it's a copy/paste issue with this email. I've tried both: 1) using s3a uri with flink-s3-fs-hadoop jar in /opt/flink/plugins/s3-fs-hadoop. 2) using s3p uri with flink-s3-fs-presto jar in /opt/flink/plugins/s3-fs-presto. 3) loading both 1 and 2 4) trying s3 uri. When doing 1) Caused by

Re: Flink Table API and not recognizing s3 plugins

2020-09-10 Thread Arvid Heise
Hi Dan, s3p is only provided by flink-s3-fs-presto plugin. The plugin you used provides s3a. (and both provide s3, but it's good to use the more specific prefix). Best, Arvid On Thu, Sep 10, 2020 at 9:24 AM Dan Hill wrote: > *Background* > I'm converting some prototype Flink v1.11.1 code that

Flink Table API and not recognizing s3 plugins

2020-09-10 Thread Dan Hill
*Background* I'm converting some prototype Flink v1.11.1 code that uses DataSet/DataTable APIs to use the Table API. *Problem* When switching to using the Table API, my s3 plugins stopped working. I don't know why. I've added the required maven table dependencies to the job. I've tried us movin

Re: Flink Table API/SQL debugging, testability

2020-08-25 Thread Dawid Wysakowicz
try (CloseableIterator it = result.collect()) { assertThat(...); } You can also check some of the *ITCase tests in the flink-table-planner-blink module. Best, Dawid On 25/08/2020 09:19, narasimha wrote: > I was looking for testability, debugging practices on Flink Table
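Dawid's collect()-based suggestion, expanded into a self-contained sketch (AssertJ assumed as the assertion library; the query is a trivial stand-in):

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.TableResult;
import org.apache.flink.types.Row;
import org.apache.flink.util.CloseableIterator;

import static org.assertj.core.api.Assertions.assertThat;

TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inStreamingMode());
TableResult result = tEnv.executeSql("SELECT 1 AS v");
// collect() returns a CloseableIterator you can assert on in a unit test.
try (CloseableIterator<Row> it = result.collect()) {
    assertThat(it.next().getField("v")).isEqualTo(1);
}
```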

Flink Table API/SQL debugging, testability

2020-08-25 Thread narasimha
I was looking for testability and debugging practices for the Flink Table API/SQL. It's really difficult to find them compared to the Streaming API. Can someone please share their experiences on debugging and testability? -- A.Narasimha Swamy

Re: Flink Table API

2020-02-12 Thread Jark Wu
Hi Flavio, There is 2 main entry points in Table API, one is `TableEnvironment`, another is `StreamTableEnvironment`. - `TableEnvironment` is used for pure Table API & SQL programs. - `StreamTableEnvironment` can be used to convert from/to DataStream. These two interface will be kept in the futur
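The two entry points, side by side, as a sketch (using the modern factory methods for concreteness):

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

// Pure Table API & SQL programs, no DataStream interop:
TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inStreamingMode());

// Table programs that need to convert from/to DataStream:
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment stEnv = StreamTableEnvironment.create(env);
```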

Flink Table API

2020-02-12 Thread Flavio Pompermaier
Hi to all, I was trying to use the new Table API with the new Blink planner but I figured out that they do not use exactly the same APIs... for example, I can't go back and forth from Tables to Dataset/Datastream anymore (using tableEnv.toDataset for example). Is this a temporary behaviour or this fu

Fwd: FLINK TABLE API UDAF QUESTION, For heap backends, the new state serializer must not be incompatible

2019-08-26 Thread orlando qi
-- Forwarded message - From: orlando qi Date: Fri, Aug 23, 2019, 10:44 AM Subject: FLINK TABLE API UDAF QUESTION, For heap backends, the new state serializer must not be incompatible To: Hello everyone: I defined a UDAF function when I am using the FLINK TABLE API to achieve the

FLINK TABLE API UDAF QUESTION, For heap backends, the new state serializer must not be incompatible

2019-08-22 Thread orlando qi
Hello everyone: I defined a UDAF function when using the FLINK TABLE API to implement an aggregation operation. There is no problem when the task runs from the beginning in the cluster, but it throws an exception when the task is restarted from a checkpoint. How can I resolve it

Re: Flink Table API schema doesn't support nested object in ObjectArray

2019-08-09 Thread Fabian Hueske
>> Thank you, >> Fabian >> >> On Fri, Aug 2, 2019 at 16:01, Jacky Du < >> jacky.du0...@gmail.com> wrote: >> >>> Hi, All >>> >>> Just found that the Flink Table API has an issue if a nested object is defined >>> in an obj

Re: Flink Table API schema doesn't support nested object in ObjectArray

2019-08-07 Thread Jacky Du
: > Thanks for the bug report Jacky! > > Would you mind opening a Jira issue, preferably with a code snippet that > reproduces the bug? > > Thank you, > Fabian > > On Fri, Aug 2, 2019 at 16:01, Jacky Du wrote: > >> Hi, All >> >> Just found th

Re: Flink Table API schema doesn't support nested object in ObjectArray

2019-08-02 Thread Fabian Hueske
Thanks for the bug report Jacky! Would you mind opening a Jira issue, preferably with a code snippet that reproduces the bug? Thank you, Fabian On Fri, Aug 2, 2019 at 16:01, Jacky Du wrote: > Hi, All > > Just found that the Flink Table API has an issue if a nested objec

Flink Table API schema doesn't support nested object in ObjectArray

2019-08-02 Thread Jacky Du
Hi, All Just found that the Flink Table API has an issue if a nested object is defined in an object array. It gives a column-not-found exception if a table schema is defined like below: payload : Row(arraylist : ObjectArrayTypeInfo) but the Table API works fine if we don't have nested objects in the array
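The failing shape, reconstructed as type information for illustration (the field names follow the mail where given; innerField is invented):

```java
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.types.Row;

// payload : Row(arraylist : ObjectArrayTypeInfo), i.e. an object array
// whose elements are themselves Rows — the nesting that reportedly fails.
TypeInformation<Row> nested = Types.ROW_NAMED(
    new String[] {"innerField"}, Types.STRING);
TypeInformation<Row[]> arrayList = Types.OBJECT_ARRAY(nested);
TypeInformation<Row> payload = Types.ROW_NAMED(
    new String[] {"arraylist"}, arrayList);
```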

Re: Add Logical filter on a query plan from Flink Table API

2019-07-09 Thread Hequn Cheng
Hi Felipe, Yes, we are short of such tutorials. Probably you can take a look at the code of FLINK-9713 [1] (checking the changelog in the IDE is more convenient). The code shows how to create a logical node and how to use a rule to convert it into a FlinkLogicalRel and then convert into a DataStream Rel

Re: Add Logical filter on a query plan from Flink Table API

2019-07-09 Thread Felipe Gutierrez
Hi Hequn, it has been very hard to find even a very small tutorial on how to create my own rule in Calcite+Flink. What I did was copy a Calcite rule into my project and try to understand it. I am working with the FilterJoinRule [1], which is one of the rules Flink modifies. In the end I want to cre

Re: Add Logical filter on a query plan from Flink Table API

2019-07-09 Thread Hequn Cheng
Hi Felipe, > what is the relation of RelFactories [1] when I use it to create the INSTANCE of my rule? The `RelFactories.LOGICAL_BUILDER` can be used during the rule transformation, i.e., the `RelFactories.LOGICAL_BUILDER` is a `RelBuilderFactory` which contains a `create` method that can be used to cr

Re: Add Logical filter on a query plan from Flink Table API

2019-07-09 Thread Felipe Gutierrez
Hi Hequn, what is the relation of RelFactories [1] when I use it to create the INSTANCE of my rule? For example: public static final MyFilterRule INSTANCE = new MyFilterRule(Filter.class, RelFactories.LOGICAL_BUILDER); then I create a CalciteConfigBuilder using "new CalciteConfigBuilder().addLog
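Filling in the truncated call as an assumption-heavy sketch; the method names follow the legacy (pre-Blink) planner's CalciteConfigBuilder referenced in this thread:

```java
import org.apache.calcite.tools.RuleSets;
import org.apache.flink.table.calcite.CalciteConfig;
import org.apache.flink.table.calcite.CalciteConfigBuilder;

// Assumed completion of the truncated snippet: register the custom rule
// in the logical optimization phase, then hand the config to the env.
CalciteConfig calciteConfig = new CalciteConfigBuilder()
    .addLogicalOptRuleSet(RuleSets.ofList(MyFilterRule.INSTANCE))
    .build();
tableEnv.getConfig().setCalciteConfig(calciteConfig);
```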

Re: Add Logical filter on a query plan from Flink Table API

2019-07-08 Thread Hequn Cheng
Hi Felipe, > I would like to create a logical filter if there is no filter set on the logical query. How should I implement it? Do you mean you want to add a LogicalFilter node even if the query doesn't contain a filter? Logically, this can be done through a rule. However, it sounds a little hack an

Re: Flink Table API and Date fields

2019-07-08 Thread Rong Rong
> Best, JingsongLee >> >> ---------- >> From: Flavio Pompermaier >> Send Time: Monday, July 8, 2019 15:40 >> To: JingsongLee >> Cc: user >> Subject: Re: Flink Table API and Date fields >>

Add Logical filter on a query plan from Flink Table API

2019-07-08 Thread Felipe Gutierrez
Hi, I am a newbie in Apache Calcite. I am trying to use it with Apache Flink. To start, I am trying to create a HelloWorld which just adds a logical filter to my query. 1 - I have my Flink app using the Table API [1]. 2 - I have created my Calcite filter rule which is applied to my Flink query if I use

Re: Flink Table API and Date fields

2019-07-08 Thread Timo Walther
Send Time: Monday, July 8, 2019 15:40 To: JingsongLee <lzljs3620...@aliyun.com> Cc: user <user@flink.apache.org> Subject: Re: Flink Table API and Date fields I think I could do it for this specific use case but isn't this a big limitation of the Table API?

Re: Flink Table API and Date fields

2019-07-08 Thread Flavio Pompermaier
a. > > Best, JingsongLee > > -- > From: Flavio Pompermaier > Send Time: Monday, July 8, 2019 15:40 > To: JingsongLee > Cc: user > Subject: Re: Flink Table API and Date fields > > I think I could do it for this specifi

Re: Flink Table API and Date fields

2019-07-08 Thread JingsongLee
15:40 To: JingsongLee Cc: user Subject: Re: Flink Table API and Date fields I think I could do it for this specific use case but isn't this a big limitation of the Table API? I think that java.util.Date should be a first-class citizen in Flink... Best, Flavio On Mon, Jul 8, 2019 at 4:06 AM JingsongLee

Re: Flink Table API and Date fields

2019-07-08 Thread Flavio Pompermaier
I think I could do it for this specific use case but isn't this a big limitation of the Table API? I think that java.util.Date should be a first-class citizen in Flink... Best, Flavio On Mon, Jul 8, 2019 at 4:06 AM JingsongLee wrote: > Hi Flavio: > Looks like you use java.util.Date in your pojo, Now

Re: Flink Table API and Date fields

2019-07-07 Thread JingsongLee
Hi Flavio: Looks like you use java.util.Date in your POJO. Flink Table currently does not support BasicTypeInfo.DATE_TYPE_INFO because of some restrictive checks in the code. Can you use java.sql.Date? Best, JingsongLee -- From:F
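The suggested swap, sketched on an invented POJO:

```java
// java.util.Date fields fail Table API type validation here;
// java.sql.Date maps cleanly to the SQL DATE type.
public class EventPojo {
    public long id;
    public java.sql.Date eventDate;  // instead of java.util.Date
}
```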

Flink Table API and Date fields

2019-07-05 Thread Flavio Pompermaier
Hi to all, in my use case I have a stream of POJOs with Date fields. When I use Table API I get the following error: Exception in thread "main" org.apache.flink.table.api.ValidationException: SQL validation failed. Type is not supported: Date at org.apache.flink.table.calcite.FlinkPlannerImpl.vali

Re: Hello-world example of Flink Table API using an edited Calcite rule

2019-06-27 Thread JingsongLee
example of Flink Table API using an edited Calcite rule Hi JingsongLee, Sorry for not explaining very well. I am going to try a clarification of my idea. 1 - I want to use InMemoryExternalCatalog in a way to save some statistics which I create by listening to a stream. 2 - Then I will have my core applic

Re: Hello-world example of Flink Table API using an edited Calcite rule

2019-06-26 Thread Felipe Gutierrez
https://github.com/apache/flink/blob/release-1.8/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/api/ExternalCatalogTest.scala > [2] > https://github.com/apache/flink/blob/release-1.8/flink-table/flink-table-planner/src/main/scala/org/

Re: Hello-world example of Flink Table API using an edited Calcite rule

2019-06-26 Thread JingsongLee
flink-table-planner/src/test/scala/org/apache/flink/table/api/ExternalCatalogTest.scala [2] https://github.com/apache/flink/blob/release-1.8/flink-table/flink-table-planner/src/main/scala/org/apache/flink/table/descriptors/OldCsv.scala Best, Jingso

Re: Hello-world example of Flink Table API using an edited Calcite rule

2019-06-26 Thread Felipe Gutierrez
[1]. > You can find more CalciteConfigBuilder API test in [2]. > > 1. > https://github.com/apache/flink/blob/release-1.8/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/api/batch/table/CorrelateTest.scala#L168 > 2. > https://github.com/apache/flink/blob/re

Re: Hello-world example of Flink Table API using an edited Calcite rule

2019-06-26 Thread JingsongLee
Hi Felipe: I think your approach is absolutely right. You can try to do some plan tests just like [1]. You can find more CalciteConfigBuilder API tests in [2]. 1.https://github.com/apache/flink/blob/release-1.8/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/api/batch/table

Hello-world example of Flink Table API using an edited Calcite rule

2019-06-26 Thread Felipe Gutierrez
TableConfig tableConfig = tableEnv.getConfig(); tableConfig.setCalciteConfig(ccb.build()); I suppose that with this I can change the query plan of the Flink Table API. I am also not sure if I will need to use an external catalog like this post assumes [3]. In a nutshell, I would like to have a simple example where I can execute a query using the Flink Table API

Case When in Flink Table API

2019-01-29 Thread Soheil Pourbafrani
What is the correct way to use *Case When* in this example: myTlb.select( "o_orderdate.substring(0,4) as o_year, volume, (when(n_name.like('%BRAZIL%'),volume).else(0)) as case_volume" ) Flink errors on the line (when(n_name.like('%BRAZIL%'),volume).else(0)) as case_volume"
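One way to express this with the Java expression DSL, which avoids the string parser entirely (a sketch; note that Table API substring is 1-based, unlike the 0 used in the question):

```java
import static org.apache.flink.table.api.Expressions.$;
import static org.apache.flink.table.api.Expressions.ifThenElse;
import static org.apache.flink.table.api.Expressions.lit;

// CASE WHEN via ifThenElse in the newer expression DSL.
Table result = myTlb.select(
    $("o_orderdate").substring(1, 4).as("o_year"),
    $("volume"),
    ifThenElse($("n_name").like("%BRAZIL%"), $("volume"), lit(0))
        .as("case_volume"));
```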

Re: Flink Table API Sum method

2019-01-29 Thread Dawid Wysakowicz
aggregations. On 29/01/2019 15:13, Soheil Pourbafrani wrote: > Hi, > > How can I use the Flink Table API SUM function? For example something > like this: > table.agg(sum("feild1"))

Flink Table API Sum method

2019-01-29 Thread Soheil Pourbafrani
Hi, How can I use the Flink Table API SUM function? For example something like this: table.agg(sum("feild1"))
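A sketch of how sum is usually spelled in the expression DSL; the grouping key is invented, and "feild1" is kept as written in the question:

```java
import static org.apache.flink.table.api.Expressions.$;

// Grouped sum:
Table totals = table
    .groupBy($("key"))
    .select($("key"), $("feild1").sum().as("total"));

// Global sum over the whole table:
Table grandTotal = table.select($("feild1").sum());
```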

Re: How flink table api to join with mysql dimtable

2018-11-14 Thread Hequn Cheng
Hi yelun, Currently, there are no direct ways to dynamically load data and do a join in Flink SQL; as a workaround you can implement your logic with a UDTF. In the UDTF, you can load the data into a cache and update it according to your requirements. Best, Hequn On Wed, Nov 14, 2018 at 10:34 AM ye

How flink table api to join with mysql dimtable

2018-11-13 Thread yelun
Hi, I want to use Flink SQL to left join a static dimension table from MySQL, so I converted the MySQL table into a data stream to join with a DataStream that has been converted to a Flink table. I found that the real-time stream data is not joined correctly with the MySQL data at the beginnin

Re: Flink Table API and table name

2018-10-16 Thread Flavio Pompermaier
Done: https://issues.apache.org/jira/browse/FLINK-10562 On Tue, Oct 16, 2018 at 11:12 AM Timo Walther wrote: > Hi Flavio, > > yes you are right, I don't see a reason why we should not support such > table names. Feel free to open an issue for it. > > Regards, > Timo > > On 16.10.18 at 10:56, miki haiat wrote

Re: Flink Table API and table name

2018-10-16 Thread Timo Walther
Hi Flavio, yes you are right, I don't see a reason why we should not support such table names. Feel free to open an issue for it. Regards, Timo On 16.10.18 at 10:56, miki haiat wrote: I'm not sure if it will solve this issue but can you try to register your catalog [1] 1. https://ci.ap

Re: Flink Table API and table name

2018-10-16 Thread miki haiat
I'm not sure if it will solve this issue, but can you try to register your catalog [1]? 1. https://ci.apache.org/projects/flink/flink-docs-release-1.6/dev/table/common.html#register-an-external-catalog On Tue, Oct 16, 2018 at 11:40 AM Flavio Pompermaier wrote: > Hi to all, > in my job I'm try

Flink Table API and table name

2018-10-16 Thread Flavio Pompermaier
Hi to all, in my job I'm trying to read a dataset whose name/id starts with a number. It seems that when using the Table API to read that dataset, if the name starts with a number it is a problem... am I wrong? I can't find anything about table id constraints in the documentation and it seems that i

Re: Explanation on limitations of the Flink Table API

2016-04-21 Thread Simone Robutti
Thanks for all your input. The design document covers the use cases we have in mind and querying external sources may be interesting to us for other uses not mentioned in the first mail. I will wait for developments in this direction, because the expected result seems promising. :) Thank you aga

Re: Explanation on limitations of the Flink Table API

2016-04-21 Thread Flavio Pompermaier
We're also trying to work around the current limitations of the Table API and we're reading DataSets with on-purpose input formats that return a POJO Row containing the list of values (but we're reading all values as String...). Actually we would also need a way to abstract the composition of Flink op

Re: Explanation on limitations of the Flink Table API

2016-04-21 Thread Fabian Hueske
Hi Simone, in Flink 1.0.x, the Table API does not support reading external data, i.e., it is not possible to read a CSV file directly from the Table API. Tables can only be created from DataSet or DataStream which means that the data is already converted into "Flink types". However, the Table API
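What "Tables can only be created from DataSet or DataStream" looks like in code; a sketch using the later batch bridge API for concreteness (the thread itself predates it), with an invented POJO and path:

```java
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.java.BatchTableEnvironment;

// Hypothetical POJO matching the CSV columns.
public class Person {
    public String name;
    public int age;
}

ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
BatchTableEnvironment tEnv = BatchTableEnvironment.create(env);

// Parse the CSV into Flink types first, then wrap the DataSet as a Table.
DataSet<Person> people = env.readCsvFile("file:///tmp/people.csv")
    .pojoType(Person.class, "name", "age");
Table t = tEnv.fromDataSet(people);
```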

Explanation on limitations of the Flink Table API

2016-04-21 Thread Simone Robutti
Hello, I would like to know if it's possible to create a Flink Table from an arbitrary CSV (or any other form of tabular data) without doing type-safe parsing with explicitly typed classes/POJOs. To my knowledge this is not possible but I would like to know if I'm missing something. My requiremen