Flink Join Time Window

2019-09-30 Thread Nishant Gupta
Hi Team, I am trying to join [kafka stream] and [badip stream grouped with badip]. Can someone please help me verify what is wrong in the highlighted query? Am I writing the time window join query wrong for this use case? Or is it a bug and I should report this? What is the workaround, i

Re: Flink- Heap Space running out

2019-09-27 Thread Nishant Gupta
have tried Temporal functions - it is working fine. I was really hoping to make it work with idle state and a time window join. Could you please check the configuration and query? Please let me know if any other details are required. On Fri, Sep 27, 2019 at 12:41 PM Nishant Gupta wrote: > > Hi
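For context, a temporal table function join in Flink 1.9 SQL generally has the shape sketched below. This is only an illustration: BadIps is a hypothetical temporal table function registered over the bad-IP table, and the table and column names are assumptions rather than details taken from the thread.

    -- Sketch of a temporal table function join (Flink 1.9 SQL); names are assumptions
    SELECT k.*
    FROM sourceKafka AS k,
         LATERAL TABLE (BadIps(k.timestamp_received)) AS b  -- bad-IP version valid at the row's event time
    WHERE k.`source.ip` = b.ip;

Because each probe row only needs the bad-IP version valid at its event time, only the (small) versioned table is kept in state, not the high-volume Kafka stream.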

Re: Flink- Heap Space running out

2019-09-27 Thread Nishant Gupta
aiat >: > >> You can configure the task manager memory in the config.yaml file. >> What is the current configuration? >> >> On Thu, Sep 26, 2019, 17:14 Nishant Gupta >> wrote: >> >>> am running a query to join a stream and a table as below. It i
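The task manager memory mentioned above is set in Flink's flink-conf.yaml. A minimal sketch for Flink 1.9, with purely illustrative sizes:

    # flink-conf.yaml (Flink 1.9.x); the sizes below are examples only
    taskmanager.heap.size: 4096m
    jobmanager.heap.size: 1024m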

Flink- Heap Space running out

2019-09-26 Thread Nishant Gupta
am running a query to join a stream and a table, as below. It is running out of heap space even though the Flink cluster has enough heap (60 GB * 3). Is there an eviction strategy needed for this query? SELECT sourceKafka.* FROM sourceKafka INNER JOIN DefaulterTable ON sourceKafka.CC=Def
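A regular (non-windowed) join like this keeps both inputs in state indefinitely, so memory grows without bound. One way to let that state be cleaned up from the SQL Client is idle state retention in the environment file's execution section; a sketch assuming Flink 1.9, with illustrative values:

    # SQL Client environment file, execution section (values are examples only)
    execution:
      min-idle-state-retention: 3600000   # ms before idle join state may be cleaned up
      max-idle-state-retention: 7200000   # ms after which idle join state is removed

The alternatives discussed elsewhere in these threads, an interval join or a temporal table join, bound the state by construction instead.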

CSV Table source as data-stream in environment file

2019-09-26 Thread Nishant Gupta
Hi Team, How do we define a CSV table source as a data-stream instead of a data-set in the environment file? Whether or not I mention update-mode: append, it takes the CSV file only as a data-set. Is there any detailed reference for environment file configuration where sinks and sources are defined.
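For reference, a file-based CSV source table in the Flink 1.9 SQL Client environment file is declared roughly as below. The table name, path, and schema are placeholders, so treat this as a sketch rather than a tested configuration:

    tables:
      - name: badips                    # placeholder table name
        type: source-table
        update-mode: append
        connector:
          type: filesystem
          path: "/path/to/badips.csv"   # placeholder path
        format:
          type: csv
          fields:
            - name: ip
              type: VARCHAR
        schema:
          - name: ip
            type: VARCHAR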

Joins Usage Clarification

2019-09-25 Thread Nishant Gupta
Hi Team, There are 3 types of window join (Tumbling, Session, and Sliding) and 1 interval join, as mentioned in (For Table API) [1] https://ci.apache.org/projects/flink/flink-docs-release-1.9/dev/stream/operators/joining.html

Flink Temporal Tables Usage

2019-09-24 Thread Nishant Gupta
Hi Team, I have a slight confusion w.r.t. the usage of temporal tables. Documentation [1] mentions that we need to use lookup tables like HBaseTableSource, and in documentation [2], while using the SQL Client, there isn't anything mentioned about it. Do we need to use the same kind of LookUpTables in e
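In the SQL Client, a temporal table can also be declared directly in the environment file over an append-only history table, without a lookup source. A rough Flink 1.9 sketch, where the table and field names are placeholders:

    tables:
      - name: BadIps                  # name the temporal table function is registered under
        type: temporal-table
        history-table: badipHistory   # placeholder append-only table holding the versions
        primary-key: ip
        time-attribute: rowtime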

Re: Time Window Flink SQL join

2019-09-21 Thread Nishant Gupta
al.StreamTableEnvironmentImpl.toDataStream(StreamTableEnvironmentImpl.java:319) at org.apache.flink.table.api.java.internal.StreamTableEnvironmentImpl.toAppendStream(StreamTableEnvironmentImpl.java:227) at flink.StreamingJob.main(StreamingJob.java:140) On Fri, Sep 20, 2019 at 8:26 PM Nishant Gupta wrote: >

Re: Time Window Flink SQL join

2019-09-20 Thread Nishant Gupta
Hueske wrote: > Hi, > > This looks OK at first sight. > Is it doing what you expect? > > Fabian > > On Fri., 20 Sept. 2019 at 16:29, Nishant Gupta < > nishantgupta1...@gmail.com> wrote: > >> Hi Fabian, >> >> Thanks for the information. >

Re: Time Window Flink SQL join

2019-09-20 Thread Nishant Gupta
uld be the time-versioned table [2]. > Since it is a time-versioned table, it could even be updated with new IPs. > > This type of join will only keep the time-versioned table (the bad-ips in > state) and not the other (high-volume) table. > > Best, > Fabian > > [1] >

Re: Time Window Flink SQL join

2019-09-18 Thread Nishant Gupta
st, Fabian > > On Wed., 18 Sept. 2019 at 14:02, Nishant Gupta < > nishantgupta1...@gmail.com> wrote: > >> Hi Team, >> >> I am running a query for a Time Window Join as below >> >> INSERT INTO sourceKafkaMalicious >> SELECT sourceKafka.* FROM sourc

Time Window Flink SQL join

2019-09-18 Thread Nishant Gupta
Hi Team, I am running a query for a Time Window Join as below: INSERT INTO sourceKafkaMalicious SELECT sourceKafka.* FROM sourceKafka INNER JOIN badips ON sourceKafka.`source.ip`=badips.ip WHERE sourceKafka.timestamp_received BETWEEN CURRENT_TIMESTAMP - INTERVAL '15' MINUTE AND CURRENT_TIMESTAMP; T
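In Flink SQL, a time-windowed (interval) join needs a predicate that relates the time attributes of both tables; a comparison against CURRENT_TIMESTAMP is not recognized as an interval join, so join state is not bounded. A hedged sketch of the interval-join shape, assuming badips also exposes an event-time attribute (here called ts, which is an assumption):

    -- Sketch only: interval join constraining both time attributes; badips.ts is assumed
    INSERT INTO sourceKafkaMalicious
    SELECT sourceKafka.*
    FROM sourceKafka
    INNER JOIN badips
      ON sourceKafka.`source.ip` = badips.ip
    WHERE sourceKafka.timestamp_received
          BETWEEN badips.ts - INTERVAL '15' MINUTE AND badips.ts;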