Hi
It's been over 5 years since I last did anything with FlinkCEP and Flink.
Has there been any significant development in FlinkCEP during this time?
BR. Esa
Hi
Thank you. I need to read them. Does this all work in Flink 1.8 now ?
BR Esa
From: Dawid Wysakowicz
Sent: Thursday, April 11, 2019 12:59 PM
To: Esa Heikkinen (TAU) ; Fabian Hueske
Cc: Dian Fu ; jincheng sun ;
user@flink.apache.org
Subject: Re: FlinkCEP and SQL?
Hi Esa,
Have you
told that FlinkCEP is based on
this.
BR Esa
From: Fabian Hueske
Sent: Thursday, April 11, 2019 11:33 AM
To: Esa Heikkinen (TAU)
Cc: Dian Fu ; jincheng sun ;
user@flink.apache.org
Subject: Re: FlinkCEP and SQL?
Hi Esa,
Flink's implementation of SQL MATCH_RECOGNIZE is based on its CEP library.
Hi
Is SQL CEP based on the (old) FlinkCEP at all, or are SQL CEP and FlinkCEP completely
separate?
BR Esa
From: Dian Fu
Sent: Thursday, April 4, 2019 2:37 PM
To: Esa Heikkinen (TAU)
Cc: jincheng sun ; user@flink.apache.org
Subject: Re: FlinkCEP and SQL?
Should all the sources be combined into one
: Esa Heikkinen (TAU)
Cc: user@flink.apache.org
Subject: Re: FlinkCEP and SQL?
Hi Esa,
CEP is available in Flink SQL. Please find the details here:
https://ci.apache.org/projects/flink/flink-docs-release-1.7/dev/table/sql.html#pattern-recognition
Best,
Jincheng
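For reference, a minimal sketch (in Scala, close to the documented example) of what such a
MATCH_RECOGNIZE query could look like; the table name MyEvents, its columns and the pattern
are made-up assumptions:

import org.apache.flink.table.api.Table
import org.apache.flink.table.api.scala.StreamTableEnvironment

object MatchRecognizeSketch {
  // Assumes a table "MyEvents" with columns (userId, id, name, rowtime) has
  // already been registered on the given StreamTableEnvironment.
  def findAbcSequences(tEnv: StreamTableEnvironment): Table = {
    tEnv.sqlQuery(
      """
        |SELECT T.aid, T.bid, T.cid
        |FROM MyEvents MATCH_RECOGNIZE (
        |  PARTITION BY userId
        |  ORDER BY rowtime
        |  MEASURES A.id AS aid, B.id AS bid, C.id AS cid
        |  PATTERN (A B C)
        |  DEFINE
        |    A AS A.name = 'a',
        |    B AS B.name = 'b',
        |    C AS C.name = 'c'
        |) AS T
      """.stripMargin)
  }
}

The query detects, per userId, an 'a' event followed by a 'b' event followed by a 'c' event,
which is roughly what a Pattern.begin(...).next(...) sequence expresses in the FlinkCEP API.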
Esa Heikkinen (TAU) mailto:esa.heikki
Hi
What is the situation of FlinkCEP and SQL?
Is it already possible to use SQL in CEP?
Is there any example cases where SQL is used in CEP?
BR Esa
Subject: Re: FlinkCEP and scientific papers ?
Hi Esa,
the SQL/CEP integration might be part of Flink 1.7. The discussion has just
been started again [1].
Regards,
Timo
[1] https://issues.apache.org/jira/browse/FLINK-6935
On 07.08.18 at 15:36, Esa Heikkinen wrote:
There was one good example of
Hi
What do you mean by “dynamic pattern” ?
How does it differ from a “static” pattern?
BR Esa
From: Fabian Hueske
Sent: Monday, November 26, 2018 12:01 PM
To: srbistline.t...@gmail.com
Cc: user
Subject: Re: CEP Dynamic Patterns
Hi Steve,
No this feature has not been contributed yet.
Best, Fabian
, July 23, 2018 3:00 PM
To: Esa Heikkinen
Cc: Till Rohrmann ; Chesnay Schepler
; user
Subject: Re: FlinkCEP and scientific papers ?
Hi Esa,
I think the core implementation is still based on that paper; there is a
package named "nfa" [1] that contains the main idea.
The latest CEP mo
Hi
Thank you. This was a very good paper for me ☺
How closely does the current FlinkCEP work like this (the paper was written in 2008)?
Do there exist any newer papers related to the current FlinkCEP?
BR Esa
From: Till Rohrmann
Sent: Wednesday, July 18, 2018 9:38 AM
To: vino yang
Cc: Esa Heikkinen ; Chesnay
Hi
I don't know if this is the correct forum to ask, but do there exist some good
scientific papers about FlinkCEP (Complex Event Processing)?
I know Flink is based on Stratosphere, but what about FlinkCEP?
BR Esa
Hi
I would like to find some Flink examples about complex analysis (like CEP) and
its log files.
I have already found the TaxiRides logs and how to find the long rides using
Flink and CEP [1], but it is a rather simple analysis case (only two sequential
events).
Do you know any more complex
Hi
Ok. Thanks for the clarification. But is controlling savepoints only
possible from the command line (or a script)? Or is it possible to do it internally, in
sync with the application?
Esa
From: Shuyi Chen
Sent: Wednesday, May 30, 2018 8:18 AM
To: Esa Heikkinen
Cc: Fabian Hueske ; user
Hi
I would be interested to know the two or three most typical use cases
of Flink. What can they be?
What do people do most with Flink? Do you have any opinion or experience about that
?
I mean mostly smaller examples of use.
Best, Esa
Hi
Is there only one env.execute() in an application?
Is it an unstoppable forever loop?
Or can I stop env.execute(), then do something, and after that restart it?
Best, Esa
From: Fabian Hueske
Sent: Tuesday, May 29, 2018 1:35 PM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re
Hi
Is env.execute() mandatory at the end of the application? Is it possible to run
the application without it?
I found some examples where it is missing.
Best, Esa
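For reference, a minimal sketch of how env.execute() is typically placed (the job name and
the bounded example source are arbitrary):

import org.apache.flink.streaming.api.scala._

object ExecuteExample {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // This only *defines* the dataflow; nothing runs yet.
    env.fromElements(1, 2, 3)
      .map(_ * 2)
      .print()

    // execute() submits the job and blocks until it finishes.
    // A bounded source such as fromElements terminates on its own;
    // an unbounded source keeps the job running until it is cancelled.
    env.execute("execute-example")

    // Code here runs only after the job has finished.
    println("job finished")
  }
}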
Hi
I don't know whether this question is suitable for this forum, but I'll take the
risk and ask :)
In my understanding the execution model in Flink is very data (flow) stream
oriented and specific. It is difficult to build control-flow logic (like a
state machine) outside of the stream-specific
Hi
This is a little bit off topic for this mailing list, but do you know any good
general forum or mailing list for (complex) event processing?
Best, Esa
org/projects/flink/flink-docs-release-1.4/dev/api_concepts.html
Best, Esa
From: Georgi Stoyanov
Sent: Friday, May 18, 2018 7:54 PM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: RE: How to Flink can solve this example
Hi Esa :)
I don't think that there are people here that want to make your
Li
Sent: Wednesday, May 9, 2018 8:18 PM
To: Georgi Stoyanov
Cc: Esa Heikkinen ; user@flink.apache.org
Subject: Re: Recommended books
I'd recommend this book, Stream Processing with Apache Flink: Fundamentals,
Implementation, and Operation of Streaming Applications. It's probabl
Hi
Sorry for the stupid question, but how do I connect readTextFile (or readCsvFile),
a MapFunction and SQL together in Scala code?
Best, Esa
From: Fabian Hueske
Sent: Tuesday, May 8, 2018 10:26 PM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re: Reading csv-files in parallel
Hi,
the Table
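For reference, a minimal sketch of wiring readTextFile, a map function and a SQL query
together in Scala (Flink 1.4/1.5-era Table API; if I remember correctly, on Flink 1.4 the
method was sql() rather than sqlQuery(). The file path, the Ride fields and the query are
made up):

import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.TableEnvironment
import org.apache.flink.table.api.scala._

case class Ride(id: Long, eventTime: Long, distance: Double)

object CsvToSqlSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val tEnv = TableEnvironment.getTableEnvironment(env)

    // 1) read the file line by line, 2) parse each line in a map function
    val rides: DataStream[Ride] = env
      .readTextFile("/path/to/rides.csv")
      .map { line =>
        val f = line.split(",")
        Ride(f(0).toLong, f(1).toLong, f(2).toDouble)
      }

    // 3) register the stream as a table and query it with SQL
    tEnv.registerDataStream("Rides", rides)
    val longRides = tEnv.sqlQuery("SELECT id, distance FROM Rides WHERE distance > 10")

    // 4) convert the result back to a DataStream and print it
    longRides.toAppendStream[(Long, Double)].print()

    env.execute("csv-to-sql-sketch")
  }
}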
Hi
Could you recommend some Flink books for learning Scala programming and the basics of
Flink?
Best, Esa
(state-machine-based) logic for
reading csv-files in a certain order.
Esa
From: Fabian Hueske
Sent: Tuesday, May 8, 2018 2:00 PM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re: Reading csv-files in parallel
Hi,
the easiest approach is to read the CSV files linewise as regular text files
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re: Reading csv-files in parallel
Hi Esa,
you can certainly read CSV files in parallel. This works very well in a batch
query.
For streaming queries that expect data to be ingested in timestamp order, this
is much more challenging, because you
Hi
I would like to read many different types of csv-files (time series data) in parallel
using CsvTableSource. Is that possible in a Flink application? If yes, do
examples about that exist?
If not, do you have any advice on how to do that?
Should I combine all csv-files into one csv-
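One common approach (a sketch, not the only option): parse each differently formatted file
into a shared case class and union the resulting streams, so that downstream operators such
as CEP see a single stream. All file paths, formats and field names below are made up:

import org.apache.flink.streaming.api.scala._

case class Event(timestamp: Long, source: String, payload: String)

object CombineCsvSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // File A has the format "timestamp,value"
    val streamA = env.readTextFile("/path/to/a.csv").map { line =>
      val f = line.split(",")
      Event(f(0).toLong, "A", f(1))
    }

    // File B has a different delimiter and column order: "id;timestamp;value"
    val streamB = env.readTextFile("/path/to/b.csv").map { line =>
      val f = line.split(";")
      Event(f(1).toLong, "B", f(2))
    }

    // union() merges the two streams into one
    val combined: DataStream[Event] = streamA.union(streamB)
    combined.print()

    env.execute("combine-csv-sketch")
  }
}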
That would be enough, but I would appreciate full Scala source code
examples of using IterativeConditions.
How do I find the correct imports for getEventsForPattern?
Best, Esa
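For reference, a minimal sketch of an iterative condition in the Scala CEP API
(flink-cep-scala) that uses ctx.getEventsForPattern, including the imports; the Event type
and its fields are made up:

import org.apache.flink.cep.scala.CEP
import org.apache.flink.cep.scala.pattern.Pattern
import org.apache.flink.streaming.api.scala._

case class Event(id: String, name: String, value: Double)

object IterativeConditionSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val events: DataStream[Event] = env.fromElements(
      Event("1", "start", 1.0), Event("1", "end", 2.0))

    // "end" uses the iterative condition's context to look at the events
    // already accepted for "start" and compares the current event with the
    // last of them.
    val pattern = Pattern.begin[Event]("start")
      .where(_.name == "start")
      .followedBy("end")
      .where { (current, ctx) =>
        val started = ctx.getEventsForPattern("start")
        started.nonEmpty && current.value > started.last.value
      }

    CEP.pattern(events.keyBy(_.id), pattern)
      .select(m => s"matched for id ${m("start").head.id}")
      .print()

    env.execute("iterative-condition-sketch")
  }
}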
From: Dawid Wysakowicz
Sent: Thursday, May 3, 2018 2:53 PM
To: Esa Heikkinen
Subject: Re: use of values of
? or documentation ?
Or should I use some other method than CEP, because my case is more batch
processing than stream processing ?
Best, Esa
From: Dawid Wysakowicz
Sent: Thursday, May 3, 2018 12:54 PM
To: Esa Heikkinen
Subject: Re: use of values of previously accepted event
Hi Esa,
You cannot
to set variable X in "start" and how to read the value of the variable X in
"end" ?
Should or can I use global variables ?
Should variables be declared in TestData ? And how ?
Best, Esa
From: Esa Heikkinen
Sent: Thursday, April 26, 2018 3:18 PM
To: user@flink.apache.org
Hi
Or is it possible to use global or local variables inside a pattern sequence?
And how (in Scala)?
Best, Esa
From: Esa Heikkinen
Sent: Wednesday, April 25, 2018 4:16 PM
To: user@flink.apache.org
Subject: CEP: use of values of previously accepted event
Hi
I have tried to read [1] and understand how to get the values of a previously
accepted event to use in the current event (or pattern).
Iterative conditions (with context.getEventsForPattern) do something like
that, but they return all previously accepted events.
How do I get only the last one (in Scala)? Are
Hi
How can Flink deal with spatial (coordinate position) data?
For example, I would want to check whether the coordinates of some moving object have
crossed some boundary.
Is that possible?
Best, Esa
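For reference, a minimal sketch of one way to express a boundary crossing with FlinkCEP in
Scala; the Position type and the boundary condition (x > 100.0) are made up:

import org.apache.flink.cep.scala.CEP
import org.apache.flink.cep.scala.pattern.Pattern
import org.apache.flink.streaming.api.scala._

case class Position(objectId: String, x: Double, y: Double)

object BoundaryCrossingSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val positions: DataStream[Position] = env.fromElements(
      Position("car-1", 99.0, 0.0), Position("car-1", 101.0, 0.0))

    // An "inside" position strictly followed by an "outside" position
    // means the object crossed the boundary.
    val crossing = Pattern.begin[Position]("inside")
      .where(_.x <= 100.0)
      .next("outside")
      .where(_.x > 100.0)

    CEP.pattern(positions.keyBy(_.objectId), crossing)
      .select(m => s"${m("outside").head.objectId} crossed the boundary")
      .print()

    env.execute("boundary-crossing-sketch")
  }
}

For real geo-fencing with arbitrary polygons, the geometry test itself would come from a
separate library; Flink only provides the streaming machinery around it.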
, April 17, 2018 1:41 PM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re: State-machine-based search logic in Flink ?
Hi Esa,
What do you mean by "individual searches in the Table API"?
There is some work (a pending PR [1]) to integrate the MATCH_RECOGNIZE clause
(SQL 2016) [2] in
Hi
I am not sure I have understood everything, but is it possible to build some kind of
state-machine-based search logic, for
example on top of the individual searches in the Table API (using CsvTableSource)?
Best, Esa
: Saturday, April 14, 2018 1:43 PM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re: Complexity of Flink
I think this always depends. I found Flink cleaner compared to other Big
Data platforms, and with some experience it is rather easy to deploy.
However, how do you measure complexity
Hi
I am writing a scientific article that is related to the deployment of Flink.
I would be very interested to know how to measure the complexity of the
Flink platform or framework.
Does anyone know of good articles about that?
I think it is not always so simple to deploy and use.
Best, Esa
Hi
I have noticed that Flink can be pretty tedious to install, and building the first
applications from scratch is also tedious, especially if the application is a little bit complex.
There are also slightly different development and runtime environments,
which require different software components with correct versi
Hi
Yes, I have access to the Flink source code, but could you explain a little bit
more what to do with it in this case?
Best, Esa
From: Kostas Kloudas [mailto:k.klou...@data-artisans.com]
Sent: Wednesday, March 7, 2018 3:51 PM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re: Simple CEP
"pattern" or "select", because no
results.. Is there any way to debug CEP's operations ?
Best, Esa
From: Kostas Kloudas [mailto:k.klou...@data-artisans.com]
Sent: Wednesday, March 7, 2018 2:54 PM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re: Simple CEP patter
Hi
It works now. It was because of the missing “import”. Thank you.
Best, Esa
From: Hequn Cheng [mailto:chenghe...@gmail.com]
Sent: Wednesday, March 7, 2018 3:00 PM
To: Esa Heikkinen
Cc: Timo Walther ; user@flink.apache.org
Subject: Re: CsvTableSource Types.TIMESTAMP
Hi Esa,
Have you ever
What would be the simplest working CEP (Scala) pattern ?
I want to test if my CEP application works at all.
Best, Esa
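For reference, roughly the smallest end-to-end CEP program I can think of in Scala (the
input values and pattern names are arbitrary):

import org.apache.flink.cep.scala.CEP
import org.apache.flink.cep.scala.pattern.Pattern
import org.apache.flink.streaming.api.scala._

object MinimalCep {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val input: DataStream[String] = env.fromElements("a", "b", "c")

    // Match an "a" immediately followed by a "b".
    val pattern = Pattern.begin[String]("first").where(_ == "a")
      .next("second").where(_ == "b")

    CEP.pattern(input, pattern)
      .select(m => s"${m("first").head} -> ${m("second").head}")
      .print()

    env.execute("minimal-cep")
  }
}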
Subject: Re: CsvTableSource Types.TIMESTAMP
Hi,
SQL_TIMESTAMP is the same. A couple of months ago it was decided to rename this
property such that it can be used for timestamps with timezone support in the
future.
Regards,
Timo
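For reference, a minimal sketch of a CsvTableSource defined with the renamed timestamp
type; the path, delimiter and field names are placeholders:

import org.apache.flink.table.api.Types
import org.apache.flink.table.sources.CsvTableSource

object TimestampCsvSourceSketch {
  val source: CsvTableSource = CsvTableSource.builder()
    .path("/path/to/events.csv")
    .fieldDelimiter(",")
    .field("eventTime", Types.SQL_TIMESTAMP) // formerly Types.TIMESTAMP
    .field("name", Types.STRING)
    .field("value", Types.DOUBLE)
    .build()
  // The source can then be registered with
  // tEnv.registerTableSource("Events", source).
}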
On 3/5/18 at 2:10 PM, Esa Heikkinen wrote:
I have tried to get the following example to work, but without success yet.
https://flink.apache.org/news/2017/03/29/table-sql-api-update.html
Error: value TIMESTAMP is not a member of object
org.apache.flink.table.api.Types
What would be the problem?
What imports should I use?
Or should I use SQL
Hi
I have tried to learn CEP, but for some reason it seems to be a little
difficult. It looks very complex.
Do there exist some simple (Scala) examples about CEP with full Maven projects?
I have only found the TaxiRide example from data Artisans [1].
For example, what variables, classes and functions
Hi
Should the custom source function be written in Java rather than Scala, like in
that RideCleansing exercise?
Best, Esa
From: Fabian Hueske [mailto:fhue...@gmail.com]
Sent: Thursday, March 1, 2018 11:23 AM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re: Reading csv-files
Hi Esa,
IMO
: Thursday, March 1, 2018 11:35 AM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re: Questions about the FlinkCEP
Hi Esa,
The answers to the questions are inlined.
On Feb 28, 2018, at 8:32 PM, Esa Heikkinen
mailto:heikk...@student.tut.fi>> wrote:
Hi
I have tried to learn FlinkCEP [1], but I have not yet found clear
answers to these questions:
1) Is a CEP pattern meant for only one data stream at the
same time?
2) If I have many different parallel data streams (or sources), should I
combine them into one data stream (an
do not know
better. I also tried Spark, but it also had its own problems. For example, CEP
is not as good in Spark as in Flink.
Best, Esa
From: Fabian Hueske [mailto:fhue...@gmail.com]
Sent: Tuesday, February 27, 2018 11:27 PM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re: Reading csv
correctly configured watermark
assigner, it should be possible to get valid watermarks.
In any case, reading timestamped data from files is much more tricky
than ingesting data from an event log which provides the events in the
same order in which they were written.
Best, Fabian
2018-02-27
I'd like to read csv-files which include time series data, where one
column is a timestamp.
Is it better to use addSource() (like in the data Artisans
RideCleansing exercise) or CsvTableSource()?
I am not sure CsvTableSource() can understand timestamps. I have not
found good examples about that.
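For reference, a minimal sketch of reading a CSV file as a text stream and assigning
event-time timestamps and watermarks; the file path, the Record fields and the 10-second
out-of-orderness bound are assumptions:

import org.apache.flink.streaming.api.TimeCharacteristic
import org.apache.flink.streaming.api.functions.timestamps.BoundedOutOfOrdernessTimestampExtractor
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time

case class Record(timestamp: Long, payload: String)

object CsvEventTimeSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)

    // The first CSV column is assumed to be an epoch-millisecond timestamp.
    val records = env.readTextFile("/path/to/data.csv").map { line =>
      val f = line.split(",")
      Record(f(0).toLong, f(1))
    }

    // Extract the timestamp from each record and emit watermarks that
    // tolerate up to 10 seconds of out-of-order data in the file.
    val withTimestamps = records.assignTimestampsAndWatermarks(
      new BoundedOutOfOrdernessTimestampExtractor[Record](Time.seconds(10)) {
        override def extractTimestamp(r: Record): Long = r.timestamp
      })

    withTimestamps.print()
    env.execute("csv-event-time-sketch")
  }
}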
Hi
I'd like to build an application which reads many different types of csv-files
(with time series data), searches for certain events from the csv-files in the desired
order, and stores "attributes" of the found events. The attributes can be used to
search for the next event. This search process acts like a st
To: user@flink.apache.org
Cc: Esa Heikkinen
Subject: Re: Is Flink easy to deploy ?
I think you are simply missing a bunch of the Flink artifacts.
Flink is broken into dozens of pieces, and you need to select from a large set
of artifacts what you need to depend on.
Typically, there is one Flink
Hi
Are these IDE instructions only for Java, or also for Scala?
Best,
Esa
From: xingcan [mailto:xc...@foxmail.com]
Sent: Saturday, February 24, 2018 3:25 AM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re: Is Flink easy to deploy ?
Hi Esa,
maybe you can follow the
.org/projects/flink/flink-docs-release-1.4/internals/ide_setup.html> to
import Flink to your IDE and run the examples in it. That’s a good
beginning I think.
Best,
Xingcan
On 24 Feb 2018, at 3:22 AM, Esa Heikkinen <mailto:heikk...@student.tut.fi>> wrote:
Yes I have looked. For exam
be run out of the IDE.
Fabian
[1]
https://github.com/apache/flink/tree/master/flink-examples/flink-examples-streaming/src/main/scala/org/apache/flink/streaming/scala/examples
2018-02-23 13:30 GMT+01:00 Esa Heikkinen <mailto:esa.heikki...@student.tut.fi>>:
I have a lot of difficulties deploying Flink. That is maybe because I am new to
Flink and its (Java and Maven) development environment, but I would like to hear the
opinions of others. I would like to use Scala.
There are many examples, but often the "imports" and the settings in
pom.xml are missing. It
I found an interesting Scala example at:
https://flink.apache.org/news/2017/03/29/table-sql-api-update.html
But what imports should I use?
And what goes in pom.xml, and which versions?
BR Esa
at
java.lang.ClassLoader.loadClass(ClassLoader.java:357)
What would be the reason for that ?
BR Esa
From: Esa Heikkinen [mailto:esa.heikki...@student.tut.fi]
Sent: Thursday, February 22, 2018 1:01 PM
To: Xingcan Cui
Cc: Fabian Hueske ; user@flink.apache.org
Subject: RE: Problems to use toAppendStream
Hi
It works now. Thank you ☺
How do I know which imports are incompatible, or something like that?
BR Esa
From: Xingcan Cui [mailto:xingc...@gmail.com]
Sent: Thursday, February 22, 2018 12:00 PM
To: Esa Heikkinen
Cc: Fabian Hueske ; user@flink.apache.org
Subject: Re: Problems to use
, 2018 10:35 AM
To: Esa Heikkinen
Cc: Xingcan Cui ; user@flink.apache.org
Subject: Re: Problems to use toAppendStream
Hi Esa,
which Scala version do you use?
Flink supports Scala 2.11 (and Scala 2.10 support was dropped with Flink 1.4.0).
Fabian
2018-02-22 9:28 GMT+01:00 Esa Heikkinen
, 2018 10:09 AM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re: Problems to use toAppendStream
Hi Esa,
just a reminder: don't miss the dot and underscore.
Best,
Xingcan
On 22 Feb 2018, at 3:59 PM, Esa Heikkinen
mailto:esa.heikki...@student.tut.fi>> wrote:
Hi
Actually I have als
: Wednesday, February 21, 2018 9:41 PM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re: Problems to use toAppendStream
Hi Esa,
whenever you observe the error "could not find implicit value for evidence
parameter of type X" in a streaming program, you need to add the following
impo
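In my experience the missing piece is usually one of the wildcard Scala API imports, which
bring the implicit TypeInformation factories into scope. A minimal sketch (Flink 1.4/1.5-era
API; the tuple type is arbitrary):

import org.apache.flink.streaming.api.scala._   // implicits for the Scala DataStream API
import org.apache.flink.table.api.TableEnvironment
import org.apache.flink.table.api.scala._       // implicits used by toAppendStream / toRetractStream

object ToAppendStreamSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val tEnv = TableEnvironment.getTableEnvironment(env)

    val words: DataStream[(String, Long)] = env.fromElements(("a", 1L), ("b", 2L))
    val table = tEnv.fromDataStream(words)

    // Without the two wildcard imports above, the next line fails with
    // "could not find implicit value for evidence parameter of type TypeInformation[...]".
    val result: DataStream[(String, Long)] = table.toAppendStream[(String, Long)]

    result.print()
    env.execute("to-append-stream-sketch")
  }
}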
Hi
I have tried to solve the errors below for a long time, but without success yet.
Could you give some hint how to solve them?
Errors in compiling:
--
Error:(56,46) could not find implicit value for evidence parameter of
type org.apache.flink.api.common.typeinfo.TypeInformation[org.apac
Hi
I am quite new to Flink and Scala. I have had a bit of trouble finding the
correct "imports".
What would be the best way to find them?
For example, the imports for StreamTableEnvironment and CsvTableSource.
And how do I know if I should put something in pom.xml?
Esa
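For reference, the imports that in my experience cover those classes in the Flink 1.4/1.5-era
APIs; the Maven artifact names are version-dependent, so treat them as assumptions:

import org.apache.flink.streaming.api.scala._                   // Scala DataStream API and its implicits
import org.apache.flink.table.api.TableEnvironment              // factory for table environments
import org.apache.flink.table.api.scala.StreamTableEnvironment  // Scala StreamTableEnvironment
import org.apache.flink.table.api.scala._                       // Table API implicits for Scala
import org.apache.flink.table.sources.CsvTableSource            // CSV table source

// The corresponding Maven dependencies are typically flink-scala_2.11,
// flink-streaming-scala_2.11 and flink-table_2.11, with versions matching
// the Flink distribution.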
Hello
I use IntelliJ IDEA (though I am not very familiar with it), and there are many settings.
Where exactly is the setting for enabling auto-completion of imports?
Esa
-Original Message-
From: m@xi [mailto:makisnt...@gmail.com]
Sent: Friday, February 16, 2018 12:42 PM
To: user@flink.apache.org
Subje
Hi
I am a newbie with Flink, Maven and Scala.
What is the best way to find the correct import packages? (They are not
always in the documentation, at least not very clearly.)
For example, I did not find which "import package" I should use for readCsvFile().
Also, something needs to be added to pom.xml
can use the
"-c" switch:
./bin/flink run -c .WordCount ./
Alternatively, uncomment the following section in the pom.xml, set your desired
job class, and rebuild the jar.
On 14.02.2018 10:57, Esa Heikkinen wrote:
Hi
[WARNING] Multiple versions of scala libraries detected!
On 14.02.2018 09:43, Esa Heikkinen wrote:
Hi
I have tried sample project from:
https://ci.apache.org/projects/flink/flink-docs-release-1.4/quickstart/scala_api_quickstart.html#maven
Versions:
Linux: DISTRIB_DESCRIPTION="Ubuntu 16.04.
Hi
Good news. Is there a way to supply Scala code from a file to the REPL?
It seems that compiling is too complicated an operation. Actually I don't get it
to work yet.
Esa
From: Piotr Nowojski [mailto:pi...@data-artisans.com]
Sent: Wednesday, February 14, 2018 10:55 AM
To: Esa Heikkinen
Cc: Esa
Hi
I have tried sample project from:
https://ci.apache.org/projects/flink/flink-docs-release-1.4/quickstart/scala_api_quickstart.html#maven
Versions:
Linux: DISTRIB_DESCRIPTION="Ubuntu 16.04.3 LTS"
Maven: Apache Maven 3.3.9
Java: openjdk version "1.8.0_151"
OpenJDK Runtime Environment (build 1.8
ongoing work with that:
https://issues.apache.org/jira/browse/FLINK-5886
3. CEP doesn’t work with Flink Batch; you have to use Flink Streaming
for that:
https://ci.apache.org/projects/flink/flink-docs-release-1.4/dev/api_concepts.html#dataset-and-datastream
Piotrek
On 13 Feb 2018, at 13:21, Esa
, February 10, 2018 1:07 PM
To: Esa Heikkinen
Cc: Timo Walther ; user@flink.apache.org
Subject: Re: CEP for time series in csv-file
Hi,
I'm not aware of any example project that ticks all your requirements. As you
said, too many combinations...
Flink uses Maven. So, most examples provid
What is the difference between using Python and Scala in Flink?
Can I do all the same things with Python as with Scala? For example, CEP with
files.
, Esa Heikkinen
mailto:esa.heikki...@student.tut.fi>> wrote:
Hi
I have csv-file(s) that contain an event in every row, and the first column is the time
stamp of the event. The rest of the columns are data and "attributes" of the event.
I'd like to write simple Scala code that: 1) reads the data of the csv-file, 2) converts
the data of the csv-file to be compatible with CEP, 3) sets the pattern for CEP, 4) Ru
Hello
I am a newbie with Flink.
I'd like to develop my Flink Scala applications in a Windows IDE (for example
IntelliJ IDEA) and run them on Linux (Ubuntu).
Is that a good or bad idea? Or is some remote use possible?
At this moment there is no graphical interface (GUI) on Linux. Or would it be
Hi
Thanks for the reply, but because I am a newbie with Flink, do you have any
good Scala code examples about this ?
Esa
From: Fabian Hueske [mailto:fhue...@gmail.com]
Sent: Wednesday, February 7, 2018 11:21 AM
To: Esa Heikkinen
Cc: user@flink.apache.org
Subject: Re: Flink CEP with files and
Hello
I am trying to use Flink's CEP for log files (as a batch job), not for
streams (as realtime).
Is that possible? If yes, do you know of example Scala code about that?
Or should I convert the log files (with timestamps) into streams?
But how do I handle timestamps in Flink?
If I can n