Hi,
Problem: Currently I am using Flink as an embedded library in
one of my applications. Eventually the application will become the job and will be
deployed in a Flink cluster, but right now it is not a cluster, just a standalone
single process running Flink within the same process.
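(For context, "running Flink within the same process" presumably means a local execution environment, roughly like the sketch below; the tiny pipeline is just a placeholder.)

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class EmbeddedFlinkSketch {
    public static void main(String[] args) throws Exception {
        // Spins up a mini cluster inside the current JVM instead of submitting to a real cluster.
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironment();

        env.fromElements(1, 2, 3).print();

        env.execute("embedded-flink-sketch");
    }
}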
Hi,
I am getting the below error while starting Flink as a standalone single-JVM
process from a jar. Kafka is deployed as a separate cluster, and the
process is not able to start; it fails with the below error after 60 seconds:
org.apache.kafka.common.errors.TimeoutException: Timeout of 6ms expi
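That timeout usually means the Kafka brokers could not be reached from the JVM running Flink (wrong bootstrap address, firewall, or listener configuration). A minimal sketch of how the consumer is typically pointed at an external Kafka cluster, assuming the flink-connector-kafka dependency; the topic name, group id, and broker addresses are placeholders:

import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class KafkaSourceSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        // Must be reachable from the process embedding Flink; an unreachable
        // address is the usual cause of the 60-second metadata timeout.
        props.setProperty("bootstrap.servers", "kafka-1:9092,kafka-2:9092");
        props.setProperty("group.id", "graph-consumer");

        DataStream<String> stream = env.addSource(
                new FlinkKafkaConsumer<>("input-topic", new SimpleStringSchema(), props));

        stream.print();
        env.execute("kafka-source-sketch");
    }
}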
/roadmap.html
On Fri, 10 Sept 2021 at 09:09, Dipanjan Mazumder wrote:
Hi Jing,
Thanks for the input. Another question I had: can Gelly be used for
processing the graph that Flink receives through Kafka, so that using Gelly I
decompose the graph into its nodes and edges and then process them
could handle this use case.
Don't worry that Flink can't handle this kind of data scale, because Flink is a
distributed engine. As long as the problem of data skew is carefully avoided,
the input throughput can be handled with appropriate resources.
Best, JING ZHANG
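(On the Gelly part of the question: Gelly runs on the batch DataSet API, not directly on a DataStream coming from Kafka, so the graph would have to be materialized first. A minimal sketch of splitting a graph into its vertex and edge sets; the hard-coded edges are placeholders.)

import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.graph.Edge;
import org.apache.flink.graph.Graph;
import org.apache.flink.graph.Vertex;
import org.apache.flink.types.NullValue;

public class GellyDecomposeSketch {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Tiny hard-coded graph; in the real use case the edges would be built
        // from the resource graph received over Kafka.
        DataSet<Edge<Long, Double>> edges = env.fromElements(
                new Edge<>(1L, 2L, 1.0),
                new Edge<>(2L, 3L, 1.0));

        Graph<Long, NullValue, Double> graph = Graph.fromDataSet(edges, env);

        // Decompose into nodes and edges for per-element processing.
        DataSet<Vertex<Long, NullValue>> vertices = graph.getVertices();
        DataSet<Edge<Long, Double>> edgeSet = graph.getEdges();

        vertices.print();
        edgeSet.print();
    }
}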
Dipanjan Mazumder
Hi,
I am working on a use case and am thinking of using Flink for it. The use
case is that I will be having many large resource graphs; I need to parse each
graph for its nodes and edges and evaluate each one of them against some Siddhi
rules. Right now the implementation for evaluating individ
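(On the streaming side, one way to "parse the graph for each node and edge" is to flat-map each incoming graph message into individual node and edge events that are then evaluated rule by rule. A rough sketch; ResourceGraph and GraphElement are invented types that only illustrate the shape of the pipeline.)

import java.util.ArrayList;
import java.util.List;

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.util.Collector;

public class GraphExplodeSketch {

    /** Hypothetical graph message, e.g. deserialized from the Kafka payload. */
    public static class ResourceGraph {
        public List<String> nodes = new ArrayList<>();
        public List<String> edges = new ArrayList<>();
    }

    /** Hypothetical per-element event that the rules would be evaluated against. */
    public static class GraphElement {
        public String kind;    // "node" or "edge"
        public String payload;

        public GraphElement() {}

        public GraphElement(String kind, String payload) {
            this.kind = kind;
            this.payload = payload;
        }
    }

    /** Fans one graph message out into individual node and edge events. */
    public static DataStream<GraphElement> explode(DataStream<ResourceGraph> graphs) {
        return graphs.flatMap(new FlatMapFunction<ResourceGraph, GraphElement>() {
            @Override
            public void flatMap(ResourceGraph graph, Collector<GraphElement> out) {
                for (String n : graph.nodes) {
                    out.collect(new GraphElement("node", n));
                }
                for (String e : graph.edges) {
                    out.collect(new GraphElement("edge", e));
                }
            }
        });
    }
}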
ica/lab-flink-repository-analytics
On Tue, Aug 3, 2021 at 10:54 AM Dipanjan Mazumder wrote:
Hi,
I was wondering: there were so many problems and their solutions
discussed in mails; do we have any Jira issues repo where all these
problems and their solutions are maintained so t
Hi,
I was wondering: there were so many problems and their solutions
discussed in mails; do we have any Jira issues repo where all these
problems and their solutions are maintained so that any user can check out the
issues in the repo to understand whether their problems are already dis
hile a few features are not supported yet, you could view the
document [2] to check.
[1] https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/libs/cep/
[2] https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/table/sql/queries/match_recognize/
Best, JING ZHANG
Dipanjan Mazumd
Hi,
Can we say that Flink SQL is kind of a DSL overlay on Flink CEP? I mean, I
need a DSL for Flink CEP so that I can decouple the CEP rules from code and
pass them dynamically to be applied on different data streams. Flink CEP doesn't
have any DSL implementation, so is it that Flink SQL c
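(Flink SQL's MATCH_RECOGNIZE clause, document [2] above, is effectively that SQL-level DSL; it is implemented on top of Flink CEP, so patterns can be shipped as plain SQL strings instead of compiled Java code. A small sketch; the "events" table and its columns are placeholders and would have to be registered first, e.g. via a CREATE TABLE over the Kafka topic.)

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class MatchRecognizeSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

        // The pattern is just a SQL string, so it can be loaded from a file or
        // a config store and changed without recompiling the job.
        String rule =
                "SELECT * FROM events " +
                "MATCH_RECOGNIZE (" +
                "  PARTITION BY node_id" +
                "  ORDER BY event_time" +            // must be a time attribute of the table
                "  MEASURES A.event_time AS warn_time," +
                "           C.event_time AS critical_time" +
                "  PATTERN (A B* C)" +
                "  DEFINE" +
                "    A AS A.status = 'WARN'," +
                "    C AS C.status = 'CRITICAL'" +
                ")";

        tEnv.executeSql(rule).print();
    }
}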
Paul wrote:
Hi Dipanjan,
I am afraid there are no foreseeable efforts planned, but if you find a nice
addition, you can start a discussion in the community about this feature.
Best, Fabian
On 2. Jun 2021, at 12:10, Dipanjan Mazumder wrote:
Hi Fabian,
Understood but is
solution you have to directly reach out to the maintainers.
Best, Fabian
On 2. Jun 2021, at 08:37, Dipanjan Mazumder wrote:
Hi,
I am currently using Siddhi CEP with Flink, but the flink-siddhi library
has limited support for Flink versions, and I will either need to fix the library
or get tied
Hi,
I am currently using Siddhi CEP with Flink, but the flink-siddhi library
has limited support for Flink versions, and I will either need to fix the library
or get tied to a fixed version of Flink to use the library.
I am looking at Flink CEP as an option, and also came across a Flink CEP DSL
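(For comparison, the native Flink CEP Pattern API expresses this kind of rule in Java code, which is exactly what makes dynamic rule loading harder without a DSL on top. A minimal sketch assuming the flink-cep dependency; the Event POJO and the status field it checks are invented for illustration.)

import java.util.List;
import java.util.Map;

import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternStream;
import org.apache.flink.cep.functions.PatternProcessFunction;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.util.Collector;

public class CepPatternSketch {

    /** Invented event type for the sketch. */
    public static class Event {
        public String status;
    }

    public static DataStream<String> detect(DataStream<Event> events) {
        // "WARN" eventually followed by "CRITICAL", expressed in code rather than a DSL.
        Pattern<Event, ?> pattern = Pattern.<Event>begin("warn")
                .where(new SimpleCondition<Event>() {
                    @Override
                    public boolean filter(Event e) {
                        return "WARN".equals(e.status);
                    }
                })
                .followedBy("critical")
                .where(new SimpleCondition<Event>() {
                    @Override
                    public boolean filter(Event e) {
                        return "CRITICAL".equals(e.status);
                    }
                });

        PatternStream<Event> matches = CEP.pattern(events, pattern);

        return matches.process(new PatternProcessFunction<Event, String>() {
            @Override
            public void processMatch(Map<String, List<Event>> match,
                                     Context ctx,
                                     Collector<String> out) {
                out.collect("WARN followed by CRITICAL detected");
            }
        });
    }
}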
Hi,
I have integrated the flink-siddhi library ([com.github.haoch/flink-siddhi_2.11
"0.2.2-SNAPSHOT"]), and when I tried to configure and implement a control stream from
flink-siddhi it broke with an AbstractMethodError. When I tried running the
same with Flink 1.11.0 it worked.
More details are given
used for development discussions.
Cheers, Till
On Tue, Jun 1, 2021 at 1:31 PM Dipanjan Mazumder wrote:
Hi,
I have integrated the flink-siddhi library ([com.github.haoch/flink-siddhi_2.11
"0.2.2-SNAPSHOT"]), and I tried to configure and implement a control stream from
flink-siddhi an
31, 2021 at 7:27 PM Dipanjan Mazumder
wrote:
Hi,
I was trying to do checkpointing while using Siddhi as the CEP engine
running on Flink. While using Siddhi windowing, it uses an internal state to
aggregate or perform operations on a bucket of events pertaining to a specific
time window
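(Enabling checkpointing itself is a one-liner on the execution environment; whether the Siddhi window contents survive a restart then depends on whether the flink-siddhi operator snapshots its internal Siddhi state into Flink managed state, which is worth verifying in that library. A minimal sketch of the Flink side, with placeholder interval and checkpoint path.)

import org.apache.flink.runtime.state.filesystem.FsStateBackend;
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class CheckpointingSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Take a checkpoint every 30 seconds with exactly-once semantics.
        env.enableCheckpointing(30_000, CheckpointingMode.EXACTLY_ONCE);
        env.getCheckpointConfig().setMinPauseBetweenCheckpoints(10_000);

        // Placeholder path; any durable filesystem (HDFS, S3, ...) works.
        env.setStateBackend(new FsStateBackend("file:///tmp/flink-checkpoints"));

        // Trivial pipeline stands in for the Siddhi/CEP job.
        env.fromElements("a", "b", "c").print();

        env.execute("checkpointing-sketch");
    }
}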
, 12:48:59 PM GMT+5:30, Dipanjan
Mazumder wrote:
Hi All,
Found the solution.
Problem: I was actually using an intermediate library to integrate Siddhi with
Flink (https://github.com/haoch/flink-siddhi), and I was creating a SiddhiCEP
instance and then calling "define()" on tha
Hi All,
Found the solution.
Problem: I was actually using an intermediate library to integrate Siddhi with
Flink (https://github.com/haoch/flink-siddhi), and I was creating a SiddhiCEP
instance and then calling "define()" on that instance, while I was registering
the extension on the crea
Hi,
I am trying to integrate Siddhi with Flink. While trying to use a Siddhi
extension function, on deploying the job in the Flink cluster it is not able to find
those libraries at run time, so I had to explicitly put those libraries into the
/opt/flink/lib folder for the jobmanager and taskmanager
Hi,
Thanks again for responding to my earlier queries.
I was again going through the Flink SQL client code and came across the
default custom command line; a few days back I came to know that the Flink SQL
client is not supported in a full-fledged cluster with different resource
managers like
GMT+5:30, Dipanjan Mazumder
wrote:
Hi,
I was going through the Flink SQL client code and came across a
flow where we load flink-conf.yaml into the Configuration object as a
prerequisite for the SQL client to start. I can see that the configuration file
has properties
Hi,
I was going through the Flink SQL client code and came across a
flow where we load flink-conf.yaml into the Configuration object as a
prerequisite for the SQL client to start. I can see that the configuration file
has properties pertaining to the Flink cluster. As far as m
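(For reference, that prerequisite roughly boils down to Flink's GlobalConfiguration loading flink-conf.yaml from a config directory into a Configuration object, from which cluster-facing settings such as the JobManager address can be read. A small sketch with a placeholder config path.)

import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.GlobalConfiguration;
import org.apache.flink.configuration.JobManagerOptions;
import org.apache.flink.configuration.RestOptions;

public class LoadFlinkConfSketch {
    public static void main(String[] args) {
        // Placeholder path; the SQL client normally resolves this from FLINK_CONF_DIR.
        Configuration conf = GlobalConfiguration.loadConfiguration("/opt/flink/conf");

        // Cluster-facing properties read from flink-conf.yaml.
        String jobManagerAddress = conf.getString(JobManagerOptions.ADDRESS, "localhost");
        int restPort = conf.getInteger(RestOptions.PORT, 8081);

        System.out.println("jobmanager.rpc.address = " + jobManagerAddress);
        System.out.println("rest.port = " + restPort);
    }
}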
deployed in standalone mode.
Hope that helps.
Best, Xingcan
On Sep 7, 2019, at 1:57 AM, Dipanjan Mazumder wrote:
Hi Guys,
I was going through the Flink SQL client configuration YAML from the
training example and came across a section in the configuration as below
Hi Guys,
I was going through the Flink SQL client configuration YAML from the
training example and came across a section in the configuration as below:
# Deployment properties allow for describing the cluster to which table
# pr