single event, how do we
evaluate rules that require pattern detection across events (match rule if
A happens, followed by B)? See the CEP sketch below.
3. How do you dynamically define a window function?
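
For the pattern-detection question, Flink's CEP library expresses this kind
of "A followed by B" rule directly. A minimal sketch (the Event POJO, the
field names, and the ten-minute bound are all illustrative):

import java.util.List;
import java.util.Map;

import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternSelectFunction;
import org.apache.flink.cep.PatternStream;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.windowing.time.Time;

public class FollowedByRule {

    // Illustrative event POJO; the real schema depends on what is ingested.
    public static class Event {
        public String type;
        public String payload;
    }

    public static DataStream<String> applyRule(DataStream<Event> events) {
        // "Match rule if A happens, followed by B", bounded to ten minutes.
        Pattern<Event, ?> pattern = Pattern.<Event>begin("first")
                .where(new SimpleCondition<Event>() {
                    @Override
                    public boolean filter(Event e) {
                        return "A".equals(e.type);
                    }
                })
                .followedBy("second")
                .where(new SimpleCondition<Event>() {
                    @Override
                    public boolean filter(Event e) {
                        return "B".equals(e.type);
                    }
                })
                .within(Time.minutes(10));

        PatternStream<Event> matches = CEP.pattern(events, pattern);

        // Emit an alert string for every A-then-B match.
        return matches.select(new PatternSelectFunction<Event, String>() {
            @Override
            public String select(Map<String, List<Event>> match) {
                return "matched: " + match.get("first").get(0).payload
                        + " -> " + match.get("second").get(0).payload;
            }
        });
    }
}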
--Aarti
--
Aarti Gupta <https://www.linkedin.com/company/qualys>
Director, Engineering, Correlation
+Ken.
--Aarti
On Thu, Jul 5, 2018 at 6:48 PM, Aarti Gupta wrote:
> Thanks everyone, will take a look.
>
> --Aarti
>
> On Thu, Jul 5, 2018 at 6:37 PM, Fabian Hueske wrote:
>
>> Hi,
>>
>> > Flink doesn't support connecting multiple streams with heterogeneous
>> > schemas.
>>
>> c) You can use a single flatmap function to
>> evaluate all these streams against your defined rules.
>>
>> d) For storing your aggregations and rules, you can build your cache layer
>> and pass it as an argument
>> to the constructor of that flatmap.
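>>
>> A minimal sketch of (c) and (d) together (Event, Alert, and Rule below
>> stand in for your own types; the rule list must be Serializable so Flink
>> can ship the function to the workers):
>>
>> import java.io.Serializable;
>> import java.util.List;
>>
>> import org.apache.flink.api.common.functions.FlatMapFunction;
>> import org.apache.flink.util.Collector;
>>
>> // Illustrative event, alert, and rule types.
>> class Event { String type; String payload; }
>> class Alert {
>>     Alert(String ruleId, Event event) { /* carry rule id and event */ }
>> }
>> interface Rule extends Serializable {
>>     String id();
>>     boolean matches(Event event);
>> }
>>
>> public class RuleEvaluator implements FlatMapFunction<Event, Alert> {
>>
>>     // The "cache layer": rules are loaded once up front and handed to
>>     // the constructor, so each parallel instance carries its own copy.
>>     private final List<Rule> rules;
>>
>>     public RuleEvaluator(List<Rule> rules) {
>>         this.rules = rules;
>>     }
>>
>>     @Override
>>     public void flatMap(Event event, Collector<Alert> out) {
>>         for (Rule rule : rules) {
>>             if (rule.matches(event)) {
>>                 out.collect(new Alert(rule.id(), event));
>>             }
>>         }
>>     }
>> }
>>
>> // usage: unionedStream.flatMap(new RuleEvaluator(loadedRules));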
>
> With that you should be able to get rid of the full Logstash piece and use
> only the Grok part.
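>
> A sketch of that inside a Flink map function, assuming the java-grok
> library (io.krakens:java-grok); the Apache access-log pattern is just an
> example:
>
> import java.util.Map;
>
> import io.krakens.grok.api.Grok;
> import io.krakens.grok.api.GrokCompiler;
> import org.apache.flink.api.common.functions.RichMapFunction;
> import org.apache.flink.configuration.Configuration;
>
> public class GrokParser extends RichMapFunction<String, Map<String, Object>> {
>
>     private transient Grok grok; // compiled per task, not serialized
>
>     @Override
>     public void open(Configuration parameters) {
>         GrokCompiler compiler = GrokCompiler.newInstance();
>         compiler.registerDefaultPatterns();
>         grok = compiler.compile("%{COMMONAPACHELOG}"); // example pattern
>     }
>
>     @Override
>     public Map<String, Object> map(String line) {
>         return grok.match(line).capture(); // field name -> value
>     }
> }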
>
> Another solution: if you have logs/events in CEF format, for example, you
> can just use 'split' in the flatmap function.
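>
> A simplified version (it ignores escaped pipes, which CEF allows in
> header fields):
>
> import org.apache.flink.api.common.functions.FlatMapFunction;
> import org.apache.flink.util.Collector;
>
> public class CefSplitter implements FlatMapFunction<String, String[]> {
>     @Override
>     public void flatMap(String line, Collector<String[]> out) {
>         if (line.startsWith("CEF:")) {
>             // version|vendor|product|device version|signature|name|severity|extension
>             out.collect(line.split("\\|", 8)); // limit 8 keeps the extension whole
>         }
>     }
> }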
>
> Hope this helps.
>
are ingested.
The same object instance is shared with the API and the Flink execution
environment; however, the output of the API does not get ingested into the
Flink DataStream.
Is this the right pattern to use, or is Kafka the recommended way of
streaming data into Flink?
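
For reference, a minimal sketch of the Kafka route (the broker address,
topic name, group id, and the Kafka 0.11 connector class are placeholders
for whatever fits the deployment):

import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

public class KafkaIngestJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder
        props.setProperty("group.id", "rule-engine");             // placeholder

        // The API writes events to the topic; the Flink job consumes them,
        // so the two processes are decoupled instead of sharing an object.
        DataStream<String> events = env.addSource(
                new FlinkKafkaConsumer011<>("events", new SimpleStringSchema(), props));

        events.print();
        env.execute("kafka-ingest");
    }
}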
--Aarti
--
Aarti Gupta <https://www.linkedin.com/company/qualys>
Director, Engineering, Correlation