Hi Sebastien,
I think you can do that with Flink's Table API / SQL and the
KafkaJsonTableSource.
Note that in Flink 1.4.2, the KafkaJsonTableSource does not support flat
JSON yet.
You'd also need a table-valued UDF to parse the message and join the
result with the original row. Depen
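For reference, a rough sketch of registering such a source with the 1.4-series builder API (topic name, schema fields, and properties here are made up for illustration; please check the Kafka connector docs for the exact method names):

```java
import java.util.Properties;
import org.apache.flink.table.api.TableSchema;
import org.apache.flink.table.api.Types;
import org.apache.flink.streaming.connectors.kafka.Kafka010JsonTableSource;

// Hypothetical setup: a "logs" topic whose JSON records carry
// "type" and "tags" fields added by Logstash.
Properties props = new Properties();
props.setProperty("bootstrap.servers", "localhost:9092");
props.setProperty("group.id", "log-reader");

Kafka010JsonTableSource source = Kafka010JsonTableSource.builder()
    .forTopic("logs")
    .withKafkaProperties(props)
    .withSchema(TableSchema.builder()
        .field("type", Types.STRING())
        .field("tags", Types.STRING())
        .build())
    .build();

// Register the source so it can be queried with Table API / SQL.
tableEnv.registerTableSource("Logs", source);
```

This is a fragment, not a complete job; `tableEnv` is assumed to be an existing `StreamTableEnvironment`.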
Hi,
Assuming that you're looking at a streaming use case, I think this is a
better approach:
1. Send Avro from Logstash, for better performance.
2. Deserialize it to a POJO.
3. Do logic...
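A minimal sketch of step 2, using plain Java serialization as a stand-in for Avro (a real pipeline would plug an Avro DeserializationSchema into the Kafka consumer instead; the `LogEvent` POJO and its fields are invented for illustration):

```java
import java.io.*;

// Hypothetical POJO for one log record; field names are illustrative.
public class LogEvent implements Serializable {
    private static final long serialVersionUID = 1L;
    public String type;
    public String tags;

    public LogEvent(String type, String tags) {
        this.type = type;
        this.tags = tags;
    }

    // Stand-in for the deserialize step: encode the POJO to bytes and
    // decode it back, as a consumer would do for each Kafka record.
    public static LogEvent roundTrip(LogEvent in) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
            out.writeObject(in);
        }
        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray()))) {
            return (LogEvent) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        LogEvent e = roundTrip(new LogEvent("syslog", "firewall"));
        // Step 3 ("do logic") then works on a typed object:
        System.out.println(e.type + " / " + e.tags); // prints "syslog / firewall"
    }
}
```

Once records arrive as typed POJOs, the downstream logic is ordinary Java on the fields rather than string/JSON handling.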
On Mon, Apr 23, 2018 at 4:03 PM, Lehuede sebastien wrote:
> Hi Guys,
>
> I'm actually trying to un
Hi Guys,
I'm currently trying to understand the purpose of the Table API, and in
particular KafkaJsonTableSource. I'm trying to see whether it could be
useful for my use case.
Here is my context:
I send logs to Logstash, where I add some information (Type, Tags);
Logstash sends the logs to Kafka in JSON format, and finally i