Actually, your use case should be doable with Flink's Table & SQL API plus some additional UDFs. The API can handle JSON objects as long as they are valid composite types, and you can access arrays as well. The splitting might be a bit tricky in SQL; you could model it simply as a where() clause, or maybe a groupBy().
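
A rough sketch of what I mean (untested, API names as of Flink 1.4; the Event case class and entity names are made up):

import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.TableEnvironment
import org.apache.flink.table.api.scala._

case class Event(entityType: String, payload: String)

val env = StreamExecutionEnvironment.getExecutionEnvironment
val tEnv = TableEnvironment.getTableEnvironment(env)

// your parsed AMQP stream would go here
val events: DataStream[Event] = ???

// register the stream as a table and "split" it with simple predicates
val table  = tEnv.fromDataStream(events, 'entityType, 'payload)
val orders = table.where('entityType === "order")
val users  = table.where('entityType === "user")

Each resulting table can then be converted back into a DataStream and written to its own sink.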

If you experience additional limitations or problems, let us know about it.

Good luck.

Regards,
Timo


On 11/20/17 9:16 PM, Lothium wrote:
Hey Timo,
thanks for your warm welcome and for creating a ticket to fix this!
My scenario is the following:
I receive different JSON entities from an AMQP queue. I have a source that
collects the events; after that I parse them into the different internal case
classes and split the stream by entity type via the split function (see the
sketch below). Then I want to transform the different entities and write the
new JSON format (per entity type) to a file sink.
I know I can transform the case classes into other case classes via Scala
code, but I thought it would sometimes be easier to do the transformations and
also the aggregations via SQL or the Table API. However, some entities contain
JSON arrays of objects, and that currently seems to be the problem.
I will have a look at the UDFs; maybe I can write something for my purpose (a
first idea is sketched below).
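
For context, the splitting part of my pipeline currently looks roughly like
this (simplified; the Event case class and entity names are stand-ins for my
real types):

import org.apache.flink.streaming.api.scala._

case class Event(entityType: String, payload: String)

val env = StreamExecutionEnvironment.getExecutionEnvironment

// source + JSON parsing happen before this point
val parsed: DataStream[Event] = ???

// split the stream by entity type and pick out the individual substreams
val splitStream = parsed.split(event => Seq(event.entityType))
val orders = splitStream.select("order")
val users  = splitStream.select("user")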
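
As a first idea for such a UDF, I was thinking of a scalar function along
these lines (untested; Jackson as the parser and the "name" field are just
assumptions for the example):

import org.apache.flink.table.functions.ScalarFunction
import com.fasterxml.jackson.databind.ObjectMapper

// Scalar UDF that parses a JSON array string and returns the "name" field
// of its first element, or null if the input is not a non-empty array.
class ExtractFirstName extends ScalarFunction {
  @transient private lazy val mapper = new ObjectMapper()

  def eval(json: String): String = {
    val node = mapper.readTree(json)
    if (node.isArray && node.size() > 0) node.get(0).path("name").asText(null)
    else null
  }
}

// usage in the Table API:
// val extractFirstName = new ExtractFirstName
// table.select(extractFirstName('payload))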

Thanks!


