Many thanks!
On Wed., 15 July 2020 at 15:58, Aljoscha Krettek <
aljos...@apache.org> wrote:
On 11.07.20 10:31, Georg Heiler wrote:
> 1) similarly to Spark the Table API works on some optimized binary
> representation
> 2) this is only available in the SQL way of interaction - there is no
> programmatic API

Yes, it's available from SQL, but also from the Table API, which is a
programmatic declarative API.
Hi,

Many thanks.
So do I understand correctly that:
1) similarly to Spark the Table API works on some optimized binary
representation
2) this is only available in the SQL way of interaction - there is no
programmatic API

This leads me then to some questions:
q1) I have read somewhere (I think …
Hi Georg,

I'm afraid the other suggestions are missing the point a bit. From your
other emails it seems you want to use Kafka with JSON records together
with the Table API/SQL. For that, take a look at [1], which describes how
to define data sources for the Table API. Especially the Kafka and JSON …
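The Table API route boils down to declaring the record schema in DDL and letting the built-in 'json' format derive the deserializer. A minimal sketch, assuming Flink 1.11 with the Kafka connector and JSON format on the classpath; the topic, column names, and broker address are placeholders, not anything from the thread:

```scala
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
import org.apache.flink.table.api.bridge.scala.StreamTableEnvironment

object KafkaJsonTable {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val tableEnv = StreamTableEnvironment.create(env)

    // The 'json' format maps each Kafka record onto the declared columns,
    // so no hand-written decoder is needed.
    tableEnv.executeSql(
      """CREATE TABLE events (
        |  id BIGINT,
        |  name STRING
        |) WITH (
        |  'connector' = 'kafka',
        |  'topic' = 'events',
        |  'properties.bootstrap.servers' = 'localhost:9092',
        |  'scan.startup.mode' = 'earliest-offset',
        |  'format' = 'json'
        |)""".stripMargin)

    // The table is now usable from SQL or the programmatic Table API.
    val result = tableEnv.sqlQuery("SELECT id, name FROM events")
  }
}
```

This is connector configuration more than logic: the schema declaration replaces the per-field decoding code that the DataStream approaches below write by hand.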
Hi Georg,

You can try using the circe library for this, which has a way to
automatically generate JSON decoders for Scala case classes.

As was mentioned earlier, Flink does not come packaged with
JSON-decoding generators for Scala like Spark does.

On Thu, Jul 9, 2020 at 4:45 PM Georg Heiler wrote:
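The circe suggestion can be sketched as follows. The case class `Event` and the JSON payload are invented for illustration; `deriveDecoder` generates the decoder at compile time from the case class fields, so nothing is coded per field (assumes circe-core, circe-parser, and circe-generic on the classpath):

```scala
import io.circe.Decoder
import io.circe.generic.semiauto.deriveDecoder
import io.circe.parser.decode

// Hypothetical case class standing in for the real record type.
final case class Event(id: Long, name: String)

object CirceDecode {
  // Derived at compile time; no handwritten field mapping.
  implicit val eventDecoder: Decoder[Event] = deriveDecoder[Event]

  // Right(Event(...)) on success, Left(error) on malformed input.
  def parse(json: String): Either[io.circe.Error, Event] =
    decode[Event](json)
}
```

`CirceDecode.parse` can then be applied inside a Flink `map`; returning `Either` lets malformed records be routed to a side output instead of failing the job.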
Great, thanks!
But would it be possible to automate this, i.e. to have this work
automatically for the case class / product?
On Thu., 9 July 2020 at 20:21, Taher Koitawala <
taher...@gmail.com> wrote:
The performant way would be to apply a map function over the stream and
then use the Jackson ObjectMapper to convert to Scala objects. In Flink
there is no API like Spark's to automatically get all fields.
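A minimal sketch of this map-plus-ObjectMapper approach, assuming jackson-module-scala is on the classpath so case classes deserialize without extra configuration (the case class is illustrative):

```scala
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Hypothetical record type for illustration.
final case class Event(id: Long, name: String)

object JsonMap {
  // ObjectMapper is thread-safe once configured; share one instance
  // per task rather than creating one per record.
  val mapper: ObjectMapper =
    new ObjectMapper().registerModule(DefaultScalaModule)

  def toEvent(json: String): Event =
    mapper.readValue(json, classOf[Event])
}

// Inside a job this would be applied as: stream.map(JsonMap.toEvent _)
```

Unlike the circe version, a parse failure here throws, so in a real job it would typically be wrapped in a Try.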
On Thu, Jul 9, 2020, 11:38 PM Georg Heiler wrote:
How can I use it with a Scala case class?

If I understand it correctly, for better performance the ObjectMapper is
already initialized in each KafkaConsumer, returning ObjectNodes. So
probably I should rephrase to: how can I then map these to case classes
without handcoding it? https://github.c
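If the Kafka source already yields Jackson ObjectNodes (as Flink's JSON key/value deserialization schema does), one way to avoid handcoding the mapping is `ObjectMapper.treeToValue`, which converts an already-parsed JSON tree into a case class without re-reading the raw bytes. A sketch under that assumption; the case class and field names are invented, and jackson-module-scala is assumed present:

```scala
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.databind.node.ObjectNode
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Hypothetical record type for illustration.
final case class Event(id: Long, name: String)

object NodeToCaseClass {
  val mapper: ObjectMapper =
    new ObjectMapper().registerModule(DefaultScalaModule)

  // Maps an already-parsed JSON tree onto the case class fields.
  def convert(node: ObjectNode): Event =
    mapper.treeToValue(node, classOf[Event])
}
```

In a job this would sit in a map over the `DataStream[ObjectNode]` produced by the Kafka source.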
You can try the Jackson ObjectMapper library, and that will get you from
JSON to objects.
Regards,
Taher Koitawala
On Thu, Jul 9, 2020, 9:54 PM Georg Heiler wrote:
> Hi,
>
> I want to map a stream of JSON documents from Kafka to a Scala case class.
> How can this be accomplished using the JSONKey…