(forwarding this to user@ as it is more suited to be located there)
Hi Sunil,
With remote functions (using the Python SDK), messages sent to / from them
must be Protobuf messages.
This is a requirement since remote functions can be written in any
language, and we use Protobuf as a cross-language message format.

Checking to see if this is possible currently:
Read JSON data from a Kafka topic => process using StateFun => write out to
Kafka in JSON format.
I could run a separate process that reads the source JSON data and converts
it to Protobuf into another Kafka topic, but that sounds inefficient.
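For illustration, the "separate process" bridge mentioned above could be sketched roughly as below. Everything here is hypothetical: the `GreetRequest` schema (fields `name = 1`, `count = 2`) and the hand-rolled wire-format encoder are stand-ins. A real bridge would consume from Kafka with a client library and serialize with `protoc`-generated classes via `SerializeToString()`.

```python
import json

def encode_varint(n: int) -> bytes:
    # Protobuf base-128 varint encoding (non-negative ints only).
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        out.append(b | (0x80 if n else 0))
        if not n:
            return bytes(out)

def encode_greet_request(record: dict) -> bytes:
    # Hypothetical message: GreetRequest { string name = 1; int32 count = 2; }
    # Each field is prefixed by a tag byte: (field_number << 3) | wire_type.
    name = record["name"].encode("utf-8")
    out = b"\x0a" + encode_varint(len(name)) + name   # field 1, wire type 2 (len-delimited)
    out += b"\x10" + encode_varint(record["count"])   # field 2, wire type 0 (varint)
    return out

# What the bridge would do per Kafka record: JSON string in, Protobuf bytes out.
payload = encode_greet_request(json.loads('{"name": "sunil", "count": 3}'))
```

The resulting `payload` is what the Protobuf-topic producer would write as the Kafka record value.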
Thanks Igal,
I don't have control over the data source inside Kafka (the current Kafka
topics contain only JSON or Avro formats; I am trying to reproduce this
scenario using my test data generator).
Is it possible to convert the JSON to Protobuf at the receiving end of the
StateFun application?
Hi,
The values must be valid encoded Protobuf messages [1], while in your
attached code snippet you are sending UTF-8-encoded JSON strings.
You can take a look at this example with a generator that produces Protobuf
messages [2][3].
[1] https://developers.google.com/protocol-buffers/docs/pythontut
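To make the "valid encoded Protobuf" point concrete: on the wire, a Protobuf message is a sequence of (tag, value) records, not text. The stdlib-only walker below is a rough illustration (not the official parser) of why a UTF-8 JSON string fails as Protobuf while a real encoding of, say, `string name = 1` parses cleanly:

```python
def looks_like_protobuf(data: bytes) -> bool:
    # Walk the Protobuf wire format: each record is a varint tag
    # (field_number << 3 | wire_type) followed by a wire-type-sized value.
    n = len(data)

    def varint(i):
        shift, val = 0, 0
        while True:
            if i >= n:
                raise ValueError("truncated varint")
            b = data[i]; i += 1
            val |= (b & 0x7F) << shift
            if not b & 0x80:
                return val, i
            shift += 7

    i = 0
    try:
        while i < n:
            tag, i = varint(i)
            field, wire = tag >> 3, tag & 7
            if field == 0:
                return False
            if wire == 0:             # varint
                _, i = varint(i)
            elif wire == 1:           # fixed 64-bit
                i += 8
            elif wire == 2:           # length-delimited (strings, bytes, submessages)
                length, i = varint(i)
                i += length
            elif wire == 5:           # fixed 32-bit
                i += 4
            else:                     # deprecated groups (3/4) and invalid wire types
                return False
        return i == n                 # must land exactly on the end of the buffer
    except ValueError:
        return False

looks_like_protobuf(b"\x0a\x05hello")       # True: string field 1 = "hello"
looks_like_protobuf(b'{"name": "sunil"}')   # False: UTF-8 JSON is not wire data
```

The first byte of the JSON payload (`{`, 0x7B) decodes to field 15 with wire type 3, which a conforming parser rejects, while `\x0a` is the tag for a length-delimited field 1.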