You have shared the Kafka Connect worker properties but not the source connector
config.
Which source connector are you using? Does it override the default settings you
provided?
Are you running the connector in standalone mode or distributed mode?
Also what are you using to consume the messages and see
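(For reference, the two modes are typically launched like this; the file names below are illustrative, not from this thread:)

```shell
# Standalone mode: worker and connector config files are passed on the command line
bin/connect-standalone.sh config/connect-standalone.properties my-source-connector.properties

# Distributed mode: start the worker, then submit the connector over the REST API
bin/connect-distributed.sh config/connect-distributed.properties
curl -X POST -H "Content-Type: application/json" \
     --data @my-source-connector.json http://localhost:8083/connectors
```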
Hi Hans,
Thank you for your quick response, appreciate it.
In the *kafka-connect* docker container, I see the settings below in the
*kafka-connect.properties* file in the *kafka-connect* directory:
key.converter.schemas.enable=false
key.converter.schema.registry.url=http://kafka-schema-registry:
value.converter.schema.re
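(Editor's note: for comparison, a worker configuration that emits plain JSON without the schema envelope would look roughly like this; with the JsonConverter no schema registry URL is needed:)

```properties
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```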
My earlier comment still applies, but in Kafka Connect the equivalent of a
serializer/deserializer (serde) is called a "converter".
Check which converter you have configured for your source connector and
whether it overrides the default converter configured for the Connect
worker.
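(If the worker default can't be changed, Kafka Connect also lets a connector override converters in its own configuration. A sketch, where the connector name and class are placeholders:)

```json
{
  "name": "my-source-connector",
  "config": {
    "connector.class": "...",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false"
  }
}
```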
Check which serializer you have configured in your producer. You are probably
using an Avro serializer, which will add the schema and convert the payload to
Avro data. You can use a String serializer or a ByteArray serializer instead,
and the data will either be Base64 encoded or not encoded at all.
-han
Hi,
I would like to add that I use kafka-connect and schema-registry version
`3.2.1-6`.
Best regards,
Mina
On Fri, Jun 2, 2017 at 10:59 AM, Mina Aslani wrote:
Hi.
Is there any way that I can get the data into a Kafka topic in JSON format?
The source that I ingest the data from has the data in JSON format;
however, when I look at the data in the Kafka topic, schema and payload fields
are added and the data is not in JSON format.
I want to avoid implementing a tra
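(Editor's note: the extra schema and payload fields come from Kafka Connect's
JsonConverter when `schemas.enable` is true, which is the default; with it set
to false the bare JSON comes through. A minimal Python sketch of the two
shapes, with illustrative record and field names:)

```python
import json

# A hypothetical source record, as plain JSON.
record = {"id": 1, "name": "mina"}

# With schemas.enable=true (the JsonConverter default), Kafka Connect wraps
# every record in an envelope that carries an inline schema description:
with_schema = {
    "schema": {
        "type": "struct",
        "fields": [
            {"field": "id", "type": "int32"},
            {"field": "name", "type": "string"},
        ],
    },
    "payload": record,
}

# With schemas.enable=false the converter emits only the bare JSON payload:
without_schema = record

print(json.dumps(with_schema))
print(json.dumps(without_schema))
```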