Hey all, I'm using the JDBC sink to materialise a Kafka topic in a Postgres table. The records in the topic are Avro messages. I'm running into this problem because one of the fields in the messages is an array:
org.apache.kafka.connect.errors.ConnectException: null (ARRAY) type doesn't have a mapping to the SQL database column type

Now, that makes sense to me. I'm wondering if there is any way to get this data into the database with the built-in JDBC connector or otherwise. I have no problem with it just being a string representation of the array, although I would prefer to use Postgres's native JSON or ARRAY types. Has anyone had any experience with this? The only real constraint is that I want upsert semantics, or at least some method which doesn't result in duplicates in the table.

Thanks,
Alex
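For context, here's roughly the sink config I'm running with (connector name, connection URL, topic, and key field are placeholders, but the upsert settings are the ones I care about):

```json
{
  "name": "pg-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:postgresql://localhost:5432/mydb",
    "topics": "my-topic",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "pk.fields": "id",
    "auto.create": "true"
  }
}
```

With `insert.mode=upsert` and the primary key taken from the record key, re-delivered records just overwrite the existing row instead of duplicating it, which is the behaviour I want to keep whatever the solution for the array field turns out to be.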