Re: Json to JDBC using Kafka JDBC connector Sink

2017-01-10 Thread Gwen Shapira
Ewen: I think he was looking for exactly what you were guessing he doesn't want: "My goal is to pipe that json document in a postgres table that has two columns: id and json." Postgres has some nice built-in functions that make this actually useful and not as nuts as it may appear. As Ewen mentioned,
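A minimal sketch of the kind of setup Gwen is alluding to. The table and column names (`events`, `id`, `doc`) and the sample field names are illustrative assumptions, not something from the thread; the operators themselves (`jsonb`, `->>`, `@>`, GIN indexes) are standard Postgres:

```
-- Hypothetical two-column table: a synthetic key plus the raw document.
CREATE TABLE events (
    id  text PRIMARY KEY,   -- e.g. topic+partition+offset
    doc jsonb NOT NULL      -- the untouched JSON document
);

-- Built-in JSON operators make the opaque blob queryable after the fact:
SELECT doc->>'user_id' AS user_id     -- extract a field as text
FROM events
WHERE doc @> '{"type": "click"}';     -- containment test on the document

-- A GIN index keeps containment queries fast as the table grows.
CREATE INDEX events_doc_idx ON events USING GIN (doc);
```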

Re: Json to JDBC using Kafka JDBC connector Sink

2017-01-10 Thread Ewen Cheslack-Postava
Anything with a table structure is probably not going to handle schemaless data (i.e. JSON) very well without some extra help -- tables usually expect schemas and JSON doesn't have a schema. As it stands today, the JDBC sink connector will probably not handle your use case. To send schemaless data
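For context on what that "extra help" looks like: with Kafka Connect's JsonConverter and `value.converter.schemas.enable=true`, each message must carry an explicit schema envelope alongside its payload, which is how the JDBC sink learns the column types. A small Python sketch of that envelope shape; the field names and values here are illustrative assumptions:

```python
import json

# Envelope Kafka Connect's JsonConverter expects when schemas.enable=true:
# a "schema" struct describing the fields, plus the matching "payload".
record_value = {
    "schema": {
        "type": "struct",
        "fields": [
            {"field": "id", "type": "string", "optional": False},
            {"field": "json", "type": "string", "optional": False},
        ],
        "optional": False,
        "name": "example.Record",  # hypothetical schema name
    },
    "payload": {
        "id": "mytopic+0+42",                      # hypothetical key
        "json": json.dumps({"anything": "goes"}),  # raw doc kept as a string
    },
}

# This is the byte form that would be produced to the Kafka topic.
encoded = json.dumps(record_value).encode("utf-8")
print(json.loads(encoded)["payload"]["id"])  # mytopic+0+42
```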

Re: Json to JDBC using Kafka JDBC connector Sink

2017-01-09 Thread william tellme
unsubscribe On Mon, Jan 9, 2017 at 6:14 PM, Stephane Maarek < steph...@simplemachines.com.au> wrote: > Hi, > > I’m wondering if the following is feasible… > I have a json document with pretty much 0 schema. The only thing I know for > sure is that it’s a json document. > My goal is to pipe that j

Json to JDBC using Kafka JDBC connector Sink

2017-01-09 Thread Stephane Maarek
Hi, I’m wondering if the following is feasible… I have a json document with pretty much zero schema. The only thing I know for sure is that it’s a json document. My goal is to pipe that json document into a Postgres table that has two columns: id and json. The id column is basically topic+partition+off
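One way to read the goal: each sink row gets a synthetic key derived from the record's coordinates, with the document stored beside it untouched. A small Python sketch of that mapping; the function, the `+` separator, and the sample values are assumptions for illustration, not something the JDBC sink provides out of the box:

```python
import json

def to_row(topic: str, partition: int, offset: int, value: bytes):
    """Map a Kafka record to the (id, json) pair described in the thread.

    id is the record's unique coordinates joined with '+'; the value is
    kept as an opaque JSON string (validated, never reshaped).
    """
    doc = value.decode("utf-8")
    json.loads(doc)  # raise early if the payload is not valid JSON
    return (f"{topic}+{partition}+{offset}", doc)

row = to_row("mytopic", 0, 42, b'{"user": "alice", "clicks": 3}')
print(row[0])  # mytopic+0+42
```

Since topic+partition+offset uniquely identifies a record, writing it as the primary key also makes the insert idempotent on retries.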