Hi,

I am trying out Kafka Connect and have a couple of questions. One of our apps 
publishes raw binary data directly to Kafka, and we want to create a Kafka 
Connect sink to move that raw data into something like Cassandra. Because the 
data is published directly to Kafka, it carries none of the Kafka Connect 
metadata such as a schema, so our Cassandra sink connector fails to parse it. 
It seems we could write a custom converter that handles the raw data and fills 
out SchemaAndValue in a very basic way to make it work, but I'm not sure 
whether this is the correct approach. If it is, the only way to use the 
converter, at least in standalone mode, appears to be setting the same 
converter for all of Connect, since it doesn't look like the converter config 
can be overridden per connector. So any new sinks and sources we write would 
have to use our custom converter instead of the default.
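Something like this is what I had in mind, as a rough sketch. The class name 
RawBytesConverter is made up, and the Schema/SchemaAndValue/Converter types are 
stubbed locally here just so the example stands on its own; a real 
implementation would import them from connect-api and implement 
org.apache.kafka.connect.storage.Converter, using Schema.BYTES_SCHEMA:

```java
import java.util.Arrays;
import java.util.Map;

// --- Minimal local stand-ins mirroring the Kafka Connect API ---
// In a real connector these come from org.apache.kafka.connect
// (data.Schema, data.SchemaAndValue, storage.Converter); they are
// stubbed here only to keep the sketch self-contained.
interface Schema {
    Schema BYTES_SCHEMA = new Schema() {};
}

class SchemaAndValue {
    private final Schema schema;
    private final Object value;

    SchemaAndValue(Schema schema, Object value) {
        this.schema = schema;
        this.value = value;
    }

    Schema schema() { return schema; }
    Object value() { return value; }
}

interface Converter {
    void configure(Map<String, ?> configs, boolean isKey);
    byte[] fromConnectData(String topic, Schema schema, Object value);
    SchemaAndValue toConnectData(String topic, byte[] value);
}

// A pass-through converter: wraps the raw bytes from the topic in a
// SchemaAndValue tagged with a bare bytes schema, and unwraps them on
// the way back out. No parsing, no envelope, no schema registry.
class RawBytesConverter implements Converter {
    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // Nothing to configure for a pass-through converter.
    }

    @Override
    public byte[] fromConnectData(String topic, Schema schema, Object value) {
        if (value == null) {
            return null;
        }
        if (!(value instanceof byte[])) {
            throw new IllegalArgumentException(
                "RawBytesConverter only handles byte[] data, got: " + value.getClass());
        }
        return (byte[]) value;
    }

    @Override
    public SchemaAndValue toConnectData(String topic, byte[] value) {
        return new SchemaAndValue(Schema.BYTES_SCHEMA, value);
    }
}
```

If this is the right approach, I assume we would register it in the standalone 
worker properties with something like 
value.converter=com.example.RawBytesConverter (package name hypothetical), 
which is exactly what raises the question above about it then applying to every 
connector on the worker.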

Thanks,

Eric


________________________________

    Eric Lachman
     Software Developer I

     Spot Trading L.L.C
     440 South LaSalle St., Suite 2800
     Chicago, IL 60605
     Office: 312.362.4550
     Direct:
     Fax: 312.362.4551
     eric.lach...@spottradingllc.com
     www.spottradingllc.com

________________________________

