You can land Kafka data on Hadoop via the Hadoop consumer in contrib,
ETL it with Talend or Pig, and then emit the transformed records back to
Kafka via the Hadoop producer in contrib.

https://github.com/kafka-dev/kafka/tree/master/contrib
http://docs.hortonworks.com/CURRENT/index.htm#Data_Integration_Services_With_HDP/Using_Data_Integration_Services_Powered_By_Talend/Using_Talend.htm
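To make the consume-ETL-produce flow concrete, here is a minimal sketch of the transform step in Python. The field names and cleanup rules are hypothetical, and this stands in for what Pig or Talend would do between the contrib Hadoop consumer and producer; it is not the contrib code itself.

```python
import json

def transform(record: bytes) -> bytes:
    """Hypothetical ETL step: parse a JSON event, keep only the
    fields we care about, and normalize the user id to lowercase."""
    event = json.loads(record)
    cleaned = {
        "user": event["user"].lower(),
        "action": event["action"],
    }
    return json.dumps(cleaned, sort_keys=True).encode("utf-8")

# Records landed on HDFS by the contrib Hadoop consumer would be run
# through a step like this before the contrib Hadoop producer writes
# the results back to a Kafka topic.
raw = b'{"user": "Alice", "action": "login", "debug": "drop-me"}'
print(transform(raw))  # b'{"action": "login", "user": "alice"}'
```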

Russell Jurney http://datasyndrome.com

On Jan 6, 2013, at 2:29 PM, David Arthur <mum...@gmail.com> wrote:

Storm has support for Kafka, if that's the sort of thing you're looking
for. Maybe you could describe your use case a bit more?

On Sunday, January 6, 2013, Guy Doulberg wrote:

Hi


I am looking for an ETL tool that can connect to Kafka, both as a
consumer and as a producer.


Have you heard of such a tool?


Thanks

Guy
-- 
David Arthur
