I'm new to Flume and am considering using it for the scenario below.

Our system receives events as HTTP POSTs, and we need to store them in Kafka
(for processing) as well as in HDFS (as a permanent store).

Can we configure Flume as follows?

*   Source: HTTP (expecting a JSON event as the HTTP body, with a dynamic
    topic name in the URI)

*   Channel: Kafka (should store the received JSON body in the topic named
    in the URI)

*   Sink: HDFS (should store the data in a folder named after the same URI
    segment; a rough config sketch follows this list)
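
Something like this sketch is what I have in mind (all names, ports, and
hosts are placeholders; I'm assuming Flume 1.7+ property names, and the
handler class is made up and only sketched further down):

# All names and hosts below are placeholders.
a1.sources  = http-in
a1.channels = kafka-ch
a1.sinks    = hdfs-out

# HTTP source; the custom handler (sketched further down) would copy the
# last URI segment ('abc') into a 'topic' event header.
a1.sources.http-in.type = http
a1.sources.http-in.bind = 0.0.0.0
a1.sources.http-in.port = 8080
a1.sources.http-in.handler = com.example.flume.UriTopicHandler
a1.sources.http-in.channels = kafka-ch

# Kafka channel; as far as I can tell, kafka.topic is a single fixed
# setting per channel, so I don't see how to vary it per event.
a1.channels.kafka-ch.type = org.apache.flume.channel.kafka.KafkaChannel
a1.channels.kafka-ch.kafka.bootstrap.servers = kafka1:9092
a1.channels.kafka-ch.kafka.topic = events
# The default (true) keeps event headers, so the HDFS sink can still
# read %{topic} downstream.
a1.channels.kafka-ch.parseAsFlumeEvent = true

# HDFS sink; header substitution in hdfs.path is supported, so the
# output folder can follow the 'topic' header.
a1.sinks.hdfs-out.type = hdfs
a1.sinks.hdfs-out.channel = kafka-ch
a1.sinks.hdfs-out.hdfs.path = hdfs://namenode:8020/events/%{topic}
a1.sinks.hdfs-out.hdfs.fileType = DataStream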

For example, if the HTTP source receives a JSON event with the attributes
below:

*   URL: https://xx.xx.xx.xx/event/abc

*   POST body: {"name": "xyz", "value": 123}

then the event should be written to the Kafka channel under topic 'abc' and
to an HDFS folder named 'abc'.
This 'abc' part is dynamic and changes from event to event.
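
For the dynamic part, my understanding is that the built-in JSONHandler does
not look at the request URI, so I imagine mapping the URI to an event header
needs a custom HTTPSourceHandler. A minimal sketch of what I mean (the class
and package names are mine, not from any Flume release):

package com.example.flume;  // made-up package

import java.io.BufferedReader;
import java.nio.charset.StandardCharsets;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import javax.servlet.http.HttpServletRequest;

import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.event.EventBuilder;
import org.apache.flume.source.http.HTTPBadRequestException;
import org.apache.flume.source.http.HTTPSourceHandler;

// Hypothetical handler: copies the last URI segment (e.g. 'abc' in
// /event/abc) into a 'topic' event header and keeps the raw POST body
// as the event body.
public class UriTopicHandler implements HTTPSourceHandler {

  @Override
  public List<Event> getEvents(HttpServletRequest request) throws Exception {
    // Read the raw JSON body as-is.
    StringBuilder body = new StringBuilder();
    try (BufferedReader reader = request.getReader()) {
      String line;
      while ((line = reader.readLine()) != null) {
        body.append(line);
      }
    }

    // Derive the topic from the request URI, e.g. /event/abc -> abc.
    String uri = request.getRequestURI();
    String topic = uri.substring(uri.lastIndexOf('/') + 1);
    if (topic.isEmpty()) {
      throw new HTTPBadRequestException("No topic in request URI: " + uri);
    }

    Map<String, String> headers = new HashMap<>();
    headers.put("topic", topic);
    return Collections.singletonList(EventBuilder.withBody(
        body.toString().getBytes(StandardCharsets.UTF_8), headers));
  }

  @Override
  public void configure(Context context) {
    // Nothing to configure in this sketch.
  }
}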

Is this possible with Flume?

Thanks in advance
Hemanth
