Numan,

you may also want to take a look at Kafka Streams, a new stream processing
library that has been included in Apache Kafka since version 0.10.
Kafka Streams is definitely more convenient and quicker to implement than
the "normal" Kafka producer/consumer clients.  Also, Kafka Streams does not
require a separate processing cluster like Spark or Storm -- you only need
the Kafka cluster to read data from / to write data to.
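For comparison, here's a minimal sketch of the "plain clients" approach
(the broker address, group id, and topic names are placeholders, and error
handling / offset management are omitted -- this is not production code):

```java
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class PlainPassThrough {
  public static void main(String[] args) {
    // Consumer reads String keys/values from the source topic.
    Properties consumerConfig = new Properties();
    consumerConfig.put("bootstrap.servers", "localhost:9092");
    consumerConfig.put("group.id", "pass-through");
    consumerConfig.put("key.deserializer",
        "org.apache.kafka.common.serialization.StringDeserializer");
    consumerConfig.put("value.deserializer",
        "org.apache.kafka.common.serialization.StringDeserializer");

    // Producer writes String keys/values to the destination topic.
    Properties producerConfig = new Properties();
    producerConfig.put("bootstrap.servers", "localhost:9092");
    producerConfig.put("key.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
    producerConfig.put("value.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");

    try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerConfig);
         KafkaProducer<String, String> producer = new KafkaProducer<>(producerConfig)) {
      consumer.subscribe(Collections.singletonList("kafkaSource"));
      while (true) {
        // Poll for new records and forward each one as-is.
        ConsumerRecords<String, String> records = consumer.poll(100);
        for (ConsumerRecord<String, String> record : records) {
          producer.send(new ProducerRecord<>("kafkaResult",
              record.key(), record.value()));
        }
      }
    }
  }
}
```

As you can see, even the trivial pass-through case requires you to wire up
two clients yourself, which is the boilerplate that Kafka Streams removes.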

> My question is: Is it possible to consume messages from a Kafka topic in
> real-time and write directly into another topic without using any streaming
> technology such as storm or spark? If yes, do you have any examples to do
> that in Java?

There's one specific example that does exactly this:
https://github.com/confluentinc/examples/blob/master/kafka-streams/src/test/java/io/confluent/examples/streams/PassThroughIntegrationTest.java

Here are the key snippets that you would be using:

    Properties streamsConfiguration = new Properties();
    streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG,
        "pass-through-integration-test");
    streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG,
        "localhost:9092");
    // ...more configs here...

    KStreamBuilder builder = new KStreamBuilder();
    // Write the input data as-is to the output topic.
    builder.stream("my-input-topic").to("my-output-topic");
    KafkaStreams streams = new KafkaStreams(builder, streamsConfiguration);
    streams.start();

Of course, you can also easily transform the messages before writing them
to the output topic.  See the other examples/references down below.
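For example, a simple value transformation with the same 0.10 API might look
like the sketch below (topic names and the bootstrap server are placeholders;
I'm assuming String keys and values):

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStreamBuilder;

public class UppercasePassThrough {
  public static void main(String[] args) {
    Properties config = new Properties();
    config.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-pass-through");
    config.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

    KStreamBuilder builder = new KStreamBuilder();
    // Read from the input topic, uppercase each value, write to the output topic.
    builder.stream(Serdes.String(), Serdes.String(), "my-input-topic")
           .mapValues(value -> value.toUpperCase())
           .to(Serdes.String(), Serdes.String(), "my-output-topic");

    KafkaStreams streams = new KafkaStreams(builder, config);
    streams.start();
  }
}
```

The only change from the pass-through version is the `mapValues()` step in
the middle of the chain; everything else stays the same.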

Hope this helps,
Michael


Kafka Streams docs:
http://kafka.apache.org/documentation.html#streams
http://docs.confluent.io/3.0.0/streams/index.html

Some Kafka Streams examples:
https://github.com/confluentinc/examples/tree/master/kafka-streams

Some Kafka Streams articles:
http://www.confluent.io/blog/introducing-kafka-streams-stream-processing-made-simple



On Fri, Jul 1, 2016 at 4:29 PM, numangoceri <numangoc...@yahoo.com.invalid>
wrote:

> Hi,
>
> Thanks for your answer. I meant actually if we can verify the data
> reliability without using Storm or Spark. Because i know that by using
> Storm, you can guarantee the messages (depending on the type of the
> Topology) such as exactly once, at least once.
> If i simply use kafka consumer and another producer to forward the
> messages, could the data transfer completely be guaranteed as well?
>
>
> Numan Göceri
>
> ---
>
> Rakesh Vidyadharan <rvidyadha...@gracenote.com> wrote:
>
> >Definitely.  You can read off kafka using the samples shown in
> KafkaConsumer javadoc, transform if necessary and publish to the
> destination topic.
> >
> >
> >
> >
> >On 01/07/2016 03:24, "numan goceri" <numangoc...@yahoo.com.INVALID>
> wrote:
> >
> >>Hello everyone,
> >>I've a quick question: I'm using the Apache Kafka producer to write the
> messages into a topic. My source at the moment is a csv file but in the
> future I am supposed to read the messages from another kafka topic. My
> question is: Is it possible to consume messages from a Kafka topic in
> real-time and write directly into another topic without using any streaming
> technology such as storm or spark? If yes, do you have any examples to do
> that in Java?
> >>To sum up, it should be looking like this: Kafka reads from topic
> "kafkaSource" and writes into the topic "kafkaResult".
> >>
> >>Thanks in advance and Best Regards, Numan
>



-- 
Best regards,
Michael Noll



*Michael G. Noll | Product Manager | Confluent | +1 650.453.5860*
Download Apache Kafka and Confluent Platform: www.confluent.io/download
<http://www.confluent.io/download>
