Hmm... okay. Maybe this could be a feature request for a future release. This
can be accomplished easily in Spark Structured Streaming... just saying :)

In Spark Structured Streaming we can set separate configurations for
'readStream' and 'writeStream', including different bootstrap servers. I am a
bit surprised this is not available in Kafka Streams.

Even during development we sometimes want to read from a topic in a QA
cluster and write to a topic on the local cluster.
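
For reference, here is a rough Spark Structured Streaming sketch (Scala) of
what I mean. The cluster addresses, topic names, and checkpoint path below are
placeholders, not anything from a real setup:

    // Sketch: read from one Kafka cluster and write to another by giving
    // readStream and writeStream their own "kafka.bootstrap.servers".
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("cross-cluster-copy")
      .getOrCreate()

    val source = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "qa-cluster:9092")   // source cluster
      .option("subscribe", "input-topic")
      .load()

    val query = source
      .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value")
      .writeStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")    // target cluster
      .option("topic", "output-topic")
      .option("checkpointLocation", "/tmp/cross-cluster-checkpoint")
      .start()

    query.awaitTermination()

The key point is that readStream and writeStream each take their own
'kafka.bootstrap.servers' option, so source and sink can live on different
clusters.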



On Tue, Dec 1, 2020 at 11:33 AM Matthias J. Sax <mj...@apache.org> wrote:

> KafkaStreams can only connect to a single cluster.
>
> If you really need to read from one cluster and write to another, you
> have 3 main options:
>
>  - use KafkaStreams on the source cluster and mirror the output topic
> from the source to the target cluster
>
>  - mirror the input topic from the source cluster to the target cluster
> and use KafkaStreams on the target cluster
>
>  - don't use KafkaStreams but plain consumer/producer
>
>
>
> -Matthias
>
> On 12/1/20 10:58 AM, Eric Beabes wrote:
> > I need to read from a topic in one bootstrap server & write it to another
> > topic in another bootstrap server. Since there's only one
> > StreamsConfig.BOOTSTRAP_SERVERS_CONFIG property, I am wondering how to
> > accomplish this?
> >
> > Do I need to create 2 different KafkaStreams objects? One for reading &
> the
> > other for writing?
> >
>
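
For the third option Matthias lists above (plain consumer and producer), a
rough bridge in Scala could look like the following. The bootstrap servers,
topic names, group id, and String serdes are placeholders, and only
at-least-once delivery is handled:

    // Sketch: consume from the source cluster, produce to the target cluster.
    import java.time.Duration
    import java.util.Properties
    import scala.jdk.CollectionConverters._
    import org.apache.kafka.clients.consumer.KafkaConsumer
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
    import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

    object CrossClusterBridge extends App {
      val consumerProps = new Properties()
      consumerProps.put("bootstrap.servers", "qa-cluster:9092")   // source cluster
      consumerProps.put("group.id", "cross-cluster-bridge")
      consumerProps.put("key.deserializer", classOf[StringDeserializer].getName)
      consumerProps.put("value.deserializer", classOf[StringDeserializer].getName)

      val producerProps = new Properties()
      producerProps.put("bootstrap.servers", "localhost:9092")    // target cluster
      producerProps.put("key.serializer", classOf[StringSerializer].getName)
      producerProps.put("value.serializer", classOf[StringSerializer].getName)

      val consumer = new KafkaConsumer[String, String](consumerProps)
      val producer = new KafkaProducer[String, String](producerProps)

      consumer.subscribe(List("input-topic").asJava)
      while (true) {
        val records = consumer.poll(Duration.ofMillis(500))
        records.asScala.foreach { r =>
          producer.send(new ProducerRecord("output-topic", r.key, r.value))
        }
        producer.flush()
        consumer.commitSync()   // commit only after the flush: at-least-once
      }
    }

Committing offsets only after producer.flush() keeps the bridge at-least-once;
exactly-once would need transactions, which this sketch does not attempt.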
