OK, thanks for confirming. I will do it this way.
Regards
Srini
On Tue, Jun 9, 2020 at 11:31 PM Gerard Maas wrote:
> Hi Srinivas,
>
> Reading from different brokers is possible but you need to connect to each
> Kafka cluster separately.
> Trying to mix connections to two different Kafka clusters…
Hi Srinivas,
Reading from different brokers is possible but you need to connect to each
Kafka cluster separately.
Trying to mix connections to two different Kafka clusters in one subscriber
is not supported. (I'm sure it would give all kinds of weird errors.)
The "kafka.bootstrap.servers" option…
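For what it's worth, one common pattern that follows this advice is to create one reader per cluster and union the resulting DataFrames into a single query. A minimal sketch (host names, ports, and topic names here are placeholders, not values from the thread):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("multi-cluster-consumer")
  .getOrCreate()

// One source per Kafka cluster; each connection gets its own bootstrap servers.
val stream1 = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "cluster1_host:9092")
  .option("subscribe", "topic1")
  .load()

val stream2 = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "cluster2_host:9092")
  .option("subscribe", "topic2")
  .load()

// All Kafka sources share the same schema (key, value, topic, partition,
// offset, timestamp, ...), so the union is well-defined.
val combined = stream1.union(stream2)
```

The union can then be fed to a single sink, giving one application and one query over both clusters.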
Thanks for the quick reply. This may work, but I have about 5 topics to
listen to right now. I am trying to keep all the topics in an array in a
properties file and read them all at once. This way it is dynamic: there is
one code block like the one below, and you can add or delete topics in the
config file.
Hello,
I've never tried that; doesn't this work?
val df_cluster1 = spark
  .read
  .format("kafka")
  .option("kafka.bootstrap.servers", "cluster1_host:cluster1_port")
  .option("subscribe", "topic1")
  .load()
val df_cluster2 = spark
  .read
  .format("kafka")
  .option("kafka.bootstrap.servers", "cluster2_host:cluster2_port")
  .option("subscribe", "topic2")
  .load()
Hello,
In Structured Streaming, is it possible to have one Spark application with
one query that consumes topics from multiple Kafka clusters?
I am trying to consume two topics, each from a different Kafka cluster, but
one of the topics is reported as an unknown topic and the job keeps running
without consuming.