· Kafka Connect for ingress ("E")
· Kafka Streams, Flink, or Spark Streaming for "T" – read from and
write back to Kafka – keep the sources of data for your processing engine
small. Separation of concerns: why should Spark care about where your
upstream sources are, for example?
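The read-from-Kafka, write-back-to-Kafka loop described above can be sketched with in-memory queues standing in for topics (a toy model, not the Kafka Streams API; the class, method, and record formats are made up for illustration):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

public class EtlSketch {
    // The "T" step: the processing engine reads only from one queue
    // (standing in for a Kafka topic) and writes only to another --
    // it never talks to the upstream sources directly.
    static List<String> transform(Queue<String> input) {
        List<String> output = new ArrayList<>();
        while (!input.isEmpty()) {
            output.add(input.poll().replace("raw", "clean"));
        }
        return output;
    }

    public static void main(String[] args) {
        Queue<String> inputTopic = new ArrayDeque<>(List.of("raw:42", "raw:7"));
        System.out.println(transform(inputTopic)); // [clean:42, clean:7]
    }
}
```

In the real pipeline, Kafka Connect fills the input topic and the output topic feeds downstream consumers, so the transform step stays oblivious to both ends.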
Hello all,
I can easily reproduce an annoying scenario using Kafka
brokers (kafka_2.10-0.8.2.0), but I believe it has nothing to do with the
brokers, only with the consumer API.
Here is the problem: a producer is continuously writing to a topic
(2 partitions, replicated) using the sync producer API.
Is it possible to use the topic filter whitelist within a Kafka Streams
application? Or can it only be done in a consumer job?
I have a scenario where, for a given topic, I'll have 500 consumers (1
consumer per instance of an app). I've set up the topic so it has 500
partitions, thus ensuring each consumer will eventually get work (the data
produced into Kafka uses the default partitioning strategy).
note: These consumer ap
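The "each consumer will eventually get work" expectation rests on the default partitioner spreading keyed messages across all 500 partitions. A rough sketch of the idea (Kafka's DefaultPartitioner actually applies murmur2 to the serialized key; `String.hashCode()` below is a stand-in, so real partition numbers will differ):

```java
public class PartitionSketch {
    // Hash the key and take it modulo the partition count, as Kafka's
    // default partitioner does for keyed records. Masking the sign bit
    // keeps the result non-negative. NOTE: String.hashCode() is a
    // stand-in for Kafka's murmur2 hash.
    static int partitionFor(String key, int numPartitions) {
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        // The same key always lands on the same partition.
        int p = partitionFor("order-123", 500);
        System.out.println("partition " + p); // some value in [0, 500)
    }
}
```

The consequence for the 500-consumer setup: distribution is only as even as the key distribution, so skewed keys can leave some consumers idle.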
AFAIK (not actually using it myself), for cross-DC replication people tend to
use MirrorMaker to transfer one cluster's data to another, usually to a kind
of central DC that unifies all "regional" DCs, but the layout depends on
your business reqs.
Then your consumers are assigned only with local brokers'
Take a look at the kafka-fast client (https://github.com/gerritjvv/kafka-fast); it
uses a different approach where you can have more than one consumer
per topic+partition (i.e. no relation between topic partitions and
consumers). It uses Redis, but only for offsets and work distribution, not
for the messag
Hi,
I am facing issues with the JDBC Sink Connector when working with Oracle DB.
This functionality was working fine when I was using MySQL.
The first error I had was when trying to create a table using auto.create = true.
It tried to create the table's STRING fields as NVARCHAR2(4000) (which I see
is by
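For context, a minimal sink configuration of the kind being described might look like the following (property names are from the Confluent JDBC connector; the topic name and connection details are placeholders):

```
name=oracle-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=orders
connection.url=jdbc:oracle:thin:@//dbhost:1521/ORCL
auto.create=true
```

With auto.create=true the connector derives column types from the record schema, which is where the STRING-to-NVARCHAR2(4000) mapping discussed here comes from.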
Hi Gary,
In the upcoming 0.10.1 release you can do regex subscription - will that
help?
Thanks,
Damian
On Fri, 30 Sep 2016 at 14:57 Gary Ogden wrote:
> Is it possible to use the topic filter whitelist within a Kafka Streams
> application? Or can it only be done in a consumer job?
>
So how exactly would that work? For example, I can currently do this:
KStream<String, String> textLines =
    builder.stream(stringSerde, stringSerde, SYSTEM_TOPIC);
Are you saying that I could put a regex in place of SYSTEM_TOPIC, and
that one KStream would be streaming from multiple topics that match th
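With the regex subscription Damian mentions, the pattern is evaluated against existing topic names, and one stream then carries records from every matching topic. Which names a pattern would select can be illustrated with java.util.regex alone (the topic names below are hypothetical; an actual subscription needs a running broker):

```java
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class TopicPatternSketch {
    // Which of the existing topic names a subscription pattern selects.
    static List<String> matching(List<String> topics, Pattern pattern) {
        return topics.stream()
                .filter(t -> pattern.matcher(t).matches())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // A pattern replaces the single SYSTEM_TOPIC name.
        List<String> topics = List.of("system-events", "system-audit", "app-logs");
        System.out.println(matching(topics, Pattern.compile("system-.*")));
        // [system-events, system-audit]
    }
}
```

Records from all matched topics then arrive interleaved in the one stream, so any per-topic handling has to inspect each record's topic.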
Hi,
Yeah I am aware of MirrorMaker. We tried to simplify our architecture so as
to avoid needing to use MirrorMaker and just rely on the rack replication
for cross datacenter replication. I think the only missing piece to this is
making consumers only read from a subset of the nodes in the cluster,
Unfortunately, that’s not the way Kafka works wrt Consumers. When a partition
is replicated, only one replica is the Leader — all reads and writes are done
via the Leader. The other replicas are Followers; their only job is to keep up
with the Leader. No read requests from Consumers go to Follow
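The leader-only read path described above can be modeled in a few lines (a toy model, not Kafka code; note that much later releases, 2.4+, added opt-in follower fetching, but nothing like that existed at the time of this thread):

```java
import java.util.List;

public class ReplicaSketch {
    // Toy model of a replicated partition: replicas.get(0) is the Leader.
    // Consumer fetches are always served by the Leader; Followers exist
    // only to stay in sync and to take over if the Leader fails.
    static String serveFetch(List<String> replicas) {
        return replicas.get(0); // never a Follower
    }

    public static void main(String[] args) {
        List<String> replicas = List.of("broker-0 (leader)", "broker-1", "broker-2");
        System.out.println("fetch served by: " + serveFetch(replicas));
        // fetch served by: broker-0 (leader)
    }
}
```

This is why rack-aware replica placement alone cannot keep consumer reads local: the leader for a given partition may live in either DC.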
Hi Srikrishna,
In the future, please address questions related to Confluent's connectors to
the relevant ML (https://groups.google.com/forum/#!forum/confluent-platform
).
The NVARCHAR2(4000) mapping for string types for Oracle was based on my
reading of the documentation which states it can hold up t
I am using the Kafka 0.10 API.
// Sample code
List<String> topicsList = new ArrayList<>();
topicsList.add("topic1");
topicsList.add("topic2");
KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
consumer.subscribe(topicsList);
Problem:
For each topic, I w
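With the 0.10 consumer there is a single poll loop covering all subscribed topics, and records arrive interleaved; each record carries its topic name, so per-topic handling is a dispatch on `record.topic()` (which appears to be what the linked SO answer recommends). A stdlib-only sketch of that dispatch, with (topic, value) pairs standing in for ConsumerRecords since no broker is available here:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class TopicDispatchSketch {
    // Group one poll()'s worth of records by the topic each came from.
    static Map<String, List<String>> groupByTopic(List<Map.Entry<String, String>> polled) {
        Map<String, List<String>> byTopic = new LinkedHashMap<>();
        for (Map.Entry<String, String> record : polled) {
            byTopic.computeIfAbsent(record.getKey(), t -> new ArrayList<>())
                   .add(record.getValue());
        }
        return byTopic;
    }

    public static void main(String[] args) {
        // A multi-topic poll returns records from all topics interleaved.
        List<Map.Entry<String, String>> polled = List.of(
                Map.entry("topic1", "a"),
                Map.entry("topic2", "b"),
                Map.entry("topic1", "c"));
        System.out.println(groupByTopic(polled));
        // {topic1=[a, c], topic2=[b]}
    }
}
```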
Thanks for the response, Shikar. The issue was happening because the table
metadata was returning the column names of the table in upper case, while the
connector was expecting the column names to be in lower case. I fixed it by
creating the table with quoted column names like "updated_by". This way, I am no
lon
Please see my SO answer:
https://stackoverflow.com/questions/39799293/kafka-new-api-0-10-doesnt-provide-a-list-of-stream-and-consumer-objects-per-top/39803689#39803689
-Matthias
On 09/30/2016 12:13 PM, Mudassir Maredia wrote:
> I am using kafka api 0.10.
>
> //Sample code
> List