Re: Using Kafka Producer inside Oracle DB

2017-07-06 Thread Stephen Durfey
Just to add to this, depending upon your use case it may be beneficial to use kafka connect for pulling data out of oracle to publish to kafka. With the JDBC connector you would just need a few configs to stand up kafka connect and start publishing data to kafka, either via a select statement or a
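
For reference, a minimal sketch of a JDBC source connector config in that spirit (connection details, table, and topic prefix below are placeholders, not from the thread):

    name=oracle-jdbc-source
    connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
    # Oracle connection details; adjust to your environment
    connection.url=jdbc:oracle:thin:@//dbhost:1521/ORCL
    connection.user=kafka
    connection.password=secret
    # Poll new rows by an incrementing primary key column
    mode=incrementing
    incrementing.column.name=ID
    table.whitelist=MY_TABLE
    topic.prefix=oracle-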

Re: struggling with runtime Schema in connect

2017-07-09 Thread Stephen Durfey
I'll try to answer this for you. I'm going to assume you are using the pre-packaged kafka connect distro from confluent. org.apache.kafka.connect.data.Schema is an abstraction of the type definition for the data being passed around. How that is defined generally falls onto the connector being used
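
As a hedged illustration of that abstraction (the field names here are invented for the example), a connector typically builds a Schema at runtime and pairs it with a Struct:

    import org.apache.kafka.connect.data.Schema;
    import org.apache.kafka.connect.data.SchemaBuilder;
    import org.apache.kafka.connect.data.Struct;

    // Define a record type at runtime; the connector decides the fields.
    Schema valueSchema = SchemaBuilder.struct().name("com.example.User")
            .field("id", Schema.INT64_SCHEMA)
            .field("name", Schema.OPTIONAL_STRING_SCHEMA)
            .build();

    // A Struct carries the data alongside the schema that describes it.
    Struct value = new Struct(valueSchema)
            .put("id", 42L)
            .put("name", "alice");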

Re: struggling with runtime Schema in connect

2017-07-10 Thread Stephen Durfey
h the data > from source to sink? so set key.converter.schemas.enable=true and > value.converter.schemas.enable=true? > > is it a correct assumption that kafka-connect wouldn't work if i chose the > "raw" json serialization that discards the schema? > > On Sun, Jul
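
For context, those two flags belong with the JsonConverter in the worker config; with schemas enabled each message is wrapped in a {"schema": ..., "payload": ...} envelope so the schema travels with the data, which is why "raw" schema-less JSON loses the type information a sink may need:

    key.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter=org.apache.kafka.connect.json.JsonConverter
    key.converter.schemas.enable=true
    value.converter.schemas.enable=true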

Kafka Connect distributed mode rebalance

2017-07-20 Thread Stephen Durfey
I'm seeing some behavior with the DistributedHerder that I am trying to understand. I'm working on setting up a cluster of kafka connect nodes and have a relatively large number of connectors to submit to it (392 connectors right now that will soon become over 1100). As for the deployment of it I a
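
For context, connectors are submitted to the distributed cluster one at a time through the REST API, and in this era of Kafka Connect each create or update triggers a full rebalance of all connectors and tasks, which is what makes submitting hundreds of them painful. A typical submission looks like this (host and connector details are placeholders):

    curl -X POST -H "Content-Type: application/json" \
         http://connect-host:8083/connectors \
         -d '{"name": "my-connector", "config": {"connector.class": "...", "tasks.max": "1"}}'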

Re: Kafka Connect distributed mode rebalance

2017-07-26 Thread Stephen Durfey
l get these updates in yet. One other thing to > > consider is if it is possible to use fewer connectors at a time. One of > our > > goals was to encourage broad copying by default; fewer connectors/tasks > > doesn't necessarily solve your problem, but depending on the conn

Re: Kafka Connect with HdfsSinkConnector on DC/OS ..

2017-08-03 Thread Stephen Durfey
That sounds like either a protobuf dependency compatibility issue between what is on the classpath of kafka connect and the hadoop cluster you are trying to write to (e.g. you're on a newer version of protobuf than your cluster, or vice versa), or a wire incompatibility of the communication protocol

Re: Kafka Connect with HdfsSinkConnector on DC/OS ..

2017-08-03 Thread Stephen Durfey
a look to see if they include different versions of protobuf. > > regards. > > On Thu, Aug 3, 2017 at 7:24 PM, Stephen Durfey wrote: > > > That sounds like either a protobuf dependency compatibility issue between > > what is on the classpath of kafka connect and the hado

Re: Different Schemas on same Kafka Topic

2017-08-17 Thread Stephen Durfey
There is a little nuance to this topic (hehe). When it comes down to it, yes, each unique avro schema has a unique id associated with it. That id can be used across multiple different topics. The enforcement of which schemas are allowed in a particular topic comes down to the combination of the sub
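
A quick sketch of how ids and subjects relate (host and id are placeholders): with the default topic-name subject strategy, a topic's value schemas are registered under the subject "<topic>-value", while the numeric schema id is global and can be referenced from many subjects.

    # Fetch a schema by its global id (the id here is illustrative)
    curl http://schema-registry:8081/schemas/ids/21
    # List the per-topic subjects that compatibility is enforced against
    curl http://schema-registry:8081/subjects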

Re: Different Schemas on same Kafka Topic

2017-08-17 Thread Stephen Durfey
so that even if schema > registry holds same id for a topic, each schema version under it is > an entirely different schema. Is my understanding correct? > > But when one defines NONE, the purpose of the schema registry itself is > lost. Right? > > Regards > Sreejith > > On 1

Re: Different Schemas on same Kafka Topic

2017-08-18 Thread Stephen Durfey
> schema id of your choice and ask avro to serialize/deserialize. But in connect > > framework all these things are abstracted. > > > > It's a good pointer on using NONE compatibility type so that even if > schema > > registry holds same id for a topic, each schema vers
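
For completeness, a sketch of relaxing compatibility for a single subject via the schema registry REST API (subject name and host are placeholders); with NONE, entirely different schemas can be registered under one subject, while other subjects keep their own settings:

    curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
         -d '{"compatibility": "NONE"}' \
         http://schema-registry:8081/config/my-topic-value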

Re: Avro With Kafka

2017-08-18 Thread Stephen Durfey
Yes, the confluent SerDes support nested avro records. Underneath the covers they are using avro classes (DatumReader and DatumWriter) to carry out those operations. So, as long as you're sending valid avro data to be produced or consumed, the confluent SerDes will handle it just fine.
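
A hedged sketch of producing a nested avro record with the confluent serializer (the schema, topic, and endpoints are invented for the example):

    import java.util.Properties;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    Schema schema = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"Order\",\"fields\":[" +
        "{\"name\":\"id\",\"type\":\"long\"}," +
        "{\"name\":\"customer\",\"type\":{\"type\":\"record\",\"name\":\"Customer\"," +
        "\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}}]}");

    // Build the nested record; the serializer handles the nesting transparently.
    GenericRecord customer = new GenericData.Record(schema.getField("customer").schema());
    customer.put("name", "alice");
    GenericRecord order = new GenericData.Record(schema);
    order.put("id", 1L);
    order.put("customer", customer);

    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");          // placeholder
    props.put("schema.registry.url", "http://localhost:8081"); // placeholder
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
    try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
        producer.send(new ProducerRecord<>("orders", "1", order));
    }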

Re: Pinning clients to specific brokers

2017-08-23 Thread Stephen Durfey
Mohit, Can you describe your use case around why you want this to happen? Thanks From: Joao Reis Sent: Wednesday, August 23, 2017 11:08:02 AM To: users@kafka.apache.org Subject: Re: Pinning clients to specific brokers Hey Mohit, I agree with Hans, and addition

Re: how to use Confluent connector with Apache Kafka

2017-09-29 Thread Stephen Durfey
The confluent platform download comes pre-configured to work out of the box after unzipping, including any dependencies it needs to run. The shell script that starts up the kafka connect worker ensures that everything that needs to be on the classpath is on the classpath

Re: how to use Confluent connector with Apache Kafka

2017-09-29 Thread Stephen Durfey
You can choose to run just kafka connect in the confluent platform (just run the kafka connect shell script(s)) and configure the connectors to point towards your kafka installation. The confluent platform uses vanilla kafka under the covers, but there isn't anything requiring you to run kafka foun
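
A minimal sketch of a distributed worker config pointing at an existing kafka cluster (host and topic names are placeholders):

    bootstrap.servers=your-kafka-broker:9092
    group.id=connect-cluster
    key.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter=org.apache.kafka.connect.json.JsonConverter
    # Internal topics the workers use to store connector state
    config.storage.topic=connect-configs
    offset.storage.topic=connect-offsets
    status.storage.topic=connect-status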

Re: Can I commit from a thread?

2017-12-29 Thread Stephen Durfey
Hey Skip, the Kafka consumer is NOT thread safe. If you want to share one across threads you will need to properly lock. Alternatively, you can create a consumer per thread. Check out the class javadoc under the "multithreaded processing" section for more details and suggestions.
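
A minimal sketch of the consumer-per-thread option (topic, group, and servers are placeholders):

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    // Each thread owns its KafkaConsumer exclusively, so no locking is needed.
    Runnable worker = () -> {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "my-group");
        props.put("enable.auto.commit", "false");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic"));
            while (!Thread.currentThread().isInterrupted()) {
                ConsumerRecords<String, String> records = consumer.poll(100);
                records.forEach(r -> System.out.println(r.value()));
                consumer.commitSync(); // safe: only this thread touches the consumer
            }
        }
    };
    new Thread(worker, "consumer-1").start();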

Overriding JAAS config for connector

2017-02-23 Thread Stephen Durfey
Now that 0.10.2.0 is out, I was looking forward to checking out the inclusion of KIP-85. I have a potential need for multi-tenancy in a single kafka connect instance, and wanted to be able to
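
For context, KIP-85 added the sasl.jaas.config client property, so the JAAS login context can be supplied per client instead of through a global java.security.auth.login.config file. A sketch of the property (mechanism and credentials are placeholders):

    sasl.mechanism=PLAIN
    security.protocol=SASL_PLAINTEXT
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
        username="tenant-a" password="tenant-a-secret";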

Re: Overriding JAAS config for connector

2017-02-24 Thread Stephen Durfey
-0600, "Ismael Juma" wrote: Hi Stephen, Did you get an error when you set this as a String? It should work fine. Ismael On Thu, Feb 23, 2017 at 8:09 PM, Stephen Durfey wrote: > Now that 0.10.2.0 is out, I was looking forward to checking out the > inclusion of KIP-85 >

Re: Overriding JAAS config for connector

2017-02-27 Thread Stephen Durfey
way it works for the consumer/producer is that the string config gets > converted into a Password instance during the parsing stage. Seems like > this is not happening in the connect case for some reason. > > Ismael > > On 24 Feb 2017 5:56 pm, "Stephen Durfey" wrote: >

Re: Kafka connection to start from latest offset

2017-03-14 Thread Stephen Durfey
Producer and consumer configs used by the connect worker can be overridden by prefixing the specific kafka config with either 'producer.' or 'consumer.'. So, you should be able to set 'consumer.auto.offset.reset=latest' in your worker config to do that. http://docs.confluent.io/3.0.0/connect/use
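
For example, in the worker config (the producer-side line is just illustrative):

    # Applies to the consumer the worker uses for sink connectors
    consumer.auto.offset.reset=latest
    # Applies to the producer the worker uses for source connectors
    producer.compression.type=snappy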

Re: Kafka connection to start from latest offset

2017-03-15 Thread Stephen Durfey
8 On Wed, Mar 15, 2017 at 8:42 AM, Aaron Niskode-Dossett < aniskodedoss...@etsy.com.invalid> wrote: > Thank you Stephen! That's a very coarse setting, as you note, since it's > at the worker level, but I'll take it. > > -Aaron > > On Tue, Mar 14, 2017 at 8:07