Re: kafka stream to new topic based on message key

2016-10-07 Thread Gary Ogden
> > beyond that for now. > > > > Note that in KIP-4 we are trying to introduce the admin client for such > > tasks as create / delete topics; it has added such requests in the > > upcoming 0.10.1.0 release, but the full implementation is yet to be > > completed.
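
For readers landing on this thread later: the KIP-4 work mentioned above eventually shipped as AdminClient in 0.11. A minimal sketch of creating a topic with explicit partitions, replication factor, and segment size through it (broker address, topic name, and the config values are placeholders; error handling for get() elided):

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;

    Properties props = new Properties();
    props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");
    try (AdminClient admin = AdminClient.create(props)) {
        // 6 partitions, replication factor 2, plus a topic-level segment size
        NewTopic topic = new NewTopic("events-out", 6, (short) 2)
                .configs(Collections.singletonMap("segment.bytes", "104857600"));
        admin.createTopics(Collections.singleton(topic)).all().get();
    }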

Re: kafka stream to new topic based on message key

2016-10-06 Thread Gary Ogden
> branch the source stream into multiple ones based on the content, with each > branched stream going to a different topic. > > > Guozhang > > > On Wed, Oct 5, 2016 at 7:48 AM, Gary Ogden wrote: > > > Guozhang. I was just looking at the source for this, and it looks like
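
For the archives, a minimal sketch of the branching Guozhang describes, against the 0.10-era DSL (topic names and the type-1 predicate are invented for illustration):

    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KStreamBuilder;

    KStreamBuilder builder = new KStreamBuilder();
    KStream<String, String> source = builder.stream("events");
    // branch() routes each record to the first predicate that matches
    KStream<String, String>[] branches = source.branch(
            (key, value) -> value.contains("\"type\":1"),   // type-1 events
            (key, value) -> true);                          // everything else
    branches[0].to("type-1-events");
    branches[1].to("other-events");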

Re: kafka stream to new topic based on message key

2016-10-05 Thread Gary Ogden
access to the ProcessorContext interface, which doesn't expose the Supplier. On 5 October 2016 at 09:42, Gary Ogden wrote: > What if we were to use Kafka Connect instead of Streams? Does it have the > ability to specify partitions, RF, segment size etc.?

Re: kafka stream to new topic based on message key

2016-10-05 Thread Gary Ogden
What if we were to use Kafka Connect instead of Streams? Does it have the ability to specify partitions, RF, segment size etc.? On 5 October 2016 at 09:42, Gary Ogden wrote: > Thanks Guozhang. > > So there's no way we could also use InternalTopicManager to specify the > number of partitions and RF?

Re: kafka stream to new topic based on message key

2016-10-05 Thread Gary Ogden
you then I think it should be fine. > > > Guozhang > > > > On Tue, Oct 4, 2016 at 12:51 PM, Gary Ogden wrote: > > > Is it possible, in a Kafka streaming job, to write to another topic based > > on the key in the messages? > > > > For example, say

Re: kafka streams with dynamic content and filtering

2016-10-05 Thread Gary Ogden
Sorry. I responded to the wrong message. On 5 October 2016 at 09:40, Gary Ogden wrote: > Thanks Guozhang. > > So there's no way we could also use InternalTopicManager to specify the > number of partitions and RF? > > https://github.com/apache/kafka/blob/0.10.1/strea

Re: kafka streams with dynamic content and filtering

2016-10-05 Thread Gary Ogden
it works, please read the corresponding sections on the web > docs: > > http://docs.confluent.io/3.0.1/streams/developer-guide.html#processor-api > > > Guozhang > > On Mon, Oct 3, 2016 at 6:51 AM, Gary Ogden wrote: > > > I have a use case, and I'm wondering if it's
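
Since the pointer above is to the Processor API guide, here is a bare-bones sketch of the interface it documents, with 0.10-era signatures (the class name is invented):

    import org.apache.kafka.streams.processor.Processor;
    import org.apache.kafka.streams.processor.ProcessorContext;

    public class PassThroughProcessor implements Processor<String, String> {
        private ProcessorContext context;

        @Override
        public void init(ProcessorContext context) { this.context = context; }

        @Override
        public void process(String key, String value) {
            context.forward(key, value); // hand the record to downstream node(s)
        }

        @Override
        public void punctuate(long timestamp) {} // scheduled via context.schedule()

        @Override
        public void close() {}
    }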

kafka stream to new topic based on message key

2016-10-04 Thread Gary Ogden
Is it possible, in a Kafka streaming job, to write to another topic based on the key in the messages? For example, say the message is: 123456#{"id":56789,"type":1} where the key is 123456, # is the delimiter, and the {} is the JSON data. And I want to push the JSON data to another topic that will
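
At the time of this thread the DSL could only write to topics fixed at topology-build time (hence the branch() suggestion in the replies above); Streams 2.0+ later added a TopicNameExtractor overload of to() that makes the dynamic version a one-liner. A sketch assuming the value arrives as the whole "key#json" string and that the per-key target topics already exist:

    import org.apache.kafka.streams.KeyValue;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.KStream;

    StreamsBuilder builder = new StreamsBuilder();
    KStream<String, String> raw = builder.stream("raw-events");
    raw.map((k, v) -> {
            String[] parts = v.split("#", 2);          // "123456" and the JSON payload
            return KeyValue.pair(parts[0], parts[1]);
        })
       .to((key, value, recordContext) -> "events-" + key); // topic derived from key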

Re: Kafka streaming and topic filter whitelist

2016-10-03 Thread Gary Ogden
What if topics are created or deleted after the application has started? Will they be added/removed automatically, or do we need to restart the application to pick up the changes? On 1 October 2016 at 04:42, Damian Guy wrote: > That is correct. > > On Fri, 30 Sep 2016 at 18:00 Gary Ogden
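
On the new-topics question: with the 0.10.x consumer, pattern subscription re-evaluates the regex on every metadata refresh (metadata.max.age.ms), so matching topics created later are picked up without a restart. A minimal sketch (the pattern is a placeholder, and props is assumed to hold the usual bootstrap.servers / group.id / deserializer config):

    import java.util.Collection;
    import java.util.Properties;
    import java.util.regex.Pattern;
    import org.apache.kafka.clients.consumer.ConsumerRebalanceListener;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.TopicPartition;

    Properties props = new Properties(); // plus the usual consumer config
    KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
    consumer.subscribe(Pattern.compile("customer-.*"), new ConsumerRebalanceListener() {
        @Override public void onPartitionsRevoked(Collection<TopicPartition> p) {}
        @Override public void onPartitionsAssigned(Collection<TopicPartition> p) {}
    });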

kafka streams with dynamic content and filtering

2016-10-03 Thread Gary Ogden
I have a use case, and I'm wondering if it's possible to do this with Kafka. Let's say we will have customers that will be uploading JSON to our system, but that JSON layout will be different for each customer. They are able to define the schema of the JSON being uploaded. They will then be able to
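
If it helps frame the question: the per-customer filtering part is expressible as an ordinary Streams filter once the JSON is parsed. A sketch using Jackson, with the topic names, field name, and threshold standing in for whatever a customer would configure:

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import org.apache.kafka.streams.StreamsBuilder;

    ObjectMapper mapper = new ObjectMapper();
    StreamsBuilder builder = new StreamsBuilder();
    builder.<String, String>stream("customer-uploads")
           .filter((key, value) -> {
               try {
                   JsonNode node = mapper.readTree(value);
                   // "temperature" > 100 stands in for a customer-defined rule
                   return node.path("temperature").asDouble() > 100.0;
               } catch (Exception e) {
                   return false; // drop records that don't parse
               }
           })
           .to("customer-alerts");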

Re: Kafka streaming and topic filter whitelist

2016-09-30 Thread Gary Ogden
that regex? If so, that could be useful. Gary On 30 September 2016 at 13:35, Damian Guy wrote: > Hi Gary, > > In the upcoming 0.10.1 release you can do regex subscription - will that > help? > > Thanks, > Damian > > On Fri, 30 Sep 2016 at 14:57 Gary Ogden wrote: >

Kafka streaming and topic filter whitelist

2016-09-30 Thread Gary Ogden
Is it possible to use the topic filter whitelist within a Kafka Streaming application? Or can it only be done in a consumer job?
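
As the replies above note, 0.10.1 added regex subscription to Streams itself, which covers the whitelist use case. A sketch of what that looks like (the pattern and output topic are invented):

    import java.util.regex.Pattern;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KStreamBuilder;

    KStreamBuilder builder = new KStreamBuilder();
    // subscribes to every existing and future topic matching the pattern
    KStream<String, String> events = builder.stream(Pattern.compile("events-.*"));
    events.to("merged-events");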

Re: non-blocking sends when cluster is down

2015-02-26 Thread Gary Ogden
metadata.max.age.ms=30 > > > On Thu, Feb 26, 2015 at 4:47 AM, Gary Ogden wrote: > > I was actually referring to the metadata fetch. Sorry, I should have been > > more descriptive. I know we can decrease the metadata.fetch.timeout.ms > > setting to be a lot lower, but it's

Re: non-blocking sends when cluster is down

2015-02-26 Thread Gary Ogden
the send() call will throw a BufferExhaustedException > which, in your case, can be caught and ignored, allowing the message to drop > on the floor. > > Guozhang > > > > On Wed, Feb 25, 2015 at 5:08 AM, Gary Ogden wrote: > > > Say the entire Kafka cluster is down and there

non-blocking sends when cluster is down

2015-02-25 Thread Gary Ogden
Say the entire Kafka cluster is down and there are no brokers to connect to. Is it possible to use the Java producer's send method and not block until there's a timeout? Is it as simple as registering a callback method? We need the ability for our application to not have any kind of delay when sending
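
A sketch of the fail-fast setup discussed in this thread, using the 0.8.2-era config names the replies mention (later clients consolidated these into max.block.ms); the broker address, topic, and payload are placeholders:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.BufferExhaustedException;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    Properties props = new Properties();
    props.put("bootstrap.servers", "broker1:9092");
    props.put("block.on.buffer.full", "false");     // throw instead of blocking
    props.put("metadata.fetch.timeout.ms", "100");  // bound the metadata wait

    KafkaProducer<String, String> producer =
            new KafkaProducer<>(props, new StringSerializer(), new StringSerializer());

    try {
        producer.send(new ProducerRecord<>("events", "payload"), (metadata, exception) -> {
            if (exception != null) {
                // async failure (brokers still down): log and drop, per the thread
            }
        });
    } catch (BufferExhaustedException e) {
        // thrown synchronously once the buffer fills with block.on.buffer.full=false
    }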

Re: understanding partition key

2015-02-12 Thread Gary Ogden
We've always found > there's something that we're doing that is affecting the overall throughput > of the systems, be it needing to play with the number of partitions, > adjusting batch size, or resizing the Hadoop cluster to meet increased need > there. > > On Thu, Feb 12,

Re: understanding partition key

2015-02-12 Thread Gary Ogden
; the same information throughout the day, it lets us maintain a system where > we have near-real-time access to most of the data we're ingesting. > > This certainly is something we've had to tweak in terms of the numbers of > consumers / partitions and batch sizes to get to

Re: understanding partition key

2015-02-12 Thread Gary Ogden
Using HBase, you can run Pig jobs that would read only > the records created between specific timestamps. > > David > > On Thu, Feb 12, 2015 at 7:44 AM, Gary Ogden wrote: > > > So it's not possible to have 1 topic with 1 partition and many consumers > of > > that

Re: understanding partition key

2015-02-12 Thread Gary Ogden
If you have multiple partitions > (say 3 for example), then you can fire up 3 consumer instances under the > same consumer group, and each will only consume 1 partition's data. If > order in each partition matters, then you need to do some work on the > producer side. Hope this helps. Edwi
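
A sketch of the setup described above, using the 0.9+ Java consumer (the 0.8-era high-level consumer, current when this was written, does the same via consumer streams); broker, group, and topic names are placeholders. Run three copies of this and each gets one of the three partitions:

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    Properties props = new Properties();
    props.put("bootstrap.servers", "broker1:9092");
    props.put("group.id", "my-group"); // same value in all 3 instances
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

    KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
    consumer.subscribe(Collections.singletonList("events")); // 3 partitions -> 1 each
    while (true) {
        ConsumerRecords<String, String> records = consumer.poll(100);
        for (ConsumerRecord<String, String> record : records) {
            // each instance sees only its partitions' records, in partition order
        }
    }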

understanding partition key

2015-02-11 Thread Gary Ogden
I'm trying to understand how the partition key works and whether I need to specify a partition key for my topics or not. What happens if I don't specify a PK and I have more than one consumer that wants all messages in a topic for a certain period of time? Will those consumers get all the messages
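
For anyone else puzzling over this: the record key only controls which partition a record lands in; consumers in separate consumer groups each get all messages regardless. A sketch of keyed vs. unkeyed sends, reusing a KafkaProducer<String, String> like the one sketched earlier in this digest (topic and payloads invented):

    import org.apache.kafka.clients.producer.ProducerRecord;

    // with a key: the default partitioner hashes the key, so one key
    // always lands in one partition, preserving per-key order
    producer.send(new ProducerRecord<>("events", "customer-123", "{\"id\":1}"));

    // without a key: the producer spreads records across partitions itself,
    // so there is no per-key ordering guarantee
    producer.send(new ProducerRecord<>("events", "{\"id\":2}"));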