Log4j ConversionPattern not working with kafka.producer.KafkaLog4jAppender in Kafka 0.8.1.1

2014-09-22 Thread Lakshmanan Muthuraman
Hello, I am trying to use KafkaLog4jAppender to write to Kafka and also a file appender to write to a file. The conversion pattern works for the file appender, but not for the Kafka appender. The output of the file appender is: 2014-09-19T22:30:14.781Z INFO com.test.poc.StartProgram Message1
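For reference, a minimal log4j.properties sketch of the setup being described (the broker address, topic name, and file path are illustrative assumptions, not taken from the thread):

```properties
# Two appenders sharing the same conversion pattern: one file, one Kafka.
log4j.rootLogger=INFO, file, kafka

# Kafka appender shipped with Kafka 0.8.1.1
log4j.appender.kafka=kafka.producer.KafkaLog4jAppender
log4j.appender.kafka.BrokerList=localhost:9092
log4j.appender.kafka.Topic=app-logs
log4j.appender.kafka.layout=org.apache.log4j.PatternLayout
log4j.appender.kafka.layout.ConversionPattern=%d{ISO8601} %p %c %m%n

# Plain file appender for comparison
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.File=/var/log/app/app.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{ISO8601} %p %c %m%n
```

If the pattern applies to the file output but not to the Kafka output even with an identical layout, one possibility is that the appender implementation in that Kafka version formats the message itself rather than going through the configured layout.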

Re: Routing modifications at runtime

2015-01-28 Thread Lakshmanan Muthuraman
Hi Toni, a couple of thoughts. 1. Kafka behaviour need not be changed at run time. Your producers, which push your MAC data into Kafka, should know which topic to write to. Your producer can be Flume, Logstash, or your own custom-written Java producer. As long as your producer know…
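The point that the producer, not the broker, owns the topic choice can be sketched with the stock CLI tools (the topic name, log path, and broker/ZooKeeper addresses below are hypothetical):

```shell
# Create a dedicated topic for the MAC data (names and addresses are assumptions).
bin/kafka-topics.sh --create --zookeeper localhost:2181 \
  --topic mac-data --partitions 4 --replication-factor 1

# The producer decides the destination topic at send time:
tail -F /var/log/mac/mac.log | \
  bin/kafka-console-producer.sh --broker-list localhost:9092 --topic mac-data
```

Rerouting at run time then amounts to the producer writing to a different --topic; nothing on the broker side needs to change.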

Re: Resilient Producer

2015-01-28 Thread Lakshmanan Muthuraman
We have been using Flume to solve a very similar use case. Our servers write log files to the local file system, and we then have a Flume agent which ships the data to Kafka. You can use the exec source running tail. Though the exec source runs well with tail, there are issues if the agent goe…
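A minimal sketch of the agent described above, tailing a local file into Kafka (agent, channel, topic names, and paths are illustrative assumptions; the Kafka sink property names vary between Flume versions, so check the user guide for the release in use):

```properties
agent.sources = tail-src
agent.channels = mem-ch
agent.sinks = kafka-sink

# Exec source running tail against the local log file
agent.sources.tail-src.type = exec
agent.sources.tail-src.command = tail -F /var/log/app/app.log
agent.sources.tail-src.channels = mem-ch

agent.channels.mem-ch.type = memory
agent.channels.mem-ch.capacity = 10000

# Ship events to a Kafka topic
agent.sinks.kafka-sink.type = org.apache.flume.sink.kafka.KafkaSink
agent.sinks.kafka-sink.topic = app-logs
agent.sinks.kafka-sink.brokerList = localhost:9092
agent.sinks.kafka-sink.channel = mem-ch
```

The caveat in the message applies here: an exec source with tail does not checkpoint its position, so events can be lost or duplicated if the agent goes down.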

Re: Resilient Producer

2015-01-29 Thread Lakshmanan Muthuraman
…" and "spoolDir". Take a look here: https://issues.apache.org/jira/browse/FLUME-2498
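The spooling-directory source referenced here is the usual alternative to exec/tail when delivery guarantees matter, since it tracks completed files rather than a live stream. A hypothetical fragment (directory path and names are assumptions):

```properties
agent.sources = spool-src

# Spooling directory source: reads files dropped into spoolDir and
# marks them done, so an agent restart does not lose its place.
agent.sources.spool-src.type = spooldir
agent.sources.spool-src.spoolDir = /var/log/app/spool
agent.sources.spool-src.channels = mem-ch
```

The trade-off is that files must be closed and rotated into the spool directory before Flume will pick them up, so it is not suitable for tailing a file that is still being written.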

Re: Trying to get kafka data to Hadoop

2015-03-04 Thread Lakshmanan Muthuraman
I think the libjars option is not required. The Maven package command for the Camus project builds an uber jar (fat jar) which contains all the dependencies. I generally run Camus the following way: hadoop jar camus-example-0.1.0-SNAPSHOT-shaded.jar com.linkedin.camus.etl.kafka.CamusJob -P camus.prope…
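The invocation above is cut off in the archive; a plausible full form, assuming the -P flag points at a Camus properties file (the file name here is an assumption, as is the jar name, which comes from the poster's own build):

```shell
# Run the Camus ETL job from the shaded (uber) jar; no -libjars needed
# because the shaded jar already bundles the dependencies.
hadoop jar camus-example-0.1.0-SNAPSHOT-shaded.jar \
  com.linkedin.camus.etl.kafka.CamusJob \
  -P camus.properties
```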