I assume this is Kafka 0.8, right? Are there any corresponding errors in the broker logs? With the configuration below, I don't think any errors will be reported back to the producer.
You could also try setting request.required.acks=1 to see if errors are reported back to the client.

On 8/29/13 4:40 AM, "Lu Xuechao" <lux...@gmail.com> wrote:

>Hi,
>
>I am trying to enable gzip compression for my events. But after I switched
>compression.codec to "1", I found the produced events were not even
>persisted to the disk log file. Of course, the consumer could not receive
>any compressed events. I sent 10,000 or more events, but the broker's log
>file did not change. It seems no events were actually sent to the broker?
>Below is my producer's code:
>
>        Properties props = new Properties();
>        props.put("serializer.class", "kafka.serializer.StringEncoder");
>        props.put("metadata.broker.list", "127.0.0.1:9092");
>        props.put("partitioner.class", "kafka.producer.DefaultPartitioner");
>        props.put("queue.enqueue.timeout.ms", "-1");
>        props.put("request.required.acks", "0");
>        props.put("producer.type", "async");
>        props.put("batch.num.messages", "100");
>        props.put("compression.codec", "1");
>
>        ProducerConfig config = new ProducerConfig(props);
>        producer = new Producer<String, String>(config);
>
>        KeyedMessage<String, String> data =
>            new KeyedMessage<String, String>("topic1", messageStr, msg);
>        producer.send(data);
>
>If I comment out this line of code: props.put("compression.codec", "1");
>then everything works fine. Did I miss something?
>
>thanks,
>xlu
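Concretely, the change I'm suggesting looks like the sketch below (the class name is illustrative, and the producer construction is left as comments since it needs the Kafka 0.8 client jars; only java.util.Properties is required to run this fragment):

```java
import java.util.Properties;

public class AckConfigSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("serializer.class", "kafka.serializer.StringEncoder");
        props.put("metadata.broker.list", "127.0.0.1:9092");
        // Was "0" (fire-and-forget): the broker never acknowledges, so
        // compression/encoding failures are silently dropped. With "1",
        // the leader acks each request and errors surface to the client.
        props.put("request.required.acks", "1");
        props.put("producer.type", "async");
        props.put("compression.codec", "1"); // 1 = gzip in Kafka 0.8

        // Then build the producer exactly as in the quoted code:
        //   ProducerConfig config = new ProducerConfig(props);
        //   Producer<String, String> producer =
        //       new Producer<String, String>(config);

        System.out.println(props.getProperty("request.required.acks"));
    }
}
```

With acks=1, a failed send should show up as an exception (or a logged error from the async send thread) rather than vanishing silently.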