ConsumerConnector connecting to Zookeeper: EndOfStreamException
Hello,

I get an error when trying to consume messages from Kafka (2.9.2-0.8.1) with a standalone ZooKeeper (3.4.5). You can see the source code below as well as the error message and the log file from ZooKeeper. I'm not sure whether the Java libraries are incompatible: I added the kafka_2.9.2 (0.8.1) dependency via Maven, which automatically resolved the zkclient (0.3) and zookeeper (3.3.4) dependencies.

The consumer source code:

import java.util.Properties;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.javaapi.consumer.ConsumerConnector;

public class ConsumerTest {

    public static void main(String[] args) {
        try {
            Properties props = new Properties();
            props.put("zookeeper.connect", "192.168.0.1:2181/kafka");
            props.put("group.id", "my-consumer");
            props.put("zookeeper.session.timeout.ms", "400");
            props.put("zookeeper.sync.time.ms", "200");
            props.put("auto.commit.interval.ms", "1000");

            ConsumerConfig config = new ConsumerConfig(props);

            @SuppressWarnings("unused")
            ConsumerConnector consumer = Consumer.createJavaConsumerConnector(config);
        } catch (Exception e) {
            System.out.println(e.getMessage());
            e.printStackTrace();
        }
    }
}

The pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>test.my</groupId>
  <artifactId>kafka-consumer</artifactId>
  <version>0.0.1-SNAPSHOT</version>

  <dependencies>
    <dependency>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka_2.9.2</artifactId>
      <version>0.8.1</version>
      <exclusions>
        <exclusion>
          <artifactId>jms</artifactId>
          <groupId>javax.jms</groupId>
        </exclusion>
        <exclusion>
          <artifactId>jmxtools</artifactId>
          <groupId>com.sun.jdmk</groupId>
        </exclusion>
        <exclusion>
          <artifactId>jmxri</artifactId>
          <groupId>com.sun.jmx</groupId>
        </exclusion>
      </exclusions>
    </dependency>
  </dependencies>
</project>

The exception message and stack trace:

Unable to connect to zookeeper server within timeout: 400
org.I0Itec.zkclient.exception.ZkTimeoutException: Unable to connect to zookeeper server within timeout: 400
        at org.I0Itec.zkclient.ZkClient.connect(ZkClient.java:880)
        at org.I0Itec.zkclient.ZkClient.(ZkClient.java:98)
        at org.I0Itec.zkclient.ZkClient.(ZkClient.java:84)
        at kafka.consumer.ZookeeperConsumerConnector.connectZk(ZookeeperConsumerConne
kafka dynamic topic discovery
Hi friends,

I am working on a Kafka-based pub/sub project and would like to know more about dynamic topic consumption. My producer may publish data to any topic, and my consumer needs to consume the data from all topics. Does the current Kafka version support a wildcard topic consumer?

Thanks,
Kannadhasan s
still looking for Maven POM for Java Kafka
All,

I am running some examples from the Packt book "Apache Kafka". I am creating a HighLevelConsumer, and I get an error when I try to run the code:

--
C:\Users\david.j.novogrodsky\Documents\TestingKafka\target>java -cp Kafka_test-1.0-SNAPSHOT.jar com.cat.HighLevelConsumer test
Exception in thread "main" java.lang.NoClassDefFoundError: kafka/consumer/ConsumerConfig
        at com.cat.HighLevelConsumer.(HighLevelConsumer.java:32)
        at com.cat.HighLevelConsumer.main(HighLevelConsumer.java:61)
Caused by: java.lang.ClassNotFoundException: kafka.consumer.ConsumerConfig
        at java.net.URLClassLoader$1.run(Unknown Source)
        at java.net.URLClassLoader$1.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
--

Is this an error in the POM?

David Novogrodsky
david.novogrod...@gmail.com
http://www.linkedin.com/in/davidnovogrodsky
Re: still looking for Maven POM for Java Kafka
Looks like the Kafka classes are not on your classpath. You should either assemble an uber-jar from your project (e.g. using the Maven Assembly plugin with the jar-with-dependencies descriptor ref) or add the location of the Kafka classes to your classpath.

2014-05-06 19:11 GMT+04:00 David Novogrodsky :

> All,
>
> I am running some examples from the Packt book "Apache Kafka". I am creating a HighLevelConsumer, and I get an error when I try to run the code:
> --
> C:\Users\david.j.novogrodsky\Documents\TestingKafka\target>java -cp Kafka_test-1.0-SNAPSHOT.jar com.cat.HighLevelConsumer test
> Exception in thread "main" java.lang.NoClassDefFoundError: kafka/consumer/ConsumerConfig
>         at com.cat.HighLevelConsumer.(HighLevelConsumer.java:32)
>         at com.cat.HighLevelConsumer.main(HighLevelConsumer.java:61)
> Caused by: java.lang.ClassNotFoundException: kafka.consumer.ConsumerConfig
>         at java.net.URLClassLoader$1.run(Unknown Source)
>         at java.net.URLClassLoader$1.run(Unknown Source)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(Unknown Source)
>         at java.lang.ClassLoader.loadClass(Unknown Source)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
>         at java.lang.ClassLoader.loadClass(Unknown Source)
> --
> Is this an error in the POM?
>
> David Novogrodsky
> david.novogrod...@gmail.com
> http://www.linkedin.com/in/davidnovogrodsky
>
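For reference, a minimal sketch of the uber-jar option Yury describes, using the standard Maven Assembly plugin with the jar-with-dependencies descriptor ref. The plugin version and the mainClass value (taken from David's command line) are assumptions, not part of the original thread:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.4</version>
  <configuration>
    <descriptorRefs>
      <!-- Bundle the project classes together with all runtime dependencies -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <!-- Assumed entry point, matching the class in the original command -->
        <mainClass>com.cat.HighLevelConsumer</mainClass>
      </manifest>
    </archive>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>

With this in the POM's <build><plugins> section, mvn package should produce an additional *-jar-with-dependencies.jar in target/ that contains the Kafka classes, so it can be run without assembling a separate classpath.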
Re: kafka dynamic topic discovery
Hello,

You may find this doc helpful in using a wildcard topic subscription: https://kafka.apache.org/documentation.html#highlevelconsumerapi

Guozhang

On Tue, May 6, 2014 at 7:00 AM, Kannadhasan Sekar wrote:

> Hi friends,
>
> I am working on a Kafka-based pub/sub project and would like to know more about dynamic topic consumption. My producer may publish data to any topic, and my consumer needs to consume the data from all topics. Does the current Kafka version support a wildcard topic consumer?
>
> Thanks,
> Kannadhasan s
>

--
-- Guozhang
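For reference, a rough sketch of wildcard consumption with the 0.8.x high-level consumer, using createMessageStreamsByFilter with a Whitelist regex. The ZooKeeper address, group id, and regex are placeholders, not values from the thread:

import java.util.List;
import java.util.Properties;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.KafkaStream;
import kafka.consumer.Whitelist;
import kafka.javaapi.consumer.ConsumerConnector;
import kafka.message.MessageAndMetadata;

public class WildcardConsumerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("zookeeper.connect", "localhost:2181"); // placeholder ZooKeeper address
        props.put("group.id", "wildcard-consumer");       // placeholder group id

        ConsumerConnector connector =
            Consumer.createJavaConsumerConnector(new ConsumerConfig(props));

        // Subscribe to every topic matching the regex; topics created later that
        // match the pattern are picked up as the consumer rebalances.
        List<KafkaStream<byte[], byte[]>> streams =
            connector.createMessageStreamsByFilter(new Whitelist(".*"), 1);

        // Iterate over one stream and print topic plus message payload.
        for (MessageAndMetadata<byte[], byte[]> mm : streams.get(0)) {
            System.out.println(mm.topic() + ": " + new String(mm.message()));
        }
    }
}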
Re: JAVA HEAP settings for KAFKA in production
Hey Todd,

Doc patch? :-)

svn co http://svn.apache.org/repos/asf/kafka/site/081/

Don't stress about html or formatting, I'm happy to do that part. I would love to give people more authoritative advice. Right now everything is a bit obsolete and wrong.

-Jay

On Mon, May 5, 2014 at 10:36 PM, Todd Palino wrote:

> I apologize for taking a couple days to jump in on this. We're currently running JDK 1.7 u51, and we've switched over to the G1 collector. If you do this (and I highly recommend it), make sure you're on u51. We tried out u21 in testing, but we had a number of problems with the GC implementation in that version.
>
> Our tuning looks like this:
> -Xms4g -Xmx4g -XX:PermSize=48m -XX:MaxPermSize=48m -XX:+UseG1GC
> -XX:MaxGCPauseMillis=20 -XX:InitiatingHeapOccupancyPercent=35
>
> For reference, I'm looking at the stats on one of our busiest clusters (at peak):
> - 15 brokers
> - 15.5k partitions (replication factor 2)
> - 400k messages/sec in
> - 70 MB/sec inbound, 400 MB/sec+ outbound
>
> The tuning looks fairly aggressive, but all of our brokers in that cluster have a 90% GC pause time of about 21ms, and they're doing less than 1 young GC per second. We haven't seen a single full GC on those brokers in the last month, and previous to that I think we only saw them when I was messing around with the cluster in a very painful way, not under anything approaching normal traffic.
>
> -Todd
>
>
> On 5/1/14, 9:21 PM, "Neha Narkhede" wrote:
>
> > The GC settings at http://kafka.apache.org/documentation.html#java are old.
> > We meant to update the documentation with the new GC settings using the G1 collector, but we haven't gotten around to doing that. Let me reach out to our engineer, Todd Palino, who worked on tuning GC for Kafka at LinkedIn, to see if we can update our docs.
> >
> > Thanks,
> > Neha
> >
> >
> > On Thu, May 1, 2014 at 9:02 PM, Jun Rao wrote:
> >
> > > http://kafka.apache.org/documentation.html#java
> > >
> > > Thanks,
> > >
> > > Jun
> > >
> > >
> > > On Thu, May 1, 2014 at 12:19 PM, Cassa L wrote:
> > >
> > > > Hi,
> > > > I want to know what JAVA_HEAP settings are usually recommended for Kafka servers in production.
> > > >
> > > > Thanks
> > > > LCassa
> > > >
> > >
>
Re: still looking for Maven POM for Java Kafka
Hi Dave,

Here's a POM I use to build the examples (shown after my signature below). It's pretty simple, including the Kafka 0.8.0 with Scala 2.9.2 libs and targeting Java 7. I also made a couple of mods to make things easier:

1) I use log4j 1.2.17 to eliminate a log4j 1.2.15 dependency on the Sun jdmk.
2) I use the Maven dependency plugin to build a lib folder that has all the jars you need for the runtime classpath (a usage note follows after the POM).

Hope it helps,
Steve

On Tue, May 6, 2014 at 8:11 AM, David Novogrodsky <david.novogrod...@gmail.com> wrote:

> All,
>
> I am running some examples from the Packt book "Apache Kafka". I am creating a HighLevelConsumer, and I get an error when I try to run the code:
> --
> C:\Users\david.j.novogrodsky\Documents\TestingKafka\target>java -cp Kafka_test-1.0-SNAPSHOT.jar com.cat.HighLevelConsumer test
> Exception in thread "main" java.lang.NoClassDefFoundError: kafka/consumer/ConsumerConfig
>         at com.cat.HighLevelConsumer.(HighLevelConsumer.java:32)
>         at com.cat.HighLevelConsumer.main(HighLevelConsumer.java:61)
> Caused by: java.lang.ClassNotFoundException: kafka.consumer.ConsumerConfig
>         at java.net.URLClassLoader$1.run(Unknown Source)
>         at java.net.URLClassLoader$1.run(Unknown Source)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(Unknown Source)
>         at java.lang.ClassLoader.loadClass(Unknown Source)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
>         at java.lang.ClassLoader.loadClass(Unknown Source)
> --
> Is this an error in the POM?
>
> David Novogrodsky
> david.novogrod...@gmail.com
> http://www.linkedin.com/in/davidnovogrodsky
>

--
Steve Robenalt
Software Architect
HighWire | Stanford University
425 Broadway St, Redwood City, CA 94063
srobe...@stanford.edu
http://highwire.stanford.edu

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.example.kafka.test</groupId>
  <artifactId>org.example.kafka.test</artifactId>
  <version>0.2.0-SNAPSHOT</version>
  <name>kafkatest</name>

  <build>
    <sourceDirectory>src</sourceDirectory>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.0</version>
        <configuration>
          <source>1.7</source>
          <target>1.7</target>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-dependency-plugin</artifactId>
        <version>2.8</version>
        <executions>
          <execution>
            <id>copy-dependencies</id>
            <phase>package</phase>
            <goals>
              <goal>copy-dependencies</goal>
            </goals>
            <configuration>
              <outputDirectory>${project.build.directory}/lib</outputDirectory>
              <overWriteReleases>false</overWriteReleases>
              <overWriteSnapshots>false</overWriteSnapshots>
              <overWriteIfNewer>true</overWriteIfNewer>
              <includeScope>runtime</includeScope>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>

  <dependencies>
    <dependency>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka_2.9.2</artifactId>
      <version>0.8.0</version>
    </dependency>
    <dependency>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
      <version>1.2.17</version>
    </dependency>
  </dependencies>
</project>
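A usage note on the lib folder approach (an assumption about how it would be wired up, not something from Steve's message): once the dependency plugin has copied the runtime jars into target/lib, they can be picked up with a classpath wildcard (supported since Java 6), for example on Windows:

java -cp "Kafka_test-1.0-SNAPSHOT.jar;lib/*" com.cat.HighLevelConsumer test

Use : instead of ; as the separator on Linux/macOS, and substitute the jar name that your build actually produces.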
Re: kafka dynamic topic discovery
Yes, wildcard consumption is supported in the latest stable release of Kafka (0.8.1.1).

Thanks,
Neha

On Tue, May 6, 2014 at 7:00 AM, Kannadhasan Sekar wrote:

> Hi friends,
>
> I am working on a Kafka-based pub/sub project and would like to know more about dynamic topic consumption. My producer may publish data to any topic, and my consumer needs to consume the data from all topics. Does the current Kafka version support a wildcard topic consumer?
>
> Thanks,
> Kannadhasan s
>
Re: JAVA HEAP settings for KAFKA in production
Updated the docs with Todd's recommendation - http://kafka.apache.org/081/documentation.html#java

Thanks,
Neha

On Tue, May 6, 2014 at 9:40 AM, Jay Kreps wrote:

> Hey Todd,
>
> Doc patch? :-)
>
> svn co http://svn.apache.org/repos/asf/kafka/site/081/
>
> Don't stress about html or formatting, I'm happy to do that part. I would love to give people more authoritative advice. Right now everything is a bit obsolete and wrong.
>
> -Jay
>
>
> On Mon, May 5, 2014 at 10:36 PM, Todd Palino wrote:
>
> > I apologize for taking a couple days to jump in on this. We're currently running JDK 1.7 u51, and we've switched over to the G1 collector. If you do this (and I highly recommend it), make sure you're on u51. We tried out u21 in testing, but we had a number of problems with the GC implementation in that version.
> >
> > Our tuning looks like this:
> > -Xms4g -Xmx4g -XX:PermSize=48m -XX:MaxPermSize=48m -XX:+UseG1GC
> > -XX:MaxGCPauseMillis=20 -XX:InitiatingHeapOccupancyPercent=35
> >
> > For reference, I'm looking at the stats on one of our busiest clusters (at peak):
> > - 15 brokers
> > - 15.5k partitions (replication factor 2)
> > - 400k messages/sec in
> > - 70 MB/sec inbound, 400 MB/sec+ outbound
> >
> > The tuning looks fairly aggressive, but all of our brokers in that cluster have a 90% GC pause time of about 21ms, and they're doing less than 1 young GC per second. We haven't seen a single full GC on those brokers in the last month, and previous to that I think we only saw them when I was messing around with the cluster in a very painful way, not under anything approaching normal traffic.
> >
> > -Todd
> >
> >
> > On 5/1/14, 9:21 PM, "Neha Narkhede" wrote:
> >
> > > The GC settings at http://kafka.apache.org/documentation.html#java are old.
> > > We meant to update the documentation with the new GC settings using the G1 collector, but we haven't gotten around to doing that. Let me reach out to our engineer, Todd Palino, who worked on tuning GC for Kafka at LinkedIn, to see if we can update our docs.
> > >
> > > Thanks,
> > > Neha
> > >
> > >
> > > On Thu, May 1, 2014 at 9:02 PM, Jun Rao wrote:
> > >
> > > > http://kafka.apache.org/documentation.html#java
> > > >
> > > > Thanks,
> > > >
> > > > Jun
> > > >
> > > >
> > > > On Thu, May 1, 2014 at 12:19 PM, Cassa L wrote:
> > > >
> > > > > Hi,
> > > > > I want to know what JAVA_HEAP settings are usually recommended for Kafka servers in production.
> > > > >
> > > > > Thanks
> > > > > LCassa
> > > > >
> > > >
> >
>
Re: still looking for Maven POM for Java Kafka
Yury,

Thank you for the reply. It was helpful.

David Novogrodsky
david.novogrod...@gmail.com
http://www.linkedin.com/in/davidnovogrodsky

On Tue, May 6, 2014 at 10:47 AM, Yury Ruchin wrote:

> Looks like the Kafka classes are not on your classpath. You should either assemble an uber-jar from your project (e.g. using the Maven Assembly plugin with the jar-with-dependencies descriptor ref) or add the location of the Kafka classes to your classpath.
>
>
> 2014-05-06 19:11 GMT+04:00 David Novogrodsky :
>
> > All,
> >
> > I am running some examples from the Packt book "Apache Kafka". I am creating a HighLevelConsumer, and I get an error when I try to run the code:
> > --
> > C:\Users\david.j.novogrodsky\Documents\TestingKafka\target>java -cp Kafka_test-1.0-SNAPSHOT.jar com.cat.HighLevelConsumer test
> > Exception in thread "main" java.lang.NoClassDefFoundError: kafka/consumer/ConsumerConfig
> >         at com.cat.HighLevelConsumer.(HighLevelConsumer.java:32)
> >         at com.cat.HighLevelConsumer.main(HighLevelConsumer.java:61)
> > Caused by: java.lang.ClassNotFoundException: kafka.consumer.ConsumerConfig
> >         at java.net.URLClassLoader$1.run(Unknown Source)
> >         at java.net.URLClassLoader$1.run(Unknown Source)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at java.net.URLClassLoader.findClass(Unknown Source)
> >         at java.lang.ClassLoader.loadClass(Unknown Source)
> >         at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
> >         at java.lang.ClassLoader.loadClass(Unknown Source)
> > --
> > Is this an error in the POM?
> >
> > David Novogrodsky
> > david.novogrod...@gmail.com
> > http://www.linkedin.com/in/davidnovogrodsky
> >
Re: QOS on Producer Side
So to retrieve a message that was not transmitted due to an error, I will have to take the message from each callback and store it in a file. Is there an alternative solution? What is the best way to achieve QOS on the producer side? Also, will the new Producer API work with broker version 0.8.0? Please let me know.

Thanks,
Bhavesh

On Tue, May 6, 2014 at 8:46 AM, Jun Rao wrote:

> The callback will be invoked on completion. The exception tells you whether the send completed or not. This api will be available in the next release.
>
> Thanks,
>
> Jun
>
>
> On Mon, May 5, 2014 at 10:03 PM, Bhavesh Mistry wrote:
>
> > Thanks for the answers.
> >
> > Does the callback get called on failure only, or for success as well? Also, how do I do this on Kafka 0.8.0? Is there any plan for adding buffering on disk in the next version? Also, when the application restarts, will the Kafka producer have to transmit the messages (buffered messages on disk)?
> >
> > I cannot upgrade to Kafka 0.8.1.1 right now. How can I achieve this on Kafka version 0.8.0? Also, when retransmitting the messages I want to transfer them to a dedicated Kafka back-up or reserved partition for the topic (near-real-time topic).
> >
> > How would you handle retransmission of messages?
> >
> > Any idea about the implementation will be very helpful.
> >
> > Thanks,
> >
> > Bhavesh
> >
> > On Mon, May 5, 2014 at 9:29 PM, Jun Rao wrote:
> >
> > > At LinkedIn, we only buffer messages in memory in the producer. We try as much as we can to make the Kafka cluster always available (with replication). The "on failure" feature you mentioned can be implemented by providing a callback. You can see the example in the javadoc:
> > >
> > > send(ProducerRecord record, Callback callback)
> > > (see http://empathybox.com/kafka-javadoc/kafka/clients/producer/KafkaProducer.html,
> > > http://empathybox.com/kafka-javadoc/kafka/clients/producer/ProducerRecord.html and
> > > http://empathybox.com/kafka-javadoc/kafka/clients/producer/Callback.html)
> > >
> > > Thanks,
> > >
> > > Jun
> > >
> > > On Mon, May 5, 2014 at 6:30 PM, Bhavesh Mistry wrote:
> > >
> > > > We are using Kafka for operational metrics, and we do not want to lose any data at all if there is an issue with the network, if all brokers need to be rebooted, or if there is operational downtime while all producers are still producing data on the front-end side. We use async to publish messages and we are using Kafka version 0.8.0.
> > > >
> > > > Has anyone implemented buffering on local disk (on the producer side) and transmitted the messages when the network connection is restored? How do I get a handle to the list of messages the async thread could not transfer after x retries? I know the new producer API has a callback interface, but is it per message, not per producer instance?
> > > >
> > > > Is this the final new Producer API?
> > > > http://empathybox.com/kafka-javadoc/index.html?kafka/clients/producer/KafkaProducer.html
> > > >
> > > > Is there a plan to add a method like onFailure(List messages, Exception exception)?
> > > >
> > > > Basically, I have to address QOS on the producer side, and be able to buffer on disk and retransmit all messages to partitions that are reserved for messages that happened in the past.
> > > >
> > > > How does LinkedIn handle QOS on the producer side?
> > > >
> > > > Is there any plan to add this QOS feature on the producer side, with strategies to store and retransmit the messages? If we do get the list of messages in the callback, will it be compressed data?
> > > >
> > > > I would appreciate Kafka developers' and others' feedback on how to implement QOS.
> > > >
> > > > Thanks,
> > > >
> > > > Bhavesh
> > > >
> > >
> >
>
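For reference, a minimal sketch of the callback approach Jun describes, written against the new producer API in the form it later shipped (org.apache.kafka.clients.producer, Kafka 0.8.2+; the pre-release javadoc linked in this thread uses the kafka.clients.producer package instead). The broker address, topic name, and the spoolToDisk helper are placeholders, and spooling failed records to a local file for later replay is just one possible strategy, not a built-in Kafka feature:

import java.util.Properties;

import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class SpoolingSendExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9092"); // placeholder broker list
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        final ProducerRecord<String, String> record =
            new ProducerRecord<>("metrics", "my-key", "my-value"); // placeholder topic and payload

        // The callback is invoked per record, on success and on failure alike;
        // a non-null exception means the send ultimately failed.
        producer.send(record, new Callback() {
            @Override
            public void onCompletion(RecordMetadata metadata, Exception exception) {
                if (exception != null) {
                    // Send failed: hand the record to durable local storage for later replay.
                    spoolToDisk(record, exception);
                }
                // On success, metadata carries the topic, partition, and offset.
            }
        });

        producer.close();
    }

    // Hypothetical helper: append the failed record somewhere durable (e.g. a local file).
    private static void spoolToDisk(ProducerRecord<String, String> record, Exception cause) {
        System.err.println("Spooling failed record for replay: " + record + " cause: " + cause);
    }
}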
> > > > > > > > > > > > > > > > > > > > Basically, I have to address the QOS on producer side, and be able to > > > > buffer on disk and retransmit all message to partitions that are > > reserved > > > > for messages that happened in past. > > > > > > > > > > > > > > > > How does Linked-in handle QOS on producer side ? > > > > > > > > > > > > Is there any plan to add this QOS feature on Producer Side with > > > Strategies > > > > to > > > > store and retransmit the message ? If we do get the list of messages > > is > > > > call back, will it be compressed data ? > > > > > > > > > > > > > > > > I would appreciate Kafka Developers and others feedback on how to > > > implement > > > > QOS. > > > > > > > > > > > > Thanks, > > > > > > > > > > > > Bhavesh > > > > > > > > > >