Update: It turns out this error happens in two scenarios:
1. When there is a mismatch between the broker and ZooKeeper libs inside your process (found that on Stack Overflow).
2. Apparently, when anything that uses the Scala parser combinators libs (in our case scala.util.parsing.json.JSON) runs within the same process as your consumer.

Scenario 1 is easily fixed. Scenario 2 happens consistently and is often triggered by other libs that you use, which makes it impossible to use Kafka at all. It's quite silly but totally disabling. We would appreciate it a lot if someone could provide a workaround, or point us to the exact issue so that we can custom-patch it or provide a patch ourselves. (A minimal sketch of the clash we suspect is included after the stack trace below.)

Thanks
Arun

From: Arunkumar Srambikkal (asrambik)
Sent: Wednesday, March 04, 2015 5:27 PM
To: users@kafka.apache.org
Subject: JSON parsing causing rebalance to fail

Hi,

When I start a new consumer, it throws a rebalance exception. However, I hit it only on some machines where the runtime libraries are different.

The stack given below is what I encounter - is this a known issue? I saw this Jira, but it's not resolved, so I thought to confirm - https://issues.apache.org/jira/browse/KAFKA-1405

Thanks
Arun

[2015-03-04 14:30:37,609] INFO [name], exception during rebalance (kafka.consumer.ZookeeperConsumerConnector)
kafka.common.KafkaException: Failed to parse the broker info from zookeeper: {"jmx_port":-1,"timestamp":"1425459616502","host":"1.1.1.1","version":1,"port":64356}
        at kafka.cluster.Broker$.createBroker(Broker.scala:35)
        ......
Caused by: java.lang.ClassCastException: java.lang.Double cannot be cast to java.lang.Integer
        at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:105)
        at kafka.cluster.Broker$.createBroker(Broker.scala:40)
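For anyone hitting the same trace, here is a minimal, self-contained sketch (not Kafka's actual code) of one way the Double-vs-Integer mismatch can arise, assuming the old Kafka 0.8.x consumer, whose kafka.utils.Json installs an Int-producing number parser via JSON.globalNumberParser, shares a JVM with other code that touches that same process-wide setting. The object name NumberParserClash and the parser lambdas are made up for illustration; scala.util.parsing.json.JSON and its globalNumberParser are real (Scala 2.10, or the scala-parser-combinators compatibility jar on later Scala versions).

import scala.util.parsing.json.JSON

// Sketch of the suspected clash between two users of scala.util.parsing.json
// in one JVM. Not Kafka code; names below are illustrative only.
object NumberParserClash {
  def main(args: Array[String]): Unit = {
    // Broker registration JSON as stored in ZooKeeper (copied from the log line above).
    val brokerInfo =
      """{"jmx_port":-1,"timestamp":"1425459616502","host":"1.1.1.1","version":1,"port":64356}"""

    // What Kafka 0.8.x's kafka.utils.Json appears to do: install an Int-producing
    // number parser so fields like "port" come back as java.lang.Integer.
    JSON.globalNumberParser = (in: String) => in.toInt

    // Any other code in the same process that also sets this *process-wide* parser
    // (for example, back to the library default, which produces Double) silently
    // changes what Kafka later gets from JSON.parseFull.
    JSON.globalNumberParser = (in: String) => in.toDouble

    val fields = JSON.parseFull(brokerInfo).get.asInstanceOf[Map[String, Any]]

    // Mirrors the failing cast in Broker.createBroker: "port" is now a boxed Double,
    // so this throws java.lang.ClassCastException: java.lang.Double cannot be cast
    // to java.lang.Integer (via scala.runtime.BoxesRunTime.unboxToInt).
    val port = fields("port").asInstanceOf[Int]
    println(s"port = $port")
  }
}

If this is indeed what is happening, one application-side workaround may be to keep other libraries from mutating JSON.globalNumberParser, for example by having them use the thread-local JSON.perThreadNumberParser instead (if memory serves, the JSON object exposes one), or by restoring an Int-producing parser before the consumer connects. This is only a guess, though, since the parser's shared state is also not thread-safe, so concurrent parsing in the same JVM may still misbehave.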