Check your libstdc++ version.

On Tuesday, April 8, 2014, 陈小军 <chenxiao...@nhn.com> wrote:
> Hi all,
>
> I am trying to add compression support to the kafka-node JS driver. For
> snappy I use the node-snappy library
> (https://github.com/kesla/node-snappy). When I test my code, the server
> always outputs the error below; I don't know whether this is a bug in my
> code or a server-side compatibility problem. I implemented gzip
> compression the same way and it works fine, so I think my logic for
> compressing the message set is correct.
>
> [2014-04-08 17:22:45,556] ERROR [KafkaApi-0] Error processing ProducerRequest with correlation id 1 from client kafka-node-client on partition [nelo2-crash-logs,0] (kafka.server.KafkaApis)
> java.lang.NoClassDefFoundError: Could not initialize class org.xerial.snappy.Snappy
>     at org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
>     at org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
>     at org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
>     at kafka.message.CompressionFactory$.apply(CompressionFactory.scala:45)
>     at kafka.message.ByteBufferMessageSet$.decompress(ByteBufferMessageSet.scala:66)
>     at kafka.message.ByteBufferMessageSet$$anon$1.makeNextOuter(ByteBufferMessageSet.scala:178)
>     at kafka.message.ByteBufferMessageSet$$anon$1.makeNext(ByteBufferMessageSet.scala:191)
>     at kafka.message.ByteBufferMessageSet$$anon$1.makeNext(ByteBufferMessageSet.scala:145)
>     at kafka.utils.IteratorTemplate.maybeComputeNext(IteratorTemplate.scala:66)
>     at kafka.utils.IteratorTemplate.hasNext(IteratorTemplate.scala:58)
>     at scala.collection.Iterator$$anon$19.hasNext(Iterator.scala:400)
>     at scala.collection.Iterator$class.foreach(Iterator.scala:772)
>     at scala.collection.Iterator$$anon$19.foreach(Iterator.scala:399)
>     at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
>     at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:102)
>     at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:250)
>     at scala.collection.Iterator$$anon$19.toBuffer(Iterator.scala:399)
>     at kafka.message.ByteBufferMessageSet.assignOffsets(ByteBufferMessageSet.scala:219)
>     at kafka.log.Log.liftedTree1$1(Log.scala:249)
>     at kafka.log.Log.append(Log.scala:248)
>     at kafka.cluster.Partition.appendMessagesToLeader(Partition.scala:354)
>     at kafka.server.KafkaApis$$anonfun$appendToLocalLog$2.apply(KafkaApis.scala:285)
>     at kafka.server.KafkaApis$$anonfun$appendToLocalLog$2.apply(KafkaApis.scala:275)
>     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
>     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
>     at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:95)
>     at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:95)
>     at scala.collection.Iterator$class.foreach(Iterator.scala:772)
>     at scala.collection.mutable.HashTable$$anon$1.foreach(HashTable.scala:157)
>     at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:190)
>     at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:45)
>     at scala.collection.mutable.HashMap.foreach(HashMap.scala:95)
>     at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
>     at scala.collection.mutable.HashMap.map(HashMap.scala:45)
>     at kafka.server.KafkaApis.appendToLocalLog(KafkaApis.scala:275)
>     at kafka.server.KafkaApis.handleProducerRequest(KafkaApis.scala:201)
>     at kafka.server.KafkaApis.handle(KafkaApis.scala:68)
>     at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:42)
>     at java.lang.Thread.run(Thread.java:722)
>
> Best Regards
> Jerry