Hi, has anyone used Protobuf with the spark-cassandra-connector? I am using protobuf-3.0-beta with Spark 1.4 and cassandra-connector-2.10, and I keep getting "Unable to find proto buffer class" in my code. I checked the protobuf jar version and 3.0-beta is on the classpath. The protobuf messages come from a Kafka stream.
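For reference, this is roughly how the protobuf messages are read from Kafka (a simplified sketch; the broker address, topic name, and the jssc JavaStreamingContext variable are placeholders, only the TestEvent.Event class name comes from the stack trace below):

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import kafka.serializer.DefaultDecoder;
import kafka.serializer.StringDecoder;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaPairInputDStream;
import org.apache.spark.streaming.kafka.KafkaUtils;
import com.test.serializers.TestEvent;

// Kafka parameters and topic are placeholders
Map<String, String> kafkaParams = new HashMap<>();
kafkaParams.put("metadata.broker.list", "broker1:9092");

// Direct stream of raw byte[] payloads (spark-streaming-kafka for Spark 1.4)
JavaPairInputDStream<String, byte[]> kafkaStream = KafkaUtils.createDirectStream(
        jssc,
        String.class, byte[].class,
        StringDecoder.class, DefaultDecoder.class,
        kafkaParams,
        Collections.singleton("events"));

// Parse each payload into the generated protobuf message
JavaDStream<TestEvent.Event> protoBuffMsgs =
        kafkaStream.map(tuple -> TestEvent.Event.parseFrom(tuple._2()));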
5/11/16 15:32:21 ERROR Executor: Exception in task 2.0 in stage 13.0 (TID 35)
java.lang.RuntimeException: Unable to find proto buffer class: com.test.serializers.TestEvent$Event
        at com.google.protobuf.GeneratedMessageLite$SerializedForm.readResolve(GeneratedMessageLite.java:1063)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)

Here is my code:

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

// Convert each protobuf message into a plain data object
JavaDStream<MyData> rddStream =
        protoBuffMsgs.map(protoBuff -> StreamRawData.convertProtoBuffToRawData(protoBuff));

rddStream.foreachRDD(rdd -> {
    StreamRawData.writeToCassandra(rdd);
    return null;
});

public static void writeToCassandra(JavaRDD<MyData> rowRDD) {
    // write to Cassandra
    javaFunctions(rowRDD)
        .writerBuilder("keyspace", "data", mapToRow(MyData.class))
        .saveToCassandra();
}

If I remove writeToCassandra() from my code, it works fine; counting and filtering on the protobuf stream also work without errors.
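For completeness, MyData is a plain serializable JavaBean that mapToRow(MyData.class) reflects over (simplified sketch; the two fields shown here are placeholders, the real class mirrors the columns of the "data" table):

import java.io.Serializable;

// Only this bean (not the protobuf message itself) is meant to end up
// in the RDD that gets written to Cassandra.
public class MyData implements Serializable {
    private String id;        // placeholder column
    private long timestamp;   // placeholder column

    public MyData() {}

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    public long getTimestamp() { return timestamp; }
    public void setTimestamp(long timestamp) { this.timestamp = timestamp; }
}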