On Fri, Feb 28, 2014 at 7:17 PM, Egor Pahomov <pahomov.e...@gmail.com> wrote:

> Spark 0.9 uses protobuf 2.5.0
>

Spark 0.9 uses 2.4.1:

https://github.com/apache/incubator-spark/blob/4d880304867b55a4f2138617b30600b7fa013b14/pom.xml#L118

Is there another pom for when hadoop 2.2 is used? I don't see another
branch for hadoop 2.2.
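One quick way to see which protobuf the build actually resolves, rather than reading the pom by hand (a sketch; it assumes a Maven build of Spark and that `mvn` is on the PATH, run from the checkout root):

```shell
# Print every com.google.protobuf artifact Maven resolves for this build.
# If nothing matches (or mvn is unavailable), say so instead of failing.
mvn dependency:tree -Dincludes=com.google.protobuf 2>/dev/null \
  | grep -i protobuf || echo "no protobuf dependency found"
```

Running the same command with `-Phadoop-2.2`-style profiles (profile names vary by Spark version) would show whether a hadoop profile swaps the protobuf version.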


> Hadoop 2.2 uses protobuf 2.5.0
> protobuf 2.5.0 can read messages serialized with protobuf 2.4.1
>

Protobuf Java code generated by protoc 2.4 does not compile with the protobuf
2.5 library. This is what the OP's error message is about.


> So there is no reason why you can't read messages from hadoop
> 2.2 with protobuf 2.5.0; probably you somehow have 2.4.1 on your class
> path. Of course it's very bad that you have both 2.4.1 and 2.5.0 on your
> classpath. Use excludes or whatever to get rid of 2.4.1.
>
> Personally, I spent three days moving my project from protobuf 2.4.1 to
> 2.5.0. But it has to be done for your whole project.
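For the Maven case, the "use excludes" advice above could look roughly like this in your own project's pom (a sketch, not Spark's actual pom; it assumes your project builds with Maven and that pinning one version is acceptable for all your dependencies):

```xml
<!-- Pin protobuf-java to a single version so 2.4.1 and 2.5.0
     cannot both end up on the classpath. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>2.5.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Alternatively, an `<exclusions>` block on whichever dependency drags in 2.4.1 achieves the same end; `mvn dependency:tree` shows which one that is.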
>
> 2014-02-28 21:49 GMT+04:00 Aureliano Buendia <buendia...@gmail.com>:
>
>> Doesn't hadoop 2.2 also depend on protobuf 2.4?
>>
>>
>> On Fri, Feb 28, 2014 at 5:45 PM, Ognen Duzlevski <
>> og...@plainvanillagames.com> wrote:
>>
>>> A stupid question, by the way: did you compile Spark with Hadoop 2.2.0
>>> support?
>>>
>>> Ognen
>>>
>>> On 2/28/14, 10:51 AM, Prasad wrote:
>>>
>>>> Hi
>>>> I am getting the protobuf error below while reading an HDFS file using Spark
>>>> 0.9.0 -- I am running on Hadoop 2.2.0.
>>>>
>>>> When I look through, I find that I have both 2.4.1 and 2.5, and some blogs
>>>> suggest that there are incompatibility issues between 2.4.1 and 2.5.
>>>>
>>>> hduser@prasadHdp1:~/spark-0.9.0-incubating$ find ~/ -name protobuf-java*.jar
>>>> /home/hduser/.m2/repository/com/google/protobuf/protobuf-java/2.4.1/protobuf-java-2.4.1.jar
>>>> /home/hduser/.m2/repository/org/spark-project/protobuf/protobuf-java/2.4.1-shaded/protobuf-java-2.4.1-shaded.jar
>>>> /home/hduser/spark-0.9.0-incubating/lib_managed/bundles/protobuf-java-2.5.0.jar
>>>> /home/hduser/spark-0.9.0-incubating/lib_managed/jars/protobuf-java-2.4.1-shaded.jar
>>>> /home/hduser/.ivy2/cache/com.google.protobuf/protobuf-java/bundles/protobuf-java-2.5.0.jar
>>>> /home/hduser/.ivy2/cache/org.spark-project.protobuf/protobuf-java/jars/protobuf-java-2.4.1-shaded.jar
>>>>
>>>>
>>>> Can someone please let me know if you have faced these issues and how you
>>>> fixed them.
>>>>
>>>> Thanks
>>>> Prasad.
>>>> Caused by: java.lang.VerifyError: class org.apache.hadoop.security.proto.SecurityProtos$GetDelegationTokenRequestProto overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
>>>>          at java.lang.ClassLoader.defineClass1(Native Method)
>>>>          at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
>>>>          at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>>          at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
>>>>          at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
>>>>          at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>>>>          at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>>>          at java.security.AccessController.doPrivileged(Native Method)
>>>>          at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>>>          at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>>>          at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>>>          at java.lang.Class.getDeclaredMethods0(Native Method)
>>>>          at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
>>>>          at java.lang.Class.privateGetPublicMethods(Class.java:2651)
>>>>          at java.lang.Class.privateGetPublicMethods(Class.java:2661)
>>>>          at java.lang.Class.getMethods(Class.java:1467)
>>>>          at sun.misc.ProxyGenerator.generateClassFile(ProxyGenerator.java:426)
>>>>          at sun.misc.ProxyGenerator.generateProxyClass(ProxyGenerator.java:323)
>>>>          at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:636)
>>>>          at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:722)
>>>>          at org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(ProtobufRpcEngine.java:92)
>>>>          at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)
>>>>
>>>>
>>>> Caused by: java.lang.reflect.InvocationTargetException
>>>>          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>          at java.lang.reflect.Method.invoke(Method.java:606)
>>>>
>>>> --
>>>> View this message in context: http://apache-spark-user-list.
>>>> 1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-
>>>> 0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158.html
>>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>>
>>>
>>> --
>>> Some people, when confronted with a problem, think "I know, I'll use
>>> regular expressions." Now they have two problems.
>>> -- Jamie Zawinski
>>>
>>>
>>
>
>
> --
> Sincerely yours,
> Egor Pakhomov
> Scala Developer, Yandex
