I run a Hadoop 2.2.0-based HDFS cluster and use Spark 0.9.0 to read files
from it without any problems.
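If the Spark assembly was not built against Hadoop 2.2.0, the shaded protobuf 2.4.1
can end up shadowing the 2.5.0 that Hadoop 2.2.0 expects. The usual way to build
against 2.2.0 (per the Spark 0.9 docs -- a sketch, adjust paths to your checkout):

# sbt build against Hadoop 2.2.0 (pulls protobuf 2.5.0 into the assembly)
SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly
# or the equivalent Maven build
mvn -Dhadoop.version=2.2.0 -DskipTests clean package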
Ognen
On 2/28/14, 10:51 AM, Prasad wrote:
Hi,
I am getting the protobuf error below while reading an HDFS file using Spark
0.9.0 -- I am running on Hadoop 2.2.0.
When I look through my classpath, I find that I have both protobuf 2.4.1 and 2.5.0,
and some blogs suggest that there are incompatibility issues between 2.4.1 and 2.5.0.
hduser@prasadHdp1:~/spark-0.9.0-incubating$ find ~/ -name protobuf-java*.jar
/home/hduser/.m2/repository/com/google/protobuf/protobuf-java/2.4.1/protobuf-java-2.4.1.jar
/home/hduser/.m2/repository/org/spark-project/protobuf/protobuf-java/2.4.1-shaded/protobuf-java-2.4.1-shaded.jar
/home/hduser/spark-0.9.0-incubating/lib_managed/bundles/protobuf-java-2.5.0.jar
/home/hduser/spark-0.9.0-incubating/lib_managed/jars/protobuf-java-2.4.1-shaded.jar
/home/hduser/.ivy2/cache/com.google.protobuf/protobuf-java/bundles/protobuf-java-2.5.0.jar
/home/hduser/.ivy2/cache/org.spark-project.protobuf/protobuf-java/jars/protobuf-java-2.4.1-shaded.jar
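For comparison, Hadoop 2.2.0 ships its own protobuf jar; assuming a standard binary
install under $HADOOP_HOME, something like this should show which version it bundles:

ls $HADOOP_HOME/share/hadoop/common/lib/ | grep protobuf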
Can someone please let me know if you have faced these issues and how you fixed them.
Thanks
Prasad.
Caused by: java.lang.VerifyError: class org.apache.hadoop.security.proto.SecurityProtos$GetDelegationTokenRequestProto overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
    at java.lang.Class.privateGetPublicMethods(Class.java:2651)
    at java.lang.Class.privateGetPublicMethods(Class.java:2661)
    at java.lang.Class.getMethods(Class.java:1467)
    at sun.misc.ProxyGenerator.generateClassFile(ProxyGenerator.java:426)
    at sun.misc.ProxyGenerator.generateProxyClass(ProxyGenerator.java:323)
    at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:636)
    at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:722)
    at org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(ProtobufRpcEngine.java:92)
    at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
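One way to confirm which jar the conflicting protobuf classes are actually loaded from
at runtime is JVM class-loading tracing; a rough sketch (SPARK_JAVA_OPTS and
bin/spark-shell are the Spark 0.9 conventions, the grep is just illustrative):

SPARK_JAVA_OPTS="-verbose:class" ./bin/spark-shell 2>&1 | grep 'com.google.protobuf'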