Okay so I think the issue here is just a conflict between your application
code and the Hadoop code.
Hadoop 2.0.0 depends on protobuf 2.4.0a:
https://svn.apache.org/repos/asf/hadoop/common/tags/release-2.0.0-alpha/hadoop-project/pom.xml
Your code is depending on protobuf 2.5.X.
The protobuf library is not binary compatible between 2.4 and 2.5, so only one
version can effectively be loaded on a given classpath.
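A quick way to see exactly which modules pull in each protobuf version is
Maven's dependency tree, filtered to the protobuf artifact (a sketch, assuming
the standard com.google.protobuf:protobuf-java coordinates):

  # show every place protobuf-java enters the build, including the
  # 2.4.x copies that conflict mediation would otherwise hide
  mvn dependency:tree -Dverbose -Dincludes=com.google.protobuf:protobuf-java

Whichever version loses mediation there is the one that silently drops off
your compile classpath.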
Any update on this? We are still facing this issue.
Any word on this one?
On Apr 2, 2014, at 12:26 AM, Vipul Pandey wrote:
> I downloaded 0.9.0 fresh and ran the mvn command - the assembly jar thus
> generated also has both the shaded and the real protobuf classes
>
> Vipuls-MacBook-Pro-3:spark-0.9.0-incubating vipul$ jar -ftv
> ./assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
I downloaded 0.9.0 fresh and ran the mvn command - the assembly jar thus
generated also has both the shaded and the real protobuf classes:
Vipuls-MacBook-Pro-3:spark-0.9.0-incubating vipul$ jar -ftv
./assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
| grep proto
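To tell at a glance whether a listing like that contains both the relocated
and the original packages, the jar's table of contents can be reduced to the
distinct protobuf package prefixes (a sketch; the jar path is the one above):

  # list each unique directory that holds protobuf classes
  jar tf ./assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar \
    | grep -i protobuf | sed 's#/[^/]*$##' | sort -u

Seeing com/google/protobuf alongside a relocated prefix would confirm that
both the shaded and the unshaded classes made it into the assembly.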
It's this: mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean package
On Tue, Apr 1, 2014 at 5:15 PM, Vipul Pandey wrote:
> how do you recommend building that - it says
> [ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-5:assembly
> (default-cli) on project spark-0.9.0-incubating: Error reading assemblies: No
> assembly descriptors found. -> [Help 1]
how do you recommend building that - it says
[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-5:assembly
(default-cli) on project spark-0.9.0-incubating: Error reading assemblies: No
assembly descriptors found. -> [Help 1]
upon running
mvn -Dhadoop.version
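For what it's worth, the default-cli execution id in that error normally means
the assembly plugin was invoked directly from the command line (something like
mvn assembly:assembly - a guess, since the command above is cut off), and that
invocation fails when no assembly descriptor is supplied. Driving the build
through the package lifecycle instead picks the assembly configuration up from
Spark's own POM:

  # lifecycle build; the assembly descriptor comes from the project's POM
  mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean package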
Do you get the same problem if you build with maven?
On Tue, Apr 1, 2014 at 12:23 PM, Vipul Pandey wrote:
> SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly
>
> That's all I do.
>
> On Apr 1, 2014, at 11:41 AM, Patrick Wendell wrote:
>
> Vipul - could you show exactly what flags/commands you are using when you
> build spark to produce this assembly?
SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly
That's all I do.
On Apr 1, 2014, at 11:41 AM, Patrick Wendell wrote:
> Vipul - could you show exactly what flags/commands you are using when you
> build spark to produce this assembly?
>
>
> On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey wrote:
Vipul - could you show exactly what flags/commands you are using when you
build spark to produce this assembly?
On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey wrote:
> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be
> getting pulled in unless you are directly using akka yourself. Are you?
I've removed the dependency on akka in a separate project but still running
into the same error. In the POM Dependency Hierarchy I do see 2.4.1 - shaded
and 2.5.0 being included. If there were a conflict with a project dependency I
would think I should be getting the same error in my local setup as well.
Yes I'm using akka as well. But if that is the problem then I should have
been facing this issue in my local setup as well. I'm only running into this
error on using the spark standalone cluster.
But will try out your suggestion and let you know.
Thanks
Kanwal
btw, this is where it fails:
14/04/01 00:59:32 INFO storage.MemoryStore: ensureFreeSpace(84106) called with
curMem=0, maxMem=4939225497
14/04/01 00:59:32 INFO storage.MemoryStore: Block broadcast_0 stored as values
to memory (estimated size 82.1 KB, free 4.6 GB)
java.lang.UnsupportedOperationException
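One way to tell which protobuf generation the unshaded classes in the assembly
belong to is to probe for a class that only exists in 2.5 (a sketch:
com.google.protobuf.AbstractParser was introduced in protobuf 2.5.0, so its
absence means the unshaded copy is 2.4.x):

  # no output here means the assembly's unshaded protobuf predates 2.5
  unzip -l ./assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar \
    | grep 'com/google/protobuf/AbstractParser'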
> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be
> getting pulled in unless you are directly using akka yourself. Are you?
No, I'm not. Although I see that protobuf libraries are directly pulled into the
0.9.0 assembly jar - I do see the shaded version as well.
e.g. b
Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be
getting pulled in unless you are directly using akka yourself. Are you?
Does your project have other dependencies that might be indirectly pulling
in protobuf 2.4.1? It would be helpful if you could list all of your
dependencies.
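A flat listing that is easy to paste into a reply can come from Maven's
dependency:list goal (a sketch, run from the project root):

  # every resolved artifact with version and scope; filter for protobuf
  mvn dependency:list | grep -i proto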
I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same issue.
Any word on this one?
On Mar 27, 2014, at 6:41 PM, Kanwaldeep wrote:
> We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9 with
> Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber jar deployed
> on each of the Spark worker nodes.
We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9 with
Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber jar deployed
on each of the Spark worker nodes.
The message is compiled using 2.5, but at runtime it is being de-serialized by
2.4.1, judging from the exception I'm getting.