[...]ies that may be relevant:

>>>> val hadoop2Common = "org.apache.hadoop" % "hadoop-common" % hadoop2Version excludeAll(
>>>>   ExclusionRule(organization = "javax.servlet")
>>>> )
[...]
"org.mortbay.jetty" % "servlet-api" % "3.0.20100224"
)
On Sep 11, 2014 8:05 PM, <sp...@orbit-x.de> wrote:
Hi guys,
any luck with this issue, anyone?
I as well tried all the possible exclusion combos, to no avail.
Thanks for your ideas.
hadoop2MapRedClient,
hadoop2Common,
"org.mortbay.jetty" % "servlet-api" % "3.0.20100224"
)
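For anyone hitting the same wall, here is a minimal, self-contained sketch of the pattern being discussed: an SBT build that strips the conflicting javax.servlet / Jetty artifacts out of the Hadoop dependencies and keeps exactly one servlet-api on the classpath. The version numbers, the hadoop-mapreduce-client-core coordinates, and the val names beyond those quoted above are illustrative assumptions, not anyone's exact build:

    // build.sbt -- sketch only; versions below are assumptions
    val hadoop2Version = "2.3.0"
    val sparkVersion = "1.0.0"

    // Strip the old javax.servlet / Jetty artifacts that the Hadoop
    // client jars drag in, so they cannot clash with the servlet-api
    // that Spark expects.
    val excludeServlet = ExclusionRule(organization = "javax.servlet")
    val excludeMortbayJetty = ExclusionRule(organization = "org.mortbay.jetty")

    val hadoop2Common = "org.apache.hadoop" % "hadoop-common" % hadoop2Version excludeAll(
      excludeServlet, excludeMortbayJetty
    )
    val hadoop2MapRedClient = "org.apache.hadoop" % "hadoop-mapreduce-client-core" % hadoop2Version excludeAll(
      excludeServlet, excludeMortbayJetty
    )

    libraryDependencies ++= Seq(
      "org.apache.spark" % "spark-core_2.10" % sparkVersion excludeAll(excludeMortbayJetty),
      hadoop2MapRedClient,
      hadoop2Common,
      // Pin exactly one servlet-api so only this copy is resolved.
      "org.mortbay.jetty" % "servlet-api" % "3.0.20100224"
    )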
Hi Siyuan,
Thanks for the input. We prefer to use SparkBuild.scala rather than Maven, and I did not see any protobuf.version-related settings in that file. But - as noted by Sean Owen - the issue we are facing at present is in any case the duplicate, incompatible javax.servlet entries.
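If the protobuf override ever needs to live in the SBT build rather than Maven, one minimal sketch is to read a system property and force the dependency version. This is an assumption about how one could wire it up, not something SparkBuild.scala provided at the time, and the 2.4.1 fallback is likewise an assumed default:

    // Sketch: read -Dprotobuf.version passed to the sbt JVM, with an assumed default.
    val protobufVersion = sys.props.getOrElse("protobuf.version", "2.4.1")

    // force() makes this version win over any transitively pulled-in protobuf-java.
    libraryDependencies += "com.google.protobuf" % "protobuf-java" % protobufVersion force()

Invoked, for example, as: sbt -Dprotobuf.version=2.5.0 assembly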
Hi Stephen,
I am using Spark 1.0 + HBase 0.96.2. This is what I did:
1) Rebuild Spark using: mvn -Dhadoop.version=2.3.0 -Dprotobuf.version=2.5.0
-DskipTests clean package
2) In spark-env.sh, set SPARK_CLASSPATH=/path-to/hbase-protocol-0.96.2-hadoop2.jar
Hopefully it can help.
Siyuan
On Sat, Jun
Thanks Sean. I had actually already added an exclusion rule for
org.mortbay.jetty - and that had not resolved it.
Just in case, I used your precise formulation:
val excludeMortbayJetty = ExclusionRule(organization = "org.mortbay.jetty")
..
,("org.apache.spark" % "spark-core_2.10" % sparkVersion excludeAll(excludeMortbayJetty))
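When an exclude like this does not seem to take effect, it helps to confirm which dependency is actually dragging in the duplicate servlet-api. One way is the sbt-dependency-graph plugin - a sketch, assuming you add the plugin yourself; the version shown is just a plausible one from that era:

    // project/plugins.sbt
    addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")

Then run "sbt dependency-tree" (or "dependencyTree", depending on the sbt version) and search the output for servlet-api to see every resolution path that pulls it in.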
This sounds like an instance of roughly the same item as
https://issues.apache.org/jira/browse/SPARK-1949. Have a look at
adding that exclude to see if it works.
On Fri, Jun 27, 2014 at 10:21 PM, Stephen Boesch wrote:
> The present trunk is built and tested against HBase 0.94.
>
>
> I have tri