Currently, no solution found!
Dereck Li
Apache Spark Contributor
Continuing Learner
@Hangzhou,China
jason_xu wrote on Tue, May 11, 2021 at 8:01 AM:
> Hi Jiahong, I got the same failure on building Spark 3.1.1 with Hadoop
> 2.8.5.
> Any chance you found a solution?
Hi Jiahong, I got the same failure on building Spark 3.1.1 with Hadoop 2.8.5.
Any chance you found a solution?
Maybe my environment is the cause.
jiahong li wrote on Thu, Mar 11, 2021 at 11:14 AM:
> It's not the cause. When I set -Phadoop-2.7 instead of
> -Dhadoop.version=2.6.0-cdh5.13.1, the same errors come out.
>
> Attila Zsolt Piros wrote on Wed, Mar 10, 2021 at 8:56 PM:
>
>> I see, this must be because of the Hadoop version you are selecting by
>> using "-Dhadoop.version=2.6.0-cdh5.13.1".
It's not the cause. When I set -Phadoop-2.7 instead of
-Dhadoop.version=2.6.0-cdh5.13.1, the same errors come out.
Attila Zsolt Piros wrote on Wed, Mar 10, 2021 at 8:56 PM:
> I see, this must be because of the Hadoop version you are selecting by
> using "-Dhadoop.version=2.6.0-cdh5.13.1".
> Spark 3.1.1 only supports hadoop-2.7 and hadoop-3.2; at least these two
> can be given via profiles: -Phadoop-2.7 and -Phadoop-3.2 (the default).
I see, this must be because of the Hadoop version you are selecting by
using "-Dhadoop.version=2.6.0-cdh5.13.1".
Spark 3.1.1 only supports hadoop-2.7 and hadoop-3.2; at least these two
can be given via profiles: -Phadoop-2.7 and -Phadoop-3.2 (the default).
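For example, the distribution build quoted below in this thread could switch
from the pinned CDH version to a supported profile (a sketch; every flag
other than -Phadoop-2.7 is taken unchanged from that quoted command):

    # Select a supported Hadoop line via its profile instead of pinning
    # an unsupported version with -Dhadoop.version:
    ./dev/make-distribution.sh --name custom-spark --pip --tgz \
        -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 -DskipTests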
On Wed, Mar 10, 2021 at 12:26 PM jiahong li wrote:
I use ./build/mvn to compile, and after executing the command
./build/zinc-0.3.15/bin/zinc -shutdown
I execute a command like this:
./dev/make-distribution.sh --name custom-spark --pip --tgz -Phive
-Phive-thriftserver -Pyarn -Dhadoop.version=2.6.0-cdh5.13.1 -DskipTests
and the same error appears.
Hi!
Are you compiling Spark itself?
Do you use "./build/mvn" from the project root?
If you compiled another version of Spark before, and the Scala version
there was different, then zinc/nailgun could have cached the old classes,
which can cause similar troubles.
In that case this could help:
./build/zinc-0.3.15/bin/zinc -shutdown
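Putting the two suggestions together, a clean rebuild after switching Spark
(and hence possibly Scala) versions might look like this (a sketch; the zinc
path is the one quoted earlier in this thread, and -Phadoop-2.7 is one of
the supported profiles mentioned above):

    # Stop the zinc/nailgun server so classes cached from a previous
    # Scala version are not reused:
    ./build/zinc-0.3.15/bin/zinc -shutdown

    # Rebuild from the project root with a supported Hadoop profile:
    ./build/mvn -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 \
        -DskipTests clean package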