Maybe it is caused by my environment.

On Thu, Mar 11, 2021 at 11:14 AM jiahong li <monkeyboy....@gmail.com> wrote:

> That is not the cause; when I set -Phadoop-2.7 instead of
> -Dhadoop.version=2.6.0-cdh5.13.1, the same errors come out.
>
> On Wed, Mar 10, 2021 at 8:56 PM Attila Zsolt Piros <piros.attila.zs...@gmail.com> wrote:
>
>> I see, this must be because of the Hadoop version you are selecting with
>> "-Dhadoop.version=2.6.0-cdh5.13.1".
>> Spark 3.1.1 only supports hadoop-2.7 and hadoop-3.2; at least these two
>> can be selected via profiles: -Phadoop-2.7 and -Phadoop-3.2 (the default).
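>>
>> For example, something like this should work (just a sketch adapted from
>> your make-distribution.sh invocation below, with the hadoop-2.7 profile in
>> place of -Dhadoop.version; I have not run it myself):
>>
>> ./dev/make-distribution.sh --name custom-spark --pip --tgz -Phive
>> -Phive-thriftserver -Pyarn -Phadoop-2.7 -DskipTests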
>>
>>
>> On Wed, Mar 10, 2021 at 12:26 PM jiahong li <monkeyboy....@gmail.com>
>> wrote:
>>
>>> I use ./build/mvn to compile. After executing the command
>>> ./build/zinc-0.3.15/bin/zinc -shutdown
>>> I run a command like this: ./dev/make-distribution.sh --name
>>> custom-spark --pip --tgz -Phive -Phive-thriftserver -Pyarn
>>> -Dhadoop.version=2.6.0-cdh5.13.1 -DskipTests
>>> and the same errors appear.
>>> I also executed: ps -ef | grep zinc, and there is nothing containing zinc.
>>>
>>> On Wed, Mar 10, 2021 at 6:55 PM Attila Zsolt Piros
>>> <piros.attila.zs...@gmail.com> wrote:
>>>
>>>> Hi!
>>>>
>>>> Are you compiling Spark itself?
>>>> Do you use "./build/mvn" from the project root?
>>>> If you compiled another version of Spark before, and the Scala version
>>>> there was different, then zinc/nailgun could have cached the old classes,
>>>> which can cause similar troubles.
>>>> In that case this could help:
>>>>
>>>> ./build/zinc-0.3.15/bin/zinc -shutdown
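>>>>
>>>> (If that alone does not help, a clean rebuild afterwards might, for
>>>> example ./build/mvn clean before compiling again; this is just a
>>>> suggestion, I have not verified it is needed in your case.)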
>>>>
>>>> Best Regards,
>>>> Attila
>>>>
>>>> On Wed, Mar 10, 2021 at 11:27 AM jiahong li <monkeyboy....@gmail.com>
>>>> wrote:
>>>>
>>>>> Hi everybody, when I compile Spark 3.1.1 from tag v3.1.1, I encounter
>>>>> errors like this:
>>>>>
>>>>> [INFO] --- scala-maven-plugin:4.3.0:compile (scala-compile-first) @
>>>>> spark-core_2.12 ---
>>>>> [INFO] Using incremental compilation using Mixed compile order
>>>>> [INFO] Compiler bridge file:
>>>>> .sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.10__52.0-1.3.1_20191012T045515.jar
>>>>> [INFO] compiler plugin:
>>>>> BasicArtifact(com.github.ghik,silencer-plugin_2.12.10,1.6.0,null)
>>>>> [INFO] Compiling 560 Scala sources and 99 Java sources to
>>>>> git/spark/core/target/scala-2.12/classes ...
>>>>> [ERROR] [Error]
>>>>> git/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107:
>>>>> type mismatch;
>>>>>  found   : K where type K
>>>>>  required: String
>>>>> [ERROR] [Error]
>>>>> git/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107:
>>>>> value map is not a member of V
>>>>> [ERROR] [Error]
>>>>> git/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107:
>>>>> missing argument list for method stripXSS in class XssSafeRequest
>>>>> Unapplied methods are only converted to functions when a function type
>>>>> is expected.
>>>>> You can make this conversion explicit by writing `stripXSS _` or
>>>>> `stripXSS(_)` instead of `stripXSS`.
>>>>> [ERROR] [Error]
>>>>> git/spark/core/src/main/scala/org/apache/spark/ui/PagedTable.scala:307:
>>>>> value startsWith is not a member of K
>>>>> [ERROR] [Error]
>>>>> git/spark/core/src/main/scala/org/apache/spark/util/Utils.scala:580: value
>>>>> toLowerCase is not a member of object org.apache.hadoop.util.StringUtils
>>>>> [ERROR] 5 errors found
>>>>>
>>>>> Has anybody encountered errors like this?
>>>>>
>>>>>
>>>>
