You can always try. But Hadoop 3 is not yet supported by Spark.
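
For reference, the "Hadoop-free build" approach discussed below works by handing Spark the classpath of an existing Hadoop installation at runtime via SPARK_DIST_CLASSPATH. A minimal sketch of the conf/spark-env.sh wiring, assuming the `hadoop` launcher script is on the PATH (the /opt/hadoop path in the comment is hypothetical):

```shell
# conf/spark-env.sh -- wire a "Hadoop free" Spark build to a local Hadoop install.

# Ask the Hadoop installation for its full runtime classpath and hand it to Spark.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)

# Or, with an explicit Hadoop home instead of relying on PATH (path is illustrative):
# export SPARK_DIST_CLASSPATH=$(/opt/hadoop/bin/hadoop classpath)
```

Note this only supplies the Hadoop jars at launch time; it doesn't change which Hadoop version the Spark build itself was compiled or tested against.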

On Fri, Apr 5, 2019 at 11:13 AM Anton Kirillov
<akirillov.mail...@gmail.com> wrote:
>
> Marcelo, Sean, thanks for the clarification. So in order to support Hadoop 3+,
> the preferred way would be to use a Hadoop-free build and provide the Hadoop
> dependencies on the classpath, is that correct?
>
> On Fri, Apr 5, 2019 at 10:57 AM Marcelo Vanzin <van...@cloudera.com> wrote:
>>
>> The hadoop-3 profile doesn't really work yet, not even on master.
>> That's still being worked on.
>>
>> On Fri, Apr 5, 2019 at 10:53 AM akirillov <akirillov.mail...@gmail.com> 
>> wrote:
>> >
>> > Hi there! I'm trying to run Spark unit tests with the following profiles:
>> >
>> > And 'core' module fails with the following test failing with
>> > NoClassDefFoundError:
>> >
>> > In the meantime building a distribution works fine when running:
>> >
>> > Also, there are no problems running the tests with the Hadoop 2.7 profile.
>> > Does this issue look familiar? Any help is appreciated!
>> >
>> >
>> >
>> > --
>> > Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>> >
>> > ---------------------------------------------------------------------
>> > To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>> >
>>
>>
>> --
>> Marcelo



-- 
Marcelo
