We may be able to fix this from the Spark side by adding appropriate
exclusions in our Hadoop dependencies, right?  If possible, I think that we
should do this.
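
For reference, a hedged sketch of what such an exclusion might look like in Spark's pom.xml. This assumes hadoop-openstack is the artifact pulling in the compile-scoped mockito-all (per the POM snippet quoted below); the exact dependency declaration and version property are illustrative, not taken from Spark's actual build:

```xml
<!-- Hypothetical exclusion in Spark's pom.xml: keep hadoop-openstack but
     drop its compile-scoped mockito-all so it can't leak onto our classpath.
     ${hadoop.version} is an assumed property name. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-openstack</artifactId>
  <version>${hadoop.version}</version>
  <exclusions>
    <exclusion>
      <groupId>org.mockito</groupId>
      <artifactId>mockito-all</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

This would shield Spark builds regardless of when the Hadoop-side scope fix lands.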

On Wed, Jul 15, 2015 at 7:10 AM, Ted Yu <yuzhih...@gmail.com> wrote:

> I attached a patch for HADOOP-12235
>
> BTW openstack was not mentioned in the first email from Gil.
> My email and Gil's second email were sent around the same moment.
>
> Cheers
>
> On Wed, Jul 15, 2015 at 2:06 AM, Steve Loughran <ste...@hortonworks.com>
> wrote:
>
>>
>>  On 14 Jul 2015, at 12:22, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>  Looking at Jenkins, master branch compiles.
>>
>>  Can you try the following command ?
>>
>> mvn -Phive -Phadoop-2.6 -DskipTests clean package
>>
>>  What version of Java are you using ?
>>
>>
>>  Ted, Giles has stuck in hadoop-openstack; it's that which is creating
>> the problem.
>>
>>  Giles, I don't know why hadoop-openstack has a mockito dependency, as
>>  it should be test-time only.
>>
>>  Looking at the POM, its mockito dependency in hadoop-2.7 is scoped to
>> compile:
>>
>>     <dependency>
>>       <groupId>org.mockito</groupId>
>>       <artifactId>mockito-all</artifactId>
>>       <scope>compile</scope>
>>     </dependency>
>>
>>  it should be "provided", shouldn't it?
>>
>>  Created https://issues.apache.org/jira/browse/HADOOP-12235 : if someone
>> supplies a patch I'll get it in.
>>
>>  -steve
>>
>
>
