Sorry premature send:

The PR builder currently builds against Hadoop 2.3
https://github.com/apache/spark/blob/master/dev/run-tests#L54

We can set this to whatever we want. 2.2 might make sense since it's the
default in our published artifacts.

- Patrick
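[Editor's sketch] For concreteness, switching the PR builder's pinned profile to 2.2 might look roughly like this. The profile and version strings mirror the flags in the Jenkins log quoted further down in this thread; the variable names are illustrative, not the actual contents of dev/run-tests:

```shell
# Hypothetical sketch: swap the Hadoop profile the PR builder passes to the
# build. Profile/version strings mirror the Jenkins build log in this thread;
# treat the exact names as assumptions to verify against dev/run-tests.
HADOOP_PROFILE="hadoop-2.2"
HADOOP_VERSION="2.2.0"
BUILD_ARGS="-Pyarn -P${HADOOP_PROFILE} -Dhadoop.version=${HADOOP_VERSION} -Pkinesis-asl -Phive -Phive-thriftserver"
echo "${BUILD_ARGS}"
```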

On Fri, May 15, 2015 at 11:53 AM, Patrick Wendell <pwend...@gmail.com>
wrote:

> The PR builder currently builds against Hadoop 2.3.
>
> - Patrick
>
> On Fri, May 15, 2015 at 11:40 AM, Marcelo Vanzin <van...@cloudera.com>
> wrote:
>
>> Funny thing, since I asked this question in a PR a few minutes ago...
>>
>> Ignoring the rotation suggestion for a second, can the PR builder at
>> least cover hadoop 2.2? That's the actual version used to create the
>> official Spark artifacts for maven, and the oldest version Spark supports
>> for YARN.
>>
>> Kinda the same argument as the "why do we build with java 7 when we
>> support java 6" discussion we had recently.
>>
>>
>> On Fri, May 15, 2015 at 11:34 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> bq. would be prohibitive to build all configurations for every push
>>>
>>> Agreed.
>>>
>>> Can the PR builder rotate testing against hadoop 2.3, 2.4, 2.6 and 2.7
>>> (each test run still using one hadoop profile)?
>>>
>>> This way we would have some coverage for each of the major hadoop
>>> releases.
>>>
>>> Cheers
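[Editor's sketch] Ted's rotation idea could be wired up roughly like this. Keying the profile off the Jenkins build number is an assumption about how a rotation might be implemented, not an existing feature of the PR builder:

```shell
# Hypothetical rotation: each PR build picks one Hadoop profile, cycling
# through the list so consecutive runs cover each major release.
PROFILES="hadoop-2.3 hadoop-2.4 hadoop-2.6 hadoop-2.7"
BUILD_NUMBER="${BUILD_NUMBER:-0}"   # Jenkins exports this for real jobs
set -- $PROFILES                    # split the list into positional params
shift $(( BUILD_NUMBER % $# ))      # rotate by build number
SELECTED="$1"
echo "Testing against profile: ${SELECTED}"
```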
>>>
>>> On Fri, May 15, 2015 at 10:30 AM, Sean Owen <so...@cloudera.com> wrote:
>>>
>>>> You all are looking only at the pull request builder. It just does one
>>>> build to sanity-check a pull request, since that already takes 2 hours
>>>> and it would be prohibitive to build all configurations for every push.
>>>> There is a different set of Jenkins jobs that periodically tests master
>>>> against a lot more configurations, including Hadoop 2.4.
>>>>
>>>> On Fri, May 15, 2015 at 6:02 PM, Frederick R Reiss <frre...@us.ibm.com>
>>>> wrote:
>>>>
>>>>> The PR builder seems to be building against Hadoop 2.3. In the log for
>>>>> the most recent successful build (
>>>>> https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/32805/consoleFull
>>>>> ) I see:
>>>>>
>>>>>
>>>>> =========================================================================
>>>>> Building Spark
>>>>>
>>>>> =========================================================================
>>>>> [info] Compile with Hive 0.13.1
>>>>> [info] Building Spark with these arguments: -Pyarn -Phadoop-2.3
>>>>> -Dhadoop.version=2.3.0 -Pkinesis-asl -Phive -Phive-thriftserver
>>>>> ...
>>>>>
>>>>> =========================================================================
>>>>> Running Spark unit tests
>>>>>
>>>>> =========================================================================
>>>>> [info] Running Spark tests with these arguments: -Pyarn -Phadoop-2.3
>>>>> -Dhadoop.version=2.3.0 -Pkinesis-asl test
>>>>>
>>>>> Is anyone testing individual pull requests against Hadoop 2.4 or 2.6
>>>>> before the code is declared "clean"?
>>>>>
>>>>> Fred
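[Editor's sketch] Short of changing the PR builder, Fred's question can be answered locally: a reviewer can rebuild a PR against another profile before sign-off. The flags below mirror the Jenkins log above with the profile swapped; the exact version string is an assumption to check against the Spark build docs:

```shell
# Sketch: reproduce the PR build locally against a different Hadoop profile.
# Flags mirror the Jenkins log in this thread; version string is an assumption.
PROFILE="hadoop-2.4"
VERSION="2.4.0"
CMD="build/mvn -Pyarn -P${PROFILE} -Dhadoop.version=${VERSION} -Pkinesis-asl -Phive -Phive-thriftserver -DskipTests package"
echo "${CMD}"
```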
>>>>>
>>>>>
>>>>> From: Ted Yu <yuzhih...@gmail.com>
>>>>> To: Andrew Or <and...@databricks.com>
>>>>> Cc: "dev@spark.apache.org" <dev@spark.apache.org>
>>>>> Date: 05/15/2015 09:29 AM
>>>>> Subject: Re: Recent Spark test failures
>>>>> ------------------------------
>>>>>
>>>>>
>>>>>
>>>>> Jenkins build against hadoop 2.4 has been unstable recently:
>>>>>
>>>>> https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-Master-Maven-with-YARN/HADOOP_PROFILE=hadoop-2.4,label=centos/
>>>>>
>>>>> I haven't found the test which hung / failed in recent Jenkins builds.
>>>>>
>>>>> But PR builder has several green builds lately:
>>>>> https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/
>>>>>
>>>>> Maybe the PR builder doesn't build against hadoop 2.4?
>>>>>
>>>>> Cheers
>>>>>
>>>>> On Mon, May 11, 2015 at 1:11 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>>>>
>>>>>    Makes sense.
>>>>>
>>>>>    Having high determinism in these tests would make the Jenkins
>>>>>    build stable.
>>>>>
>>>>>
>>>>>    On Mon, May 11, 2015 at 1:08 PM, Andrew Or <and...@databricks.com> wrote:
>>>>>       Hi Ted,
>>>>>
>>>>>       Yes, those two options can be useful, but in general I think
>>>>>       the standard to set is that tests should never fail. It's
>>>>>       actually the worst if tests fail sometimes but not others,
>>>>>       because we can't reproduce them deterministically. Using -M
>>>>>       and -A actually tolerates flaky tests to a certain extent,
>>>>>       and I would prefer to instead increase the determinism in
>>>>>       these tests.
>>>>>
>>>>>       -Andrew
>>>>>
>>>>>       2015-05-08 17:56 GMT-07:00 Ted Yu <yuzhih...@gmail.com>:
>>>>>       Andrew:
>>>>>          Do you think the -M and -A options described here can be
>>>>>          used in test runs?
>>>>>          http://scalatest.org/user_guide/using_the_runner
>>>>>
>>>>>          Cheers
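[Editor's sketch] Per the linked ScalaTest user guide, the two runner options Ted asks about form a record-then-retry pair: -M appends failed and canceled tests to a file, and -A runs only the tests recorded there. A sketch of the two passes, with placeholder classpath and file names (not real Spark build targets):

```shell
# Sketch of a record-then-retry cycle with the ScalaTest Runner.
# Classpath and file names are placeholders for illustration only.
RUNNER="org.scalatest.tools.Runner"
RECORD="-R target/test-classes -M failed-tests.txt -o"   # pass 1: record failures
RETRY="-R target/test-classes -A failed-tests.txt -o"    # pass 2: rerun only those
echo "pass 1: scala -cp <classpath> ${RUNNER} ${RECORD}"
echo "pass 2: scala -cp <classpath> ${RUNNER} ${RETRY}"
```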
>>>>>
>>>>>          On Wed, May 6, 2015 at 5:41 PM, Andrew Or <and...@databricks.com> wrote:
>>>>>             Dear all,
>>>>>
>>>>>             I'm sure you have all noticed that the Spark tests have
>>>>>             been fairly unstable recently. I wanted to share a tool
>>>>>             that I use to track which tests have been failing most
>>>>>             often in order to prioritize fixing these flaky tests.
>>>>>
>>>>>             Here is an output of the tool. This spreadsheet reports
>>>>>             the top 10 failed tests this week (ending yesterday 5/5):
>>>>>
>>>>>             https://docs.google.com/spreadsheets/d/1Iv_UDaTFGTMad1sOQ_s4ddWr6KD3PuFIHmTSzL7LSb4
>>>>>
>>>>>             It is produced by a small project:
>>>>>             https://github.com/andrewor14/spark-test-failures
>>>>>
>>>>>             I have been filing JIRAs on flaky tests based on this
>>>>>             tool. Hopefully we can collectively stabilize the build
>>>>>             a little more as we near the release of Spark 1.4.
>>>>>
>>>>>             -Andrew
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>>
>> --
>> Marcelo
>>
>
>
