we're going to go ahead and do the anaconda upgrade on monday.  i'll
send out another email later this week with the details.

On Wed, Apr 27, 2016 at 8:50 AM, shane knapp <skn...@berkeley.edu> wrote:
> this will be postponed due to the 2.0 code freeze.  sorry for the late notice.
>
> On Mon, Apr 25, 2016 at 4:50 PM, shane knapp <skn...@berkeley.edu> wrote:
>> another project hosted on our jenkins (e-mission) needs anaconda scipy
>> upgraded from 0.15.1 to 0.17.0.  this will also upgrade a few other
>> libs, which i've included at the end of this email.
>>
>> i've spoken with josh @ databricks and we don't believe that this will
>> impact the spark builds at all.  if this causes serious breakage, i
>> will roll everything back to the pre-update versions.
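>>
>> (if it comes to that, the rollback would most likely just be pinning
>> the old versions back in, something along the lines of:
>>
>>     conda install scipy=0.15.1 numpy=1.9.2
>>
>> though the exact set of packages to pin would depend on what breaks.)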
>>
>> i have created a JIRA issue to look into creating conda environments
>> for spark builds, something that we should have done long ago:
>>
>> https://issues.apache.org/jira/browse/SPARK-14905
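>>
>> as a very rough sketch (the env name and pinned versions below are
>> just placeholders, not necessarily what we'd end up using), each
>> pyspark build could do something like:
>>
>>     # create an isolated env instead of sharing the global anaconda install
>>     conda create -y -n spark-build-py27 python=2.7 numpy scipy
>>     source activate spark-build-py27
>>     # ... run the build/tests ...
>>     source deactivate
>>     # throw the env away afterwards
>>     conda remove -y -n spark-build-py27 --all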
>>
>> builds will be paused:  ~7am PDT
>> anaconda package updates:  ~8am
>> jenkins quiet time ends:  ~9am at the latest
>>
>> i do not expect the downtime to last very long, and will post updates
>> to this thread as things progress.
>>
>> here's what will be updated under anaconda:
>>
>> The following NEW packages will be INSTALLED:
>>
>>     libgfortran: 3.0-0
>>     mkl:         11.3.1-0
>>     wheel:       0.29.0-py27_0
>>
>> The following packages will be UPDATED:
>>
>>     conda:       3.10.1-py27_0     --> 4.0.5-py27_0
>>     conda-env:   2.1.4-py27_0      --> 2.4.5-py27_0
>>     numpy:       1.9.2-py27_0      --> 1.11.0-py27_0
>>     openssl:     1.0.1k-1          --> 1.0.2g-0
>>     pip:         6.1.1-py27_0      --> 8.1.1-py27_1
>>     python:      2.7.9-2           --> 2.7.11-0
>>     pyyaml:      3.11-py27_0       --> 3.11-py27_1
>>     requests:    2.6.0-py27_0      --> 2.9.1-py27_0
>>     scipy:       0.15.1-np19py27_0 --> 0.17.0-np111py27_2
>>     setuptools:  15.0-py27_0       --> 20.7.0-py27_0
>>     sqlite:      3.8.4.1-1         --> 3.9.2-0
>>     yaml:        0.1.4-0           --> 0.1.6-0
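>>
>> (for reference, the plan above is roughly what conda proposes for
>> something like the following -- the exact invocation may differ:
>>
>>     conda install scipy=0.17.0
>>
>> which pulls in the newer numpy, plus mkl/libgfortran, as dependencies.)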
