+1 (non-binding)

I tested it on Ubuntu 16.04 with OpenJDK 8 on ppc64le. All of the tests for 
core, sql/core, sql/catalyst, mllib, and mllib-local passed.

$ java -version
openjdk version "1.8.0_131"
OpenJDK Runtime Environment (build 1.8.0_131-8u131-b11-2ubuntu1.16.04.3-b11)
OpenJDK 64-Bit Server VM (build 25.131-b11, mixed mode)

% build/mvn -DskipTests -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 -T 24 clean package install
% build/mvn -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 test -pl core -pl 'sql/core' -pl 'sql/catalyst' -pl mllib -pl mllib-local
...
Run completed in 13 minutes, 54 seconds.
Total number of tests run: 1118
Suites: completed 170, aborted 0
Tests: succeeded 1118, failed 0, canceled 0, ignored 6, pending 0
All tests passed.
[INFO] 
------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Core ................................. SUCCESS [17:13 min]
[INFO] Spark Project ML Local Library ..................... SUCCESS [  6.065 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [11:51 min]
[INFO] Spark Project SQL .................................. SUCCESS [17:55 min]
[INFO] Spark Project ML Library ........................... SUCCESS [17:05 min]
[INFO] 
------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] 
------------------------------------------------------------------------
[INFO] Total time: 01:04 h
[INFO] Finished at: 2017-11-30T01:48:15+09:00
[INFO] Final Memory: 128M/329M
[INFO] 
------------------------------------------------------------------------
[WARNING] The requested profile "hive" could not be activated because it does not exist.

Kazuaki Ishizaki



From:   Dongjoon Hyun <dongjoon.h...@gmail.com>
To:     Hyukjin Kwon <gurwls...@gmail.com>
Cc:     Spark dev list <dev@spark.apache.org>, Felix Cheung 
<felixche...@apache.org>, Sean Owen <so...@cloudera.com>
Date:   2017/11/29 12:56
Subject:        Re: [VOTE] Spark 2.2.1 (RC2)



+1 (non-binding)

RC2 is tested on CentOS, too.

Bests,
Dongjoon.

On Tue, Nov 28, 2017 at 4:35 PM, Hyukjin Kwon <gurwls...@gmail.com> wrote:
+1

2017-11-29 8:18 GMT+09:00 Henry Robinson <he...@apache.org>:
(My vote is non-binding, of course). 

On 28 November 2017 at 14:53, Henry Robinson <he...@apache.org> wrote:
+1, tests all pass for me on Ubuntu 16.04. 

On 28 November 2017 at 10:36, Herman van Hövell tot Westerflier <
hvanhov...@databricks.com> wrote:
+1

On Tue, Nov 28, 2017 at 7:35 PM, Felix Cheung <felixche...@apache.org> 
wrote:
+1

Thanks Sean. Please vote!

Tested various scenarios with the R package on Ubuntu, Debian, and Windows, 
against both r-devel and r-release, and on R-hub. Verified the CRAN checks 
are clean (only 1 NOTE!) and that there are no leaked files (.cache removed, 
/tmp clean).


On Sun, Nov 26, 2017 at 11:55 AM Sean Owen <so...@cloudera.com> wrote:
Yes, it downloads recent releases. The test worked for me on a second try, 
so I suspect a bad mirror. If this comes up frequently we can just add 
retry logic, as the closer.lua script returns a different mirror each time.
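
A minimal sketch of what that retry could look like, assuming the suite 
keeps its current wget-based download; the method name and paths here are 
illustrative, not the actual suite code:

import scala.sys.process._
import scala.util.Try

// Retry the whole lookup-then-download step: closer.lua hands back a
// different mirror on each call, so a plain retry naturally routes
// around a single bad mirror.
def downloadWithRetry(version: String, attempts: Int = 3): Boolean =
  (1 to attempts).exists { _ =>
    Try {
      val mirror = Seq("wget",
        "https://www.apache.org/dyn/closer.lua?preferred=true",
        "-q", "-O", "-").!!.trim
      val url = s"$mirror/spark/spark-$version/spark-$version-bin-hadoop2.7.tgz"
      Seq("wget", url, "-q", "-O", s"/tmp/spark-$version.tgz").! == 0
    }.getOrElse(false)
  }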

The tests all pass for me on the latest Debian, so +1 for this release.

(I committed the change to set -Xss4m for tests consistently, but this 
shouldn't block a release.)
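
For anyone curious why -Xss4m helps: each JVM thread gets a fixed-size 
stack, and deep recursion overflows the default. A toy illustration, not 
the committed patch; the same headroom can be granted to a single thread 
via the explicit stack-size Thread constructor:

object StackSizeDemo {
  // Deliberately non-tail-recursive, so every call consumes a stack frame.
  def depth(n: Long): Long = if (n == 0) 0 else 1 + depth(n - 1)

  def main(args: Array[String]): Unit = {
    // The last constructor argument is the requested stack size in bytes,
    // the per-thread analogue of passing -Xss4m to the whole JVM. The
    // javadoc allows a JVM to ignore this hint.
    val t = new Thread(null, new Runnable {
      def run(): Unit = println(depth(50000))
    }, "big-stack", 4L * 1024 * 1024)
    t.start()
    t.join()
  }
}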


On Sat, Nov 25, 2017 at 12:47 PM Felix Cheung <felixche...@apache.org> 
wrote:
Ah, sorry. Digging through the history, it looks like this was changed 
relatively recently and should only download previous releases.

Perhaps we are intermittently hitting a mirror that doesn’t have the 
files? 


https://github.com/apache/spark/commit/daa838b8886496e64700b55d1301d348f1d5c9ae


On Sat, Nov 25, 2017 at 10:36 AM Felix Cheung <felixche...@apache.org> 
wrote:
Thanks Sean.

For the second one, it looks like HiveExternalCatalogVersionsSuite is 
trying to download the release tgz from the official Apache mirror, which 
won't work unless the release is actually released?



val preferredMirror =
  Seq("wget", "https://www.apache.org/dyn/closer.lua?preferred=true",
    "-q", "-O", "-").!!.trim
val url =
  s"$preferredMirror/spark/spark-$version/spark-$version-bin-hadoop2.7.tgz"



It's probably getting an error page instead.
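
One quick way to confirm: an HTML error page won't start with the gzip 
magic bytes 0x1f 0x8b, which is exactly what the "gzip: stdin: not in gzip 
format" error below is complaining about. A hypothetical helper, not part 
of the suite:

import java.io.{File, FileInputStream}

// Returns true only if the file begins with the two-byte gzip magic
// header, so an HTML error page saved as a .tgz is caught early.
def looksLikeGzip(f: File): Boolean = {
  val in = new FileInputStream(f)
  try {
    val header = new Array[Byte](2)
    in.read(header) == 2 &&
      header(0) == 0x1f.toByte && header(1) == 0x8b.toByte
  } finally in.close()
}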


On Sat, Nov 25, 2017 at 10:28 AM Sean Owen <so...@cloudera.com> wrote:
I hit the same StackOverflowError as in the previous RC test, but I'm 
pretty sure this is just because the increased thread stack size JVM flag 
isn't applied consistently. This seems to resolve it:

https://github.com/apache/spark/pull/19820

This wouldn't block the release, IMHO.


I am currently investigating this failure, though; it seems like the 
mechanism that downloads Spark tarballs needs fixing, or updating, in the 
2.2 branch?

HiveExternalCatalogVersionsSuite:
gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error is not recoverable: exiting now
*** RUN ABORTED ***
  java.io.IOException: Cannot run program "./bin/spark-submit" (in directory "/tmp/test-spark/spark-2.0.2"): error=2, No such file or directory

On Sat, Nov 25, 2017 at 12:34 AM Felix Cheung <felixche...@apache.org> 
wrote:
Please vote on releasing the following candidate as Apache Spark version 
2.2.1. The vote is open until Friday December 1, 2017 at 8:00:00 am UTC 
and passes if a majority of at least 3 PMC +1 votes are cast.


[ ] +1 Release this package as Apache Spark 2.2.1

[ ] -1 Do not release this package because ...


To learn more about Apache Spark, please see https://spark.apache.org/


The tag to be voted on is v2.2.1-rc2 (e30e2698a2193f0bbdcd4edb884710819ab6397c):
https://github.com/apache/spark/tree/v2.2.1-rc2

The list of JIRA tickets resolved in this release can be found here:
https://issues.apache.org/jira/projects/SPARK/versions/12340470


The release files, including signatures, digests, etc. can be found at:
https://dist.apache.org/repos/dist/dev/spark/spark-2.2.1-rc2-bin/

Release artifacts are signed with the following key:
https://dist.apache.org/repos/dist/dev/spark/KEYS

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1257/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/spark-2.2.1-rc2-docs/_site/index.html


FAQ

How can I help test this release?

If you are a Spark user, you can help us test this release by taking an 
existing Spark workload, running it on this release candidate, and 
reporting any regressions.

If you're working in PySpark, you can set up a virtual env, install the 
current RC, and see if anything important breaks. In Java/Scala, you can 
add the staging repository to your project's resolvers and test with the 
RC (make sure to clean up the artifact cache before/after so you don't end 
up building with an out-of-date RC going forward); see the sketch just below.
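
For the Java/Scala route, a minimal build.sbt sketch, assuming sbt as the 
build tool (the staging URL is the one listed above; spark-core is just an 
example artifact):

// Add the RC2 staging repository so the candidate artifacts resolve.
resolvers += "Apache Spark 2.2.1 RC2 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1257/"

// Build the project against the release candidate.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.1"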

What should happen to JIRA tickets still targeting 2.2.1?

Committers should look at those and triage. Extremely important bug fixes, 
documentation, and API tweaks that impact compatibility should be worked 
on immediately. Everything else, please retarget to 2.2.2.

But my bug isn't fixed!??!

In order to make timely releases, we will typically not hold the release 
unless the bug in question is a regression from 2.2.0. That being said, if 
there is something which is a regression from 2.2.0 that has not been 
correctly targeted, please ping a committer to help target the issue (you 
can see the open issues listed as impacting Spark 2.2.1 / 2.2.2 here).

What are the unresolved issues targeted for 2.2.1?

At the time of writing, there is one intermittent failure, SPARK-20201, 
which we have been tracking since 2.2.0.
