+1 (binding)
Verified checksums, ran tests, staged convenience binaries.
I also ran a few tests using Spark 3.0.0 and Spark 2.4.5 with the runtime
jars. For anyone who would like to use spark-sql or spark-shell, here are the
commands that I used:
~/Apps/spark-3.0.0-bin-hadoop2.7/bin/spark-sql \
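A rough sketch of a complete invocation of this shape is below; the jar paths
and the Spark 2.4.5 install location are placeholders for illustration, not
the exact values used above.

# Sketch only: pass the staged runtime jar to the session with --jars
# (the same flag works for spark-shell). Jar paths are placeholders.
~/Apps/spark-3.0.0-bin-hadoop2.7/bin/spark-sql \
  --jars /path/to/staged-spark3-runtime.jar

# spark-shell with the Spark 2.4.5 convenience binaries:
~/Apps/spark-2.4.5-bin-hadoop2.7/bin/spark-shell \
  --jars /path/to/staged-spark2-runtime.jar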
1. Verify the signature: OK
2. Verify the checksum: OK
3. Untar the archive tarball: OK
4. Run RAT checks to validate license headers: RAT checks passed
5. Build and test the project: all unit tests passed.
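For anyone repeating this verification, the numbered steps above map roughly
to the commands below; the tarball name, signature and checksum file names,
and build tasks are placeholders and assumptions, not the exact ones used here.

# 1. Verify the signature (after importing the release KEYS file)
gpg --import KEYS
gpg --verify apache-<project>-<version>.tar.gz.asc apache-<project>-<version>.tar.gz

# 2. Verify the checksum (works when the .sha512 file is in "hash  filename" form)
sha512sum -c apache-<project>-<version>.tar.gz.sha512

# 3. Untar the archive tarball
tar xzf apache-<project>-<version>.tar.gz
cd apache-<project>-<version>

# 4. Run RAT checks to validate license headers
#    (the exact task depends on the build; a Gradle build with the RAT plugin
#    might expose a `rat` task, a Maven build uses apache-rat:check)
./gradlew rat

# 5. Build and run the unit tests
./gradlew build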
+1 (non-binding)
I did see that my build took >12 minutes and used 100% of all 8 cores.