The v1.2.0 tag points to an older commit than v1.2.0-rc2. I wonder if it just didn't get updated; I'd expect v1.2.0 to be v1.2.0-rc2 plus a few commits related to the release process.
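One way to check this yourself is to compare the two tags with git; the commands below are a sketch (normally you'd run them in a clone of apache/spark against the real v1.2.0-rc2 and v1.2.0 tags — here they're demonstrated on a throwaway repo so the snippet is self-contained):

```shell
set -e

# Build a tiny scratch repo standing in for a clone of apache/spark.
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name=t -c user.email=t@example.com commit -q --allow-empty -m "rc cut"
git tag v1.2.0-rc2
git -c user.name=t -c user.email=t@example.com commit -q --allow-empty -m "release process tweaks"
git tag v1.2.0

# Which commit does each tag point at?
git rev-parse v1.2.0-rc2 v1.2.0

# Commits reachable from v1.2.0 but not from v1.2.0-rc2,
# i.e. exactly what was added on top of the RC for the final release:
git log --oneline v1.2.0-rc2..v1.2.0
```

If the `git log` range is non-empty, the tags genuinely differ; if the release tag were simply stale, `rev-parse` would show it pointing at an older commit with no release-only commits on top.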
On Fri, Dec 19, 2014 at 9:50 AM, Shixiong Zhu <zsxw...@gmail.com> wrote:
> Congrats!
>
> A little question about this release: which commit is this release based
> on? v1.2.0 and v1.2.0-rc2 point to different commits in
> https://github.com/apache/spark/releases
>
> Best Regards,
>
> Shixiong Zhu
>
> 2014-12-19 16:52 GMT+08:00 Patrick Wendell <pwend...@gmail.com>:
>>
>> I'm happy to announce the availability of Spark 1.2.0! Spark 1.2.0 is
>> the third release on the API-compatible 1.X line. It is Spark's
>> largest release ever, with contributions from 172 developers and more
>> than 1,000 commits!
>>
>> This release brings operational and performance improvements to Spark
>> core, including a new network transport subsystem designed for very
>> large shuffles. Spark SQL introduces an API for external data sources,
>> along with Hive 13 support, dynamic partitioning, and a
>> fixed-precision decimal type. MLlib adds a new pipeline-oriented
>> package (spark.ml) for composing multiple algorithms. Spark Streaming
>> adds a Python API and a write-ahead log for fault tolerance. Finally,
>> GraphX has graduated from alpha and introduces a stable API along with
>> performance improvements.
>>
>> Visit the release notes [1] to read about the new features, or
>> download [2] the release today.
>>
>> For errata in the contributions or release notes, please e-mail me
>> *directly* (not on-list).
>>
>> Thanks to everyone involved in creating, testing, and documenting this
>> release!
>>
>> [1] http://spark.apache.org/releases/spark-release-1-2-0.html
>> [2] http://spark.apache.org/downloads.html