Hmpf. Just looked into this. In the Hadoop 2.x Scala 2.11 jar, Curator is not shaded, so the JobManager fails to load the relocated Curator classes at startup. Once this is fixed, we will have to create a new RC.
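For anyone who wants to double-check a candidate dist jar, a quick sanity check is to see whether the relocated Curator entry point from Fabian's stack trace is actually on the classpath. Rough throwaway sketch (the object name and the jar path in the comments are made up for illustration, not part of Flink):

// Compile with scalac, then run with the candidate dist jar on the
// classpath, e.g. (path illustrative):
//   scala -cp <path-to-flink-dist_2.11 jar>:. ShadedCuratorCheck
object ShadedCuratorCheck {
  def main(args: Array[String]): Unit = {
    // Class name taken from the NoClassDefFoundError below.
    val shaded = "org.apache.flink.shaded.org.apache.curator.RetryPolicy"
    try {
      Class.forName(shaded)
      println(s"OK: $shaded found, shading looks fine")
    } catch {
      case _: ClassNotFoundException =>
        println(s"MISSING: $shaded, Curator was not relocated into this jar")
    }
  }
}

Listing the jar contents with "jar tf" shows the same thing; this is just the programmatic version of what start-local.sh trips over.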
On Fri, Oct 30, 2015 at 11:57 AM, Fabian Hueske <fhue...@gmail.com> wrote:
> I'm sorry, but I have to give a -1 for this RC.
>
> Starting a Scala 2.11 build (hadoop2 and hadoop24) with
> ./bin/start-local.sh fails with a ClassNotFoundException:
>
> java.lang.NoClassDefFoundError:
> org/apache/flink/shaded/org/apache/curator/RetryPolicy
>     at org.apache.flink.runtime.jobmanager.JobManager$.parseArgs(JobManager.scala:1721)
>     at org.apache.flink.runtime.jobmanager.JobManager$.liftedTree2$1(JobManager.scala:1384)
>     at org.apache.flink.runtime.jobmanager.JobManager$.main(JobManager.scala:1383)
>     at org.apache.flink.runtime.jobmanager.JobManager.main(JobManager.scala)
> Caused by: java.lang.ClassNotFoundException:
> org.apache.flink.shaded.org.apache.curator.RetryPolicy
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>
> This happens on OSX and Windows 10 with Cygwin.
>
> 2015-10-30 10:47 GMT+01:00 Maximilian Michels <m...@apache.org>:
>
> > For testing, please refer to this document:
> >
> > https://docs.google.com/document/d/1OtiAwILpnIwCqPF1Sk_8EcXuJOVc4uYtlP4i8m2c9rg/edit
> >
> >
> > On Fri, Oct 30, 2015 at 9:05 AM, Maximilian Michels <m...@apache.org> wrote:
> >
> > > Please vote on releasing the following candidate as Apache Flink version 0.10.0:
> > >
> > > The commit to be voted on:
> > > 2cd5a3c05ceec7bb9c5969c502c2d51b1ec00d0c
> > >
> > > Branch:
> > > release-0.10.0-rc3 (see https://git1-us-west.apache.org/repos/asf/flink/?p=flink.git)
> > >
> > > The release artifacts to be voted on can be found at:
> > > http://people.apache.org/~mxm/flink-0.10.0-rc3/
> > >
> > > The release artifacts are signed with the key with fingerprint C2909CBF:
> > > http://www.apache.org/dist/flink/KEYS
> > >
> > > The staging repository for this release can be found at:
> > > https://repository.apache.org/content/repositories/orgapacheflink-1050
> > >
> > > -------------------------------------------------------------
> > >
> > > The vote is open for the next 72 hours and passes if a majority of at
> > > least three +1 PMC votes are cast.
> > >
> > > The vote ends on Monday November 2, 2015.
> > >
> > > [ ] +1 Release this package as Apache Flink 0.10.0
> > > [ ] -1 Do not release this package because ...
> > >
> > > ===================================
> > >
> > > The following commits have been added on top of release-0.10.0-rc2:
> > >
> > > e1f30b0 [FLINK-2559] Clean up JavaDocs
> > > 44b03f2 [FLINK-2800] [kryo] Fix Kryo serialization to clear buffered data
> > > cdc0dfd [FLINK-2932] Examples in docs now download shell script using https instead of http
> > > fcc1eed [FLINK-2902][web-dashboard] Sort finished jobs by their end time, running jobs by start time
> > > 6a13b9f [FLINK-2934] Remove placeholder pages for job.statistics, taskmanager.log and taskmanager.stdout
> > > 51ac46e [FLINK-1610][docs] fix javadoc building for aggregate-scaladoc profile
> > > 54375b9 [scala-shell][docs] add scala sources in earlier phase