This is the JIRA: https://issues.apache.org/jira/browse/FLINK-3565. It has been resolved by Max.
I'll merge the fix and create a new RC now.

On Wed, Mar 2, 2016 at 12:15 PM, Aljoscha Krettek <aljos...@apache.org> wrote:

> Hi,
> I saw this one when trying my job that was built against Scala 2.11:
>
> java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
>         at kafka.utils.Pool.<init>(Pool.scala:28)
>         at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<init>(FetchRequestAndResponseStats.scala:60)
>         at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<clinit>(FetchRequestAndResponseStats.scala)
>         at kafka.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:39)
>         at kafka.javaapi.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:34)
>         at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.getPartitionsForTopic(FlinkKafkaConsumer08.java:518)
>         at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:218)
>         at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:193)
>         at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:160)
>         at com.dataartisans.querywindow.WindowJob.main(WindowJob.java:93)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:505)
>         at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:403)
>         at org.apache.flink.client.program.Client.runBlocking(Client.java:248)
>         at org.apache.flink.client.CliFrontend.executeProgramBlocking(CliFrontend.java:866)
>         at org.apache.flink.client.CliFrontend.run(CliFrontend.java:333)
>         at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1189)
>         at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1239)
> Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>         ... 21 more
>
> The job uses the Kafka 0.8 consumer. All deps have the 2.11 suffix; the
> Flink build is "Hadoop 2.7 Scala 2.11".
>
> On 02 Mar 2016, at 11:36, Till Rohrmann <trohrm...@apache.org> wrote:
>
> > Do we continue using the Google shared doc from RC3 for the release
> > testing coordination?
> >
> > On Wed, Mar 2, 2016 at 11:31 AM, Aljoscha Krettek <aljos...@apache.org> wrote:
> >
> >> By the way, these are the commits that were added since RC3, so most of
> >> the testing that we already did should also be valid for this RC:
> >>
> >> $ git log origin/release-1.0.0-rc3..origin/release-1.0.0-rc4
> >> commit 1b0a8c4e9d5df35a7dea9cdd6d2c6e35489bfefa
> >> Author: Robert Metzger <rmetz...@apache.org>
> >> Date:   Tue Mar 1 16:40:31 2016 +0000
> >>
> >>     Commit for release 1.0.0
> >>
> >> commit 23dc2a4acf8e886384a66587ff393c2e62a69037
> >> Author: Stephan Ewen <se...@apache.org>
> >> Date:   Mon Feb 29 19:24:34 2016 +0100
> >>
> >>     [FLINK-2788] [apis] Add TypeHint class to allow type-safe generic
> >>     type parsing
> >>
> >>     This closes #1744
> >>
> >> commit 43e5975d5426e22eb4ef90e0f468bd7f6cd35736
> >> Author: Stephan Ewen <se...@apache.org>
> >> Date:   Tue Mar 1 14:31:26 2016 +0100
> >>
> >>     [FLINK-3554] [streaming] Emit a MAX Watermark after finite sources
> >>     finished
> >>
> >>     This closes #1750
> >>
> >> commit 8949ccf66b211b3c5cd8e66557afbff21fb093a6
> >> Author: Till Rohrmann <trohrm...@apache.org>
> >> Date:   Tue Mar 1 12:36:22 2016 +0100
> >>
> >>     [FLINK-3557] [stream, scala] Introduce secondary parameter list for
> >>     fold function
> >>
> >>     The fold API call takes an initial value as well as a fold function.
> >>     In Scala it is possible to provide an anonymous function. In order
> >>     to easily support multi-line anonymous functions, as well as to be
> >>     consistent with Scala's collection API, this PR adds another
> >>     parameter list to the fold API call, which contains the fold
> >>     function parameter.
> >>
> >>     Insert spaces between first parameter list and curly braces of
> >>     anonymous function
> >>
> >>     This closes #1748.
> >>
> >> commit 2d56081e29996f3f83e1a882151c06e44233d38f
> >> Author: Ufuk Celebi <u...@apache.org>
> >> Date:   Tue Mar 1 14:58:01 2016 +0100
> >>
> >>     [FLINK-3559] [dist] Don't print INFO if no active process
> >>
> >>     This closes #1751.
> >>
> >> commit 6262a0edde07f3ca968f88814e25927be7ed07c2
> >> Author: Ufuk Celebi <u...@apache.org>
> >> Date:   Tue Mar 1 12:23:31 2016 +0100
> >>
> >>     [FLINK-3556] [runtime] Remove false check in HA blob store
> >>     configuration
> >>
> >>     This closes #1749.
> >>
> >> commit 8e30f86657e4432a226d28810bac54cdcc906c04
> >> Author: vasia <va...@apache.org>
> >> Date:   Mon Feb 29 22:49:35 2016 +0100
> >>
> >>     [docs] fix readme typos; use the same scala style in the examples
> >>
> >>     This closes #1743
> >>
> >>> On 02 Mar 2016, at 10:26, Robert Metzger <rmetz...@apache.org> wrote:
> >>>
> >>> The release binaries are now located here:
> >>> http://home.apache.org/~rmetzger/flink-1.0.0-rc4/
> >>>
> >>> On Wed, Mar 2, 2016 at 10:16 AM, Robert Metzger <rmetz...@apache.org> wrote:
> >>>
> >>>> Yes, there was an email from Infra that they are going to shut down
> >>>> people.apache.org on March 1.
> >>>> I'll try to move the binaries to the new server ("home.apache.org").
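The [FLINK-3557] commit quoted above introduces a second parameter list for the Scala fold API. A minimal plain-Scala sketch of the same curried shape, using the collections' own `foldLeft` rather than Flink's API, showing why the extra parameter list lets a multi-line anonymous function sit naturally in curly braces:

```scala
// Plain-Scala illustration of the two-parameter-list style from FLINK-3557.
// foldLeft already uses this shape: (initial value)(fold function).
object FoldSketch {
  def main(args: Array[String]): Unit = {
    val counts = List("a", "b", "a").foldLeft(Map.empty[String, Int]) {
      // The second parameter list takes the fold function, so a
      // multi-line anonymous function reads like a code block.
      (acc, word) =>
        acc.updated(word, acc.getOrElse(word, 0) + 1)
    }
    println(counts) // Map(a -> 2, b -> 1)
  }
}
```

With a single parameter list, the same call would have to pass the function as a second argument inside the parentheses, which reads poorly once the function spans several lines; the curried form matches Scala's collection idioms, which is what the commit message cites as motivation.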
> >>>>
> >>>> On Wed, Mar 2, 2016 at 9:27 AM, Ufuk Celebi <u...@apache.org> wrote:
> >>>>
> >>>>> I get a 404 for the binaries. It's an INFRA thing I guess, because my
> >>>>> personal Apache user page is also down/gone. :-(
> >>>>>
> >>>>> On Tue, Mar 1, 2016 at 10:42 PM, Robert Metzger <rmetz...@apache.org> wrote:
> >>>>>
> >>>>>> Dear Flink community,
> >>>>>>
> >>>>>> Please vote on releasing the following candidate as Apache Flink
> >>>>>> version 1.0.0.
> >>>>>>
> >>>>>> This is the fourth RC.
> >>>>>> Here is a document to report on the testing and release verification:
> >>>>>> https://docs.google.com/document/d/1hoQ5k4WQteNj2OoPwpQPD4ZVHrCwM1pTlUVww8ld7oY/edit#heading=h.2v6zy51pgj33
> >>>>>>
> >>>>>> The commit to be voted on
> >>>>>> (http://git-wip-us.apache.org/repos/asf/flink/commit/1b0a8c4e):
> >>>>>> 1b0a8c4e9d5df35a7dea9cdd6d2c6e35489bfefa
> >>>>>>
> >>>>>> Branch:
> >>>>>> release-1.0.0-rc4 (see
> >>>>>> https://git1-us-west.apache.org/repos/asf/flink/repo?p=flink.git;a=shortlog;h=refs/heads/release-1.0.0-rc4)
> >>>>>>
> >>>>>> The release artifacts to be voted on can be found at:
> >>>>>> http://people.apache.org/~rmetzger/flink-1.0.0-rc4/
> >>>>>>
> >>>>>> The release artifacts are signed with the key with fingerprint D9839159:
> >>>>>> http://www.apache.org/dist/flink/KEYS
> >>>>>>
> >>>>>> The staging repository for this release can be found at:
> >>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1066
> >>>>>>
> >>>>>> -------------------------------------------------------------
> >>>>>>
> >>>>>> The vote is open until Friday and passes if a majority of at least
> >>>>>> three +1 PMC votes are cast.
> >>>>>>
> >>>>>> The vote ends on Friday, March 4, 23:00 CET.
> >>>>>>
> >>>>>> [ ] +1 Release this package as Apache Flink 1.0.0
> >>>>>> [ ] -1 Do not release this package because ...
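The NoClassDefFoundError for scala/collection/GenTraversableOnce$class reported at the top of the thread is the usual symptom of mixing Scala binary versions on the classpath (the FLINK-3565 fix being merged for this RC). A hedged build sketch, not taken from the thread, showing how sbt's `%%` operator appends the Scala suffix (`_2.11`) automatically so every dependency agrees with `scalaVersion`; the artifact coordinates below are assumptions for illustration and should be checked against Maven Central:

```scala
// build.sbt -- sketch only; artifact names and versions are assumptions.
// %% expands "flink-streaming-scala" to "flink-streaming-scala_2.11",
// keeping all dependencies on the same Scala binary version.
scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-streaming-scala"     % "1.0.0",
  "org.apache.flink" %% "flink-connector-kafka-0.8" % "1.0.0"
)
```

If one artifact is pinned by hand with the wrong suffix (e.g. a `_2.10` Kafka jar on a 2.11 classpath), the job typically fails at runtime with exactly this kind of NoClassDefFoundError rather than at compile time.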