See <https://builds.apache.org/job/Kafka-trunk/384/changes>
Changes:

[jay.kreps] KAFKA-1760: New consumer.

------------------------------------------
[...truncated 1171 lines...]
kafka.log.BrokerCompressionTest > testBrokerSideCompression[14] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[15] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[16] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[17] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[18] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[19] PASSED
kafka.log.LogCleanerIntegrationTest > cleanerTest PASSED
kafka.log.LogManagerTest > testCreateLog PASSED
kafka.log.LogManagerTest > testGetNonExistentLog PASSED
kafka.log.LogManagerTest > testCleanupExpiredSegments PASSED
kafka.log.LogManagerTest > testCleanupSegmentsToMaintainSize PASSED
kafka.log.LogManagerTest > testTimeBasedFlush PASSED
kafka.log.LogManagerTest > testLeastLoadedAssignment PASSED
kafka.log.LogManagerTest > testTwoLogManagersUsingSameDirFails PASSED
kafka.log.LogManagerTest > testCheckpointRecoveryPoints PASSED
kafka.log.LogManagerTest > testRecoveryDirectoryMappingWithTrailingSlash PASSED
kafka.log.LogManagerTest > testRecoveryDirectoryMappingWithRelativeDirectory PASSED
kafka.log.LogTest > testTimeBasedLogRoll PASSED
kafka.log.LogTest > testTimeBasedLogRollJitter PASSED
kafka.log.LogTest > testSizeBasedLogRoll PASSED
kafka.log.LogTest > testLoadEmptyLog PASSED
kafka.log.LogTest > testAppendAndReadWithSequentialOffsets PASSED
kafka.log.LogTest > testAppendAndReadWithNonSequentialOffsets PASSED
kafka.log.LogTest > testReadAtLogGap PASSED
kafka.log.LogTest > testReadOutOfRange PASSED
kafka.log.LogTest > testLogRolls PASSED
kafka.log.LogTest > testCompressedMessages PASSED
kafka.log.LogTest > testThatGarbageCollectingSegmentsDoesntChangeOffset PASSED
kafka.log.LogTest > testMessageSetSizeCheck PASSED
kafka.log.LogTest > testMessageSizeCheck PASSED
kafka.log.LogTest > testLogRecoversToCorrectOffset PASSED
kafka.log.LogTest > testIndexRebuild PASSED
kafka.log.LogTest > testTruncateTo PASSED
kafka.log.LogTest > testIndexResizingAtTruncation PASSED
kafka.log.LogTest > testBogusIndexSegmentsAreRemoved PASSED
kafka.log.LogTest > testReopenThenTruncate PASSED
kafka.log.LogTest > testAsyncDelete PASSED
kafka.log.LogTest > testOpenDeletesObsoleteFiles PASSED
kafka.log.LogTest > testAppendMessageWithNullPayload PASSED
kafka.log.LogTest > testCorruptLog PASSED
kafka.log.LogTest > testCleanShutdownFile PASSED
kafka.log.LogTest > testParseTopicPartitionName PASSED
kafka.log.LogTest > testParseTopicPartitionNameForEmptyName PASSED
kafka.log.LogTest > testParseTopicPartitionNameForNull PASSED
kafka.log.LogTest > testParseTopicPartitionNameForMissingSeparator PASSED
kafka.log.LogTest > testParseTopicPartitionNameForMissingTopic PASSED
kafka.log.LogTest > testParseTopicPartitionNameForMissingPartition PASSED
kafka.log.OffsetIndexTest > truncate PASSED
kafka.log.OffsetIndexTest > randomLookupTest PASSED
kafka.log.OffsetIndexTest > lookupExtremeCases PASSED
kafka.log.OffsetIndexTest > appendTooMany PASSED
kafka.log.OffsetIndexTest > appendOutOfOrder PASSED
kafka.log.OffsetIndexTest > testReopen PASSED
kafka.log.LogSegmentTest > testTruncate PASSED
kafka.log.LogSegmentTest > testReadOnEmptySegment PASSED
kafka.log.LogSegmentTest > testReadBeforeFirstOffset PASSED
kafka.log.LogSegmentTest > testMaxOffset PASSED
kafka.log.LogSegmentTest > testReadAfterLast PASSED
kafka.log.LogSegmentTest > testReadFromGap PASSED
kafka.log.LogSegmentTest > testTruncateFull PASSED
kafka.log.LogSegmentTest > testNextOffsetCalculation PASSED
kafka.log.LogSegmentTest > testChangeFileSuffixes PASSED
kafka.log.LogSegmentTest > testRecoveryFixesCorruptIndex PASSED
kafka.log.LogSegmentTest > testRecoveryWithCorruptMessage PASSED
kafka.log.CleanerTest > testCleanSegments PASSED
kafka.log.CleanerTest > testCleaningWithDeletes PASSED
kafka.log.CleanerTest > testCleanSegmentsWithAbort PASSED
kafka.log.CleanerTest > testSegmentGrouping PASSED
kafka.log.CleanerTest > testBuildOffsetMap PASSED
kafka.log.OffsetMapTest > testBasicValidation PASSED
kafka.log.OffsetMapTest > testClear PASSED
kafka.javaapi.message.ByteBufferMessageSetTest > testWrittenEqualsRead PASSED
kafka.javaapi.message.ByteBufferMessageSetTest > testIteratorIsConsistent PASSED
kafka.javaapi.message.ByteBufferMessageSetTest > testSizeInBytes PASSED
kafka.javaapi.message.ByteBufferMessageSetTest > testEquals PASSED
kafka.javaapi.message.ByteBufferMessageSetTest > testIteratorIsConsistentWithCompression PASSED
kafka.javaapi.message.ByteBufferMessageSetTest > testSizeInBytesWithCompression PASSED
kafka.javaapi.message.ByteBufferMessageSetTest > testEqualsWithCompression PASSED
kafka.javaapi.consumer.ZookeeperConsumerConnectorTest > testBasic PASSED
:core:copyDependantLibs UP-TO-DATE
:core:jar
:examples:compileJava
:examples:processResources UP-TO-DATE
:examples:classes
:examples:compileTestJava UP-TO-DATE
:examples:processTestResources UP-TO-DATE
:examples:testClasses UP-TO-DATE
:examples:test UP-TO-DATE
:contrib:hadoop-consumer:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: <https://builds.apache.org/job/Kafka-trunk/ws/contrib/hadoop-consumer/src/main/java/kafka/etl/impl/DataGenerator.java> uses unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:contrib:hadoop-consumer:processResources UP-TO-DATE
:contrib:hadoop-consumer:classes
:contrib:hadoop-consumer:compileTestJava UP-TO-DATE
:contrib:hadoop-consumer:processTestResources UP-TO-DATE
:contrib:hadoop-consumer:testClasses UP-TO-DATE
:contrib:hadoop-consumer:test UP-TO-DATE
:contrib:hadoop-producer:compileJava
:contrib:hadoop-producer:processResources UP-TO-DATE
:contrib:hadoop-producer:classes
:contrib:hadoop-producer:compileTestJava UP-TO-DATE
:contrib:hadoop-producer:processTestResources UP-TO-DATE
:contrib:hadoop-producer:testClasses UP-TO-DATE
:contrib:hadoop-producer:test UP-TO-DATE

BUILD SUCCESSFUL

Total time: 7 mins 49.712 secs
Setting GRADLE_2_1_HOME=/home/jenkins/jenkins-slave/tools/hudson.plugins.gradle.GradleInstallation/Gradle_2.1
[Kafka-trunk] $ /bin/bash -xe /tmp/hudson8983126368946461415.sh
+ ./gradlew -PscalaVersion=2.11 test
To honour the JVM settings for this build a new JVM will be forked. Please consider using the daemon: http://gradle.org/docs/2.0/userguide/gradle_daemon.html.
Building project 'core' with Scala version 2.11
:clients:compileJava UP-TO-DATE
:clients:processResources UP-TO-DATE
:clients:classes UP-TO-DATE
:clients:compileTestJava UP-TO-DATE
:clients:processTestResources UP-TO-DATE
:clients:testClasses UP-TO-DATE
:clients:test UP-TO-DATE
:contrib:compileJava UP-TO-DATE
:contrib:processResources UP-TO-DATE
:contrib:classes UP-TO-DATE
:contrib:compileTestJava UP-TO-DATE
:contrib:processTestResources UP-TO-DATE
:contrib:testClasses UP-TO-DATE
:contrib:test UP-TO-DATE
:clients:jar UP-TO-DATE
:core:compileJava UP-TO-DATE
:core:compileScala
<https://builds.apache.org/job/Kafka-trunk/ws/core/src/main/scala/kafka/admin/AdminUtils.scala>:264: non-variable type argument String in type pattern scala.collection.Map[String,_] is unchecked since it is eliminated by erasure
        case Some(map: Map[String, _]) =>
                       ^
<https://builds.apache.org/job/Kafka-trunk/ws/core/src/main/scala/kafka/admin/AdminUtils.scala>:267: non-variable type argument String in type pattern scala.collection.Map[String,String] is unchecked since it is eliminated by erasure
        case Some(config: Map[String, String]) =>
                          ^
<https://builds.apache.org/job/Kafka-trunk/ws/core/src/main/scala/kafka/consumer/ConsumerIterator.scala>:107: A try without a catch or finally is equivalent to putting its body in a block; no exceptions are handled.
      try {
      ^
<https://builds.apache.org/job/Kafka-trunk/ws/core/src/main/scala/kafka/server/KafkaApis.scala>:459: value asScalaIterable is not a member of object scala.collection.JavaConversions
    val topics = JavaConversions.asScalaIterable(joinGroupReq.body.topics()).toSet
                                 ^
<https://builds.apache.org/job/Kafka-trunk/ws/core/src/main/scala/kafka/server/KafkaApis.scala>:463: value asJavaList is not a member of object scala.collection.JavaConversions
    val response = new JoinGroupResponse(ErrorMapping.NoError, this.consumerGroupGenerationId, joinGroupReq.body.consumerId, JavaConversions.asJavaList(partitionList))
                                                                                                                                             ^
<https://builds.apache.org/job/Kafka-trunk/ws/core/src/main/scala/kafka/server/KafkaServer.scala>:175: a pure expression does nothing in statement position; you may be omitting necessary parentheses
    ControllerStats.uncleanLeaderElectionRate
                    ^
<https://builds.apache.org/job/Kafka-trunk/ws/core/src/main/scala/kafka/server/KafkaServer.scala>:176: a pure expression does nothing in statement position; you may be omitting necessary parentheses
    ControllerStats.leaderElectionTimer
                    ^
5 warnings found
two errors found
:core:compileScala FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':core:compileScala'.
> Compilation failed

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.

BUILD FAILED

Total time: 33.871 secs
Build step 'Execute shell' marked build as failure
Setting GRADLE_2_1_HOME=/home/jenkins/jenkins-slave/tools/hudson.plugins.gradle.GradleInstallation/Gradle_2.1
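
The two errors above come from KafkaApis.scala calling JavaConversions.asScalaIterable and JavaConversions.asJavaList, which the 2.11 compiler reports as not being members of scala.collection.JavaConversions; the Scala 2.10 run earlier in the log compiled fine. A minimal sketch of the usual workaround, the decorator-based scala.collection.JavaConverters API (asScala / asJava), is below. This is not the project's actual patch, and the variable names are stand-ins for joinGroupReq.body.topics() and partitionList from the log:

// Sketch only: converting between Java and Scala collections with JavaConverters,
// which works on both Scala 2.10 and 2.11, instead of the removed explicit
// JavaConversions.asScalaIterable / asJavaList helpers that break the 2.11 build.
import java.util.{Arrays => JArrays, List => JList}
import scala.collection.JavaConverters._

object JavaConvertersSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical stand-ins for the values used in KafkaApis.scala.
    val javaTopics: JList[String] = JArrays.asList("topic-a", "topic-b", "topic-a")
    val partitionList: Seq[Int]   = Seq(0, 1, 2)

    // Instead of JavaConversions.asScalaIterable(javaTopics).toSet:
    val topics: Set[String] = javaTopics.asScala.toSet

    // Instead of JavaConversions.asJavaList(partitionList):
    val javaPartitions: JList[Int] = partitionList.asJava

    println(topics)          // Set(topic-a, topic-b)
    println(javaPartitions)  // [0, 1, 2]
  }
}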