Why did you jump directly to the spark-streaming-mqtt module?

Can you drop 'spark-streaming-mqtt' and try again?

Not sure why 1.5.0-SNAPSHOT showed up.
Were you using the RC2 source?

Cheers
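The error below also notes that the failed snapshot lookups were "cached in the local repository" and will not be reattempted until the repository's update interval elapses. A minimal sketch of clearing those cached failure markers before retrying (this assumes the default ~/.m2/repository location; the *.lastUpdated marker files are how Maven records failed downloads):

```shell
#!/bin/sh
# Local repository path (default location assumed; adjust if your
# settings.xml overrides it).
M2_REPO="${M2_REPO:-$HOME/.m2/repository}"

# Maven records failed downloads as *.lastUpdated marker files;
# deleting them lets resolution be retried immediately.
mkdir -p "$M2_REPO/org/apache/spark"
find "$M2_REPO/org/apache/spark" -name '*.lastUpdated' -delete

echo "cleared cached failure markers under $M2_REPO/org/apache/spark"

# Then rebuild from the top (no -rf) and force remote update checks:
#   mvn -U -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 \
#       -Phive-thriftserver -DskipTests clean package
```

Alternatively, `mvn -U` (`--update-snapshots`) alone forces snapshot update checks without touching the local repository by hand. Note the sketch also drops the `-Phive` flag, since your build output warned that profile does not exist.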

On Sun, Nov 8, 2015 at 7:28 PM, 欧锐 <494165...@qq.com> wrote:

>
> Building spark-streaming-mqtt_2.10 failed!
>
> nohup mvn -X -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive
> -Phive-thriftserver -DskipTests clean package -rf
> :spark-streaming-mqtt_2.10 &
>
> [DEBUG] org.scala-tools.testing:test-interface:jar:0.5:test
> [DEBUG] org.apache.activemq:activemq-core:jar:5.7.0:test
> [DEBUG] org.apache.geronimo.specs:geronimo-jms_1.1_spec:jar:1.1.1:test
> [DEBUG] org.apache.activemq:kahadb:jar:5.7.0:test
> [DEBUG] org.apache.activemq.protobuf:activemq-protobuf:jar:1.1:test
> [DEBUG] org.fusesource.mqtt-client:mqtt-client:jar:1.3:test
> [DEBUG] org.fusesource.hawtdispatch:hawtdispatch-transport:jar:1.11:test
> [DEBUG] org.fusesource.hawtdispatch:hawtdispatch:jar:1.11:test
> [DEBUG] org.fusesource.hawtbuf:hawtbuf:jar:1.9:test
> [DEBUG]
> org.apache.geronimo.specs:geronimo-j2ee-management_1.1_spec:jar:1.0.1:test
> [DEBUG] org.springframework:spring-context:jar:3.0.7.RELEASE:test
> [DEBUG] org.springframework:spring-aop:jar:3.0.7.RELEASE:test
> [DEBUG] aopalliance:aopalliance:jar:1.0:test
> [DEBUG] org.springframework:spring-beans:jar:3.0.7.RELEASE:test
> [DEBUG] org.springframework:spring-core:jar:3.0.7.RELEASE:test
> [DEBUG] commons-logging:commons-logging:jar:1.1.1:test
> [DEBUG] org.springframework:spring-expression:jar:3.0.7.RELEASE:test
> [DEBUG] org.springframework:spring-asm:jar:3.0.7.RELEASE:test
> [DEBUG] org.jasypt:jasypt:jar:1.9.0:test
> [DEBUG] org.spark-project.spark:unused:jar:1.0.0:compile
> [DEBUG] org.scalatest:scalatest_2.10:jar:2.2.1:test
> [DEBUG] org.scala-lang:scala-reflect:jar:2.10.4:provided
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] Spark Project External MQTT ........................ FAILURE [
> 2.403 s]
> [INFO] Spark Project External MQTT Assembly ............... SKIPPED
> [INFO] Spark Project External ZeroMQ ...................... SKIPPED
> [INFO] Spark Project External Kafka ....................... SKIPPED
> [INFO] Spark Project Examples ............................. SKIPPED
> [INFO] Spark Project External Kafka Assembly .............. SKIPPED
> [INFO]
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Total time: 4.471 s
> [INFO] Finished at: 2015-11-09T11:10:57+08:00
> [INFO] Final Memory: 31M/173M
> [INFO]
> ------------------------------------------------------------------------
> [WARNING] The requested profile "hive" could not be activated because it
> does not exist.
> [ERROR] Failed to execute goal on project spark-streaming-mqtt_2.10: Could
> not resolve dependencies for project
> org.apache.spark:spark-streaming-mqtt_2.10:jar:1.5.0-SNAPSHOT: The
> following artifacts could not be resolved:
> org.apache.spark:spark-streaming_2.10:jar:1.5.0-SNAPSHOT,
> org.apache.spark:spark-core_2.10:jar:1.5.0-SNAPSHOT,
> org.apache.spark:spark-core_2.10:jar:tests:1.5.0-SNAPSHOT,
> org.apache.spark:spark-launcher_2.10:jar:1.5.0-SNAPSHOT,
> org.apache.spark:spark-network-common_2.10:jar:1.5.0-SNAPSHOT,
> org.eclipse.paho:org.eclipse.paho.client.mqttv3:jar:1.0.1: Failure to find
> org.apache.spark:spark-streaming_2.10:jar:1.5.0-20150818.023902-334 in
> http://maven.cnsuning.com/content/groups/public/ was cached in the local
> repository, resolution will not be reattempted until the update interval of
> suning_maven_repo has elapsed or updates are forced -> [Help 1]
> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute
> goal on project spark-streaming-mqtt_2.10: Could not resolve dependencies
> for project org.apache.spark:spark-streaming-mqtt_2.10:jar:1.5.0-SNAPSHOT:
> The following artifacts could not be resolved:
> org.apache.spark:spark-streaming_2.10:jar:1.5.0-SNAPSHOT,
> org.apache.spark:spark-core_2.10:jar:1.5.0-SNAPSHOT,
> org.apache.spark:spark-core_2.10:jar:tests:1.5.0-SNAPSHOT,
> org.apache.spark:spark-launcher_2.10:jar:1.5.0-SNAPSHOT,
> org.apache.spark:spark-network-common_2.10:jar:1.5.0-SNAPSHOT,
> org.eclipse.paho:org.eclipse.paho.client.mqttv3:jar:1.0.1: Failure to find
> org.apache.spark:spark-streaming_2.10:jar:1.5.0-20150818.023902-334 in
> http://maven.cnsuning.com/content/groups/public/ was cached in the local
> repository, resolution will not be reattempted until the update interval of
> suning_maven_repo has elapsed or updates are forced
> Sent from my iPhone
>
>
> ------------------ Original Message ------------------
> *From:* Denny Lee <denny.g....@gmail.com>
> *Date:* November 8, 2015, 08:36
> *To:* Mark Hamstra <m...@clearstorydata.com>, Reynold Xin <
> r...@databricks.com>
> *Cc:* dev@spark.apache.org <dev@spark.apache.org>
> *Subject:* Re: [VOTE] Release Apache Spark 1.5.2 (RC2)
>
> +1
>
>
> On Sat, Nov 7, 2015 at 12:01 PM Mark Hamstra <m...@clearstorydata.com>
> wrote:
>
>> +1
>>
>> On Tue, Nov 3, 2015 at 3:22 PM, Reynold Xin <r...@databricks.com> wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark version
>>> 1.5.2. The vote is open until Sat Nov 7, 2015 at 00:00 UTC and passes if a
>>> majority of at least 3 +1 PMC votes are cast.
>>>
>>> [ ] +1 Release this package as Apache Spark 1.5.2
>>> [ ] -1 Do not release this package because ...
>>>
>>>
>>> The release fixes 59 known issues in Spark 1.5.1, listed here:
>>> http://s.apache.org/spark-1.5.2
>>>
>>> The tag to be voted on is v1.5.2-rc2:
>>> https://github.com/apache/spark/releases/tag/v1.5.2-rc2
>>>
>>> The release files, including signatures, digests, etc. can be found at:
>>> http://people.apache.org/~pwendell/spark-releases/spark-1.5.2-rc2-bin/
>>>
>>> Release artifacts are signed with the following key:
>>> https://people.apache.org/keys/committer/pwendell.asc
>>>
>>> The staging repository for this release can be found at:
>>> - as version 1.5.2-rc2:
>>> https://repository.apache.org/content/repositories/orgapachespark-1153
>>> - as version 1.5.2:
>>> https://repository.apache.org/content/repositories/orgapachespark-1152
>>>
>>> The documentation corresponding to this release can be found at:
>>> http://people.apache.org/~pwendell/spark-releases/spark-1.5.2-rc2-docs/
>>>
>>>
>>> =======================================
>>> How can I help test this release?
>>> =======================================
>>> If you are a Spark user, you can help us test this release by taking an
>>> existing Spark workload and running it on this release candidate, then
>>> reporting any regressions.
>>>
>>> ================================================
>>> What justifies a -1 vote for this release?
>>> ================================================
>>> A -1 vote should occur for regressions from Spark 1.5.1. Bugs already
>>> present in 1.5.1 will not block this release.
>>>
>>>
>>>