When I run sbt/sbt assembly, I get the following error (not a stack trace, just
the [error] line in the middle of the update). Is anyone else experiencing a
similar problem?


..........

[info] Resolving org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016 ...
[info] Updating {file:/Users/Sung/Projects/spark_06_04_14/}assembly...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Resolving org.eclipse.jetty#jetty-server;8.1.14.v20131031 ...
[info] Updating {file:/Users/Sung/Projects/spark_06_04_14/}examples...
[info] Resolving com.typesafe.genjavadoc#genjavadoc-plugin_2.10.4;0.5 ...
[error] impossible to get artifacts when data has not been loaded. IvyNode = org.slf4j#slf4j-api;1.6.1
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[warn] /Users/Sung/Projects/spark_06_04_14/core/src/main/scala/org/apache/hadoop/mapred/SparkHadoopMapRedUtil.scala:43: constructor TaskAttemptID in class TaskAttemptID is deprecated: see corresponding Javadoc for more information.
[warn]     new TaskAttemptID(jtIdentifier, jobId, isMap, taskId, attemptId)
[warn]     ^
[warn] /Users/Sung/Projects/spark_06_04_14/core/src/main/scala/org/apache/spark/SparkContext.scala:490: constructor Job in class Job is deprecated: see corresponding Javadoc for more information.
[warn]     val job = new NewHadoopJob(hadoopConfiguration)
[warn]               ^
[warn] /Users/Sung/Projects/spark_06_04_14/core/src/main/scala/org/apache/spark/SparkContext.scala:623: constructor Job in class Job is deprecated: see corresponding Javadoc for more information.
[warn]     val job = new NewHadoopJob(conf)
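One guess on my side: could the [error] line be a stale or corrupted entry for
org.slf4j#slf4j-api;1.6.1 in my local Ivy cache (~/.ivy2/cache)? I can try
deleting that module's cache directory and re-running sbt/sbt assembly, but I
would like to know whether anyone hits this with a clean checkout too.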
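Separately, the [warn] lines look unrelated to the error; they just flag the
Hadoop constructors that were deprecated in Hadoop 2.x. If I am reading the
Javadoc right, the non-deprecated forms would be roughly as below. This is only
a sketch, assuming a Hadoop 2.x client on the classpath; the
jtIdentifier/jobId/isMap/taskId/attemptId values are placeholders for whatever
Spark actually passes, and NewHadoopJob in SparkContext is just Spark's rename
of org.apache.hadoop.mapreduce.Job:

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.mapred.TaskAttemptID
    import org.apache.hadoop.mapreduce.{Job, TaskType}

    // Placeholder arguments (hypothetical; Spark supplies its own values).
    val (jtIdentifier, jobId, isMap, taskId, attemptId) = ("jt", 1, true, 0, 0)

    // Replacement for the deprecated (String, Int, Boolean, Int, Int)
    // constructor: pass a TaskType instead of the isMap boolean.
    val taskType = if (isMap) TaskType.MAP else TaskType.REDUCE
    val id = new TaskAttemptID(jtIdentifier, jobId, taskType, taskId, attemptId)

    // Replacement for the deprecated `new Job(conf)`: the static factory.
    val job = Job.getInstance(new Configuration())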
