Life-saver tip, worked like a charm (I was getting frustrated).

sbt/sbt clean

did the trick.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/compile-spark-0-9-1-in-hadoop-2-2-above-exception-tp4795p5325.html
Sent from the Apache Spark User List mailing list
Try running sbt/sbt clean and recompiling. Any luck?
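Putting the two steps together, the full sequence looks roughly like this (a sketch, assuming you are at the top of a Spark 0.9.1 source checkout; the build flags are the ones from the original post):

    # Remove stale class files left over from a previous build against a
    # different Hadoop version -- mixing them is what produces the
    # DStream / JavaPairDStream type mismatch below.
    sbt/sbt clean

    # Rebuild the assembly against Hadoop 2.3.0 with YARN support.
    SPARK_HADOOP_VERSION=2.3.0 SPARK_YARN=true sbt/sbt assembly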
On Thu, Apr 24, 2014 at 5:33 PM, martin.ou wrote:
An exception occurs when compiling Spark 0.9.1 using sbt; env: Hadoop 2.3.

1. SPARK_HADOOP_VERSION=2.3.0 SPARK_YARN=true sbt/sbt assembly

2. Exception found:

found   : org.apache.spark.streaming.dstream.DStream[(K, V)]
[error] required: org.apache.spark.streaming.api.java.JavaPairDStream[K,V]
[error] N