Googling that error, I came across something that appears relevant:
https://groups.google.com/forum/#!msg/spark-users/T1soH67C5M4/vihzNt92anYJ
I'd try just doing sbt/sbt clean first, and if that fails, digging deeper
into that thread.
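Something like this, run from the top of your Spark directory (assuming the
sbt launcher script that ships with Spark 0.9.x):

  sbt/sbt clean
  sbt/sbt publish-local
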
(By the way, "sbt/sbt publish-local" IS what you want, otherw
Thanks for the reply, Aaron.
I tried "sbt/sbt publish local" but got the error below.
[error] /home/cloudera/at_Installation/spark-0.9.1-bin-cdh4/streaming/src/main/scala/org/apache/spark/streaming/api/java/JavaPairDStream.scala:669: type mismatch;
[error] found : org.apache.spark.streaming.dst
I suppose you actually ran "publish-local" and not "publish local" like
your example showed. That being the case, could you show the compile error
that occurs? It could be related to the Hadoop version.
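If the Hadoop version does turn out to be the problem, the build has to be
told which version to compile against before publishing. The version string
below is only an example for a CDH4 MRv1 cluster, so adjust it to match your
distribution:

  SPARK_HADOOP_VERSION=2.0.0-mr1-cdh4.2.0 sbt/sbt clean publish-local
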
On Sun, May 25, 2014 at 7:51 PM, ABHISHEK wrote:
Hi,
I'm trying to install Spark along with Shark.
Here are the configuration details:
Spark 0.9.1
Shark 0.9.1
Scala 2.10.3
The Spark assembly build was successful, but running "sbt/sbt publish-local" failed.
Please refer to the attached log for more details and advise.
Thanks,
Abhishek
Sparkhome>SPARK_HADOOP_VERSION=2.0
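For reference, Shark is pointed at this Spark build through conf/shark-env.sh;
a minimal sketch, reusing the Spark path from the log above and leaving the
Scala and Hadoop locations as placeholders:

  export SPARK_HOME=/home/cloudera/at_Installation/spark-0.9.1-bin-cdh4
  export SCALA_HOME=/path/to/scala-2.10.3
  export HADOOP_HOME=/path/to/hadoop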