Re: Spark's Hadoop Dependency

2014-06-25 Thread Koert Kuipers
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % versionSpark % "provided"
    exclude("org.apache.hadoop", "hadoop-client"),
  "org.apache.hadoop" % "hadoop-client" % versionHadoop % "provided"
)

On Wed, Jun 25, 2014 at 11:26 AM, Robert James wrote:
> To add Spark to a SBT projec
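Putting the snippet above into a complete build file might look like the following sketch. The `versionSpark` and `versionHadoop` vals and the specific Hadoop version number are illustrative assumptions, not something stated in the thread:

```scala
// build.sbt -- minimal sketch; version numbers are assumptions
val versionSpark  = "1.0.0"
val versionHadoop = "2.4.0"

libraryDependencies ++= Seq(
  // exclude Spark's transitive hadoop-client so the explicitly
  // pinned version below is the one that gets resolved
  "org.apache.spark" %% "spark-core" % versionSpark % "provided"
    exclude("org.apache.hadoop", "hadoop-client"),
  // pin the Hadoop 2 client directly
  "org.apache.hadoop" % "hadoop-client" % versionHadoop % "provided"
)
```

The exclude-then-pin pattern matters because without the `exclude`, Ivy may still pull in whichever hadoop-client version spark-core declares transitively.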

Spark's Hadoop Dependency

2014-06-25 Thread Robert James
To add Spark to a SBT project, I do:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0" % "provided"

How do I make sure that the spark version which will be downloaded will depend on, and use, Hadoop 2, and not Hadoop 1? Even with a line:

libraryDependencies += "org.apache.h
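One way to confirm which Hadoop version actually ends up on the classpath is to inspect sbt's resolved dependency report rather than guessing from the build file. A sketch, assuming the sbt shell conventions of that era (`show update` is a standard sbt command; the grep step is just an illustration):

```scala
// From the sbt shell, print the full update report and look for
// the resolved org.apache.hadoop#hadoop-client revision:
//
//   > show update
//
// Or from the OS shell, filter the output for the Hadoop artifact:
//
//   $ sbt "show update" | grep hadoop-client
//
// If the report shows hadoop-client 1.x despite your pin, the
// transitive dependency from spark-core is still winning and an
// exclude("org.apache.hadoop", "hadoop-client") is needed.
```

This makes the Hadoop-1-vs-Hadoop-2 question empirically checkable instead of relying on how Ivy happens to resolve the conflict.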