To add Spark to an SBT project, I do:
  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0" % "provided"

How do I make sure that the Spark artifact which gets downloaded
will depend on, and use, Hadoop 2 rather than Hadoop 1?

Even after adding the line:
   libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0"

I still see SBT resolving Hadoop 1:

[debug] == resolving dependencies org.apache.spark#spark-core_2.10;1.0.0->org.apache.hadoop#hadoop-client;1.0.4 [compile->master(*)]
[debug] dependency descriptor has been mediated: dependency: org.apache.hadoop#hadoop-client;2.4.0 {compile=[default(compile)]} => dependency: org.apache.hadoop#hadoop-client;1.0.4 {compile=[default(compile)]}
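For what it's worth, one approach I have seen for forcing the transitive version is to exclude Spark's own hadoop-client dependency and declare the Hadoop 2 artifact explicitly. A sketch of the build.sbt, assuming the same versions as above (1.0.0 / 2.4.0):

```scala
// Exclude the transitive hadoop-client that spark-core pulls in (1.0.4),
// then pin the Hadoop 2 client explicitly.
libraryDependencies += ("org.apache.spark" %% "spark-core" % "1.0.0" % "provided")
  .exclude("org.apache.hadoop", "hadoop-client")

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0"
```

I am not certain this is the intended mechanism here, so treat it as a guess about the resolution behavior rather than a confirmed fix.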
