Spark 1.3.0 is not officially out yet, so I don't think sbt will download
the Hadoop dependencies for your Spark build by itself. You could try
manually adding the Hadoop dependencies yourself (hadoop-core,
hadoop-common, hadoop-client), for example along the lines sketched below.
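
A rough sketch of what that could look like in your build.sbt (the 2.4.0
version here is just an assumption on my part; use whatever Hadoop version
your cluster actually runs):

// Assumed Hadoop version -- replace 2.4.0 with your cluster's version
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.4.0"
// hadoop-core only exists as an artifact for Hadoop 1.x; skip it on 2.x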

Thanks
Best Regards

On Wed, Mar 11, 2015 at 9:07 PM, Patcharee Thongtra <
patcharee.thong...@uni.no> wrote:

> Hi,
>
> I have built Spark version 1.3 and tried to use it in my Spark Scala
> application. When I tried to compile and build the application with SBT,
> I got this error:
> bad symbolic reference. A signature in SparkContext.class refers to term
> conf in value org.apache.hadoop which is not available
>
> It seems the Hadoop library is missing, but shouldn't it be pulled in
> automatically by SBT?
>
> This application builds fine against Spark version 1.2.
>
> Here is my build.sbt
>
> name := "wind25t-v013"
> version := "0.1"
> scalaVersion := "2.10.4"
> unmanagedBase := baseDirectory.value / "lib"
> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"
> libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.3.0"
> libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.3.0"
> libraryDependencies += "org.apache.spark" % "spark-hive_2.10" % "1.3.0"
>
> What should I do to fix it?
>
> BR,
> Patcharee
>
