From: Olivier Girardot <ssab...@gmail.com>
Date: Friday, May 8, 2015 at 6:40 AM
To: Akhil Das <ak...@sigmoidanalytics.com>, "Ganelin, Ilya" <ilya.gane...@capitalone.com>
Cc: dev <dev@spark.apache.org>
Subject: Re: N
You're trying to launch, with sbt run, a project whose dependencies are
marked "provided"; the goal of the "provided" scope is exactly to exclude
those dependencies from the runtime classpath, treating them as "provided"
by the environment.
Your configuration is correct for creating an assembly jar, but not for
using sbt run to test your project.
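A common workaround (a sketch assuming sbt 0.13; this is the pattern
described in the sbt-assembly docs, not something from this thread) is to
put the "provided" dependencies back on the classpath for sbt run only,
leaving the assembly jar unchanged:

// build.sbt: use the Compile classpath (which includes "provided" deps)
// for `sbt run`, instead of the default Runtime classpath (which excludes them).
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated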
Looks like the jar you provided has some missing classes. Try this:

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.3.0",
  "org.apache.spark" %% "spark-sql" % "1.3.0" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.3.0" % "provided"
)
Hi all – I’m attempting to build a project with SBT and run it on Spark 1.3
(this previously worked before we upgraded to CDH 5.4 with Spark 1.3).
I have the following in my build.sbt:

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
)