Hi,
  Following the Quick Start guide:
https://spark.apache.org/docs/latest/quick-start.html

I was able to compile and run a Spark program successfully. Now my question is
how to compile multiple programs with sbt in one go. For example, two programs
laid out like this:

./src
./src/main
./src/main/scala
./src/main/scala/SimpleApp_A.scala
./src/main/scala/SimpleApp_B.scala

Hopefully, with "sbt package", I would get a separate .jar file for each
source program, so that I can run them separately in Spark. I tried creating
one .sbt file per program, but found that only one .jar file is created.

./simpleA.sbt
name := "Simple Project A"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"

./simpleB.sbt
name := "Simple Project B"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
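
From what I can tell from the sbt documentation, a single multi-project build
might be the way to go instead: one root build.sbt that defines a subproject
per program, with shared settings factored out. Below is just my rough sketch
(the subproject names and the a/ and b/ directories are placeholders I made
up), in the hope that "sbt package" would then produce one .jar per
subproject:

./build.sbt
// Settings shared by both programs
lazy val commonSettings = Seq(
  version := "1.0",
  scalaVersion := "2.10.4",
  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
)

// Each subproject would keep its own sources,
// e.g. a/src/main/scala/SimpleApp_A.scala and b/src/main/scala/SimpleApp_B.scala
lazy val simpleA = (project in file("a"))
  .settings(commonSettings: _*)
  .settings(name := "Simple Project A")

lazy val simpleB = (project in file("b"))
  .settings(commonSettings: _*)
  .settings(name := "Simple Project B")

If I understand it correctly, "sbt package" would then build a jar under each
subproject's target/ directory, which I could pass to spark-submit separately.
Is this the right approach, or am I missing something simpler?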

  Does anybody know how to do it?

Cheers,
Dan
