Hi Ted,
  I will have a look at it, thanks a lot.

  Cheers,
  Dan
 On Apr 29, 2015 at 5:00 PM, "Ted Yu" <yuzhih...@gmail.com> wrote:

> Have you looked at
> http://www.scala-sbt.org/0.13/tutorial/Multi-Project.html ?
>
> Cheers
>
> On Wed, Apr 29, 2015 at 2:45 PM, Dan Dong <dongda...@gmail.com> wrote:
>
>> Hi,
>>   Following the Quick Start guide:
>> https://spark.apache.org/docs/latest/quick-start.html
>>
>> I could compile and run a Spark program successfully. Now my question is
>> how to compile multiple programs with sbt in one project, e.g. two
>> programs such as:
>>
>>
>> ./src
>> ./src/main
>> ./src/main/scala
>> ./src/main/scala/SimpleApp_A.scala
>> ./src/main/scala/SimpleApp_B.scala
>>
>> Hopefully with "sbt package" I will get two .jar files, one for each
>> source program, so that I can run them separately in Spark. I tried to
>> create two .sbt files, one for each program, but found that only one .jar
>> file was created.
>>
>> ./simpleA.sbt
>> name := "Simple Project A"
>> version := "1.0"
>> scalaVersion := "2.10.4"
>> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
>>
>> ./simpleB.sbt
>> name := "Simple Project B"
>> version := "1.0"
>> scalaVersion := "2.10.4"
>> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
>>
>>   Does anybody know how to do it?
>>
>> Cheers,
>> Dan
>>
>>
>
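
A minimal sketch of the multi-project setup that tutorial describes, adapted to the layout above (the subproject names appA/appB and their directories are assumptions, not from the original thread):

// build.sbt at the project root
lazy val commonSettings = Seq(
  version := "1.0",
  scalaVersion := "2.10.4",
  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
)

// each subproject keeps its own source tree, e.g.
//   appA/src/main/scala/SimpleApp_A.scala
//   appB/src/main/scala/SimpleApp_B.scala
lazy val appA = (project in file("appA"))
  .settings(commonSettings: _*)
  .settings(name := "Simple Project A")

lazy val appB = (project in file("appB"))
  .settings(commonSettings: _*)
  .settings(name := "Simple Project B")

With a layout like this, "sbt package" should produce a separate jar under each subproject's target/ directory (e.g. appA/target/scala-2.10/ and appB/target/scala-2.10/), and each jar can then be submitted to Spark independently.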
