The usual way to use Spark with sbt is to package a Spark project with `sbt package` (e.g. per the Quick Start guide) and submit it to Spark with the `bin/` scripts from the Spark distribution. For a plain Scala project you don't need to download anything: you can just write a `build.sbt` file with the dependencies and then, say, run `console`, which starts a Scala REPL with those dependencies on the class path.

Is there a way to avoid downloading the Spark tarball entirely, by declaring the `spark-core` dependency in `build.sbt` and using `run` or `console` to invoke a Spark REPL from sbt? In other words, the goal is a single `build.sbt` file such that if you run `sbt` in its directory and then type `run` or `console` (with optional parameters), it downloads all the Spark dependencies and starts the REPL. It should work on a fresh machine where the Spark tarball has never been untarred.
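To make the question concrete, here is a minimal sketch of the kind of `build.sbt` I have in mind (the Spark and Scala version numbers are assumptions; pick a pair that are compatible with each other and with your JDK):

```scala
// build.sbt — hypothetical sketch: Spark on the sbt `console` class path,
// with no Spark tarball installed on the machine.
scalaVersion := "2.12.18"  // assumed; must match the Spark artifact's Scala binary version

libraryDependencies += "org.apache.spark" %% "spark-core" % "3.5.1"  // version is an assumption

// `sbt console` starts a plain Scala REPL with spark-core on the class path.
// Pre-create a local SparkContext so it is ready when the prompt appears:
console / initialCommands :=
  """
    |import org.apache.spark.{SparkConf, SparkContext}
    |val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("sbt-console"))
  """.stripMargin

// Stop Spark cleanly when the REPL exits
console / cleanupCommands := "sc.stop()"

// Fork `run` so a Spark main class gets its own JVM
run / fork := true
```

With something like this, `sbt console` should resolve the dependencies from Maven Central and drop into a REPL where `sc` is usable, though this gives a stock Scala REPL with Spark on the class path rather than the actual `spark-shell` (which lives in the `spark-repl` artifact).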
Cheers!