We use sbt for easy cross-project dependencies with multiple Scala versions
in a mono-repo, for which it is pretty good, albeit with some quirks. As our
projects have matured and change less, we have moved away from cross-project
dependencies, but they were extremely useful early on.
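To make the setup above concrete, here is a minimal sketch of what such a cross-built mono-repo build.sbt can look like. The module names, organization, and Scala version numbers are illustrative assumptions, not taken from the original post.

```scala
// build.sbt sketch (names and versions are illustrative):
// two modules in one repo, cross-built for several Scala versions,
// with an inter-module dependency between them.
ThisBuild / organization := "com.example"

val scala212 = "2.12.18"
val scala213 = "2.13.12"

lazy val core = (project in file("core"))
  .settings(
    name := "core",
    crossScalaVersions := Seq(scala212, scala213)
  )

lazy val app = (project in file("app"))
  .dependsOn(core) // cross-project dependency within the mono-repo
  .settings(
    name := "app",
    crossScalaVersions := Seq(scala212, scala213)
  )
```

Running `sbt +compile` then builds each module once per version listed in `crossScalaVersions`.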
Spark uses Maven as the primary build, but SBT works as well. It reads the
Maven build to some extent.
Zinc incremental compilation works with Maven (with the Scala plugin for
Maven).
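For reference, this is roughly what enabling that looks like in a pom.xml, using scala-maven-plugin; the version number here is illustrative.

```xml
<!-- pom.xml sketch: scala-maven-plugin with Zinc-style
     incremental compilation (plugin version is illustrative) -->
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>4.8.1</version>
  <configuration>
    <!-- "incremental" selects the Zinc incremental compiler -->
    <recompileMode>incremental</recompileMode>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```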
Personally, I prefer Maven, for some of the same reasons it is the main build
in Spark: declarative builds end up being easier to understand and maintain.
I think most of the Scala development on Spark happens with sbt, at least in
the open-source world.
However, you can do it with Gradle or Maven as well; it depends on what your
organization's standard is.
Some things might be more cumbersome to reach in non-sbt Scala setups, but
this is usually manageable.