+1 I agree we need this too. It looks like there is already an issue for it
here:
https://spark-project.atlassian.net/browse/SPARK-750
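In the meantime, here is a rough sketch of what pulling such an artifact into sbt would typically look like if a test-jar were published under a "tests" classifier. The classifier, coordinates and version below are assumptions for illustration, not something the project publishes today:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.1.0",
  // hypothetical: only works if Spark actually publishes a test-jar under the "tests" classifier
  "org.apache.spark" %% "spark-core" % "1.1.0" % "test" classifier "tests"
)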
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Unit-testing-jar-request-tp16475p18801.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Hi,
we are Spark users and we use some of Spark's test classes for our own application
unit tests. We use LocalSparkContext and SharedSparkContext. But these classes
are not included in the spark-core library. That is a sensible choice, since it's not a
good idea to include test classes in the runtime jar.
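Until a test-jar is available, one workaround is a small trait of our own that mirrors what SharedSparkContext does. This is a minimal sketch, assuming ScalaTest is on the test classpath; the trait name and setup are illustrative, not Spark's actual test code:

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, Suite}

// Minimal stand-in for SharedSparkContext: starts one local SparkContext per
// suite and stops it afterwards. Names and defaults here are illustrative.
trait LocalSharedSparkContext extends BeforeAndAfterAll { self: Suite =>
  @transient private var _sc: SparkContext = _
  def sc: SparkContext = _sc

  override def beforeAll(): Unit = {
    super.beforeAll()
    val conf = new SparkConf().setMaster("local[2]").setAppName(suiteName)
    _sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    try {
      if (_sc != null) {
        _sc.stop()
        _sc = null
      }
    } finally {
      super.afterAll()
    }
  }
}

A suite would then just mix it in, e.g. class MyJobSuite extends FunSuite with LocalSharedSparkContext, and use sc in its tests.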