Yes, but scalatest doesn't end up in compile scope, says Maven?

...

[INFO] +- org.apache.spark:spark-tags_2.11:jar:2.1.0-SNAPSHOT:compile
[INFO] |  +- (org.scalatest:scalatest_2.11:jar:2.2.6:test - scope managed from compile; omitted for duplicate)
[INFO] |  \- (org.spark-project.spark:unused:jar:1.0.0:compile - omitted for duplicate)
[INFO] +- org.apache.commons:commons-crypto:jar:1.0.0:compile
[INFO] +- org.spark-project.spark:unused:jar:1.0.0:compile
[INFO] +- org.scalatest:scalatest_2.11:jar:2.2.6:test

...
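
The "scope managed from compile" marker in the verbose tree means a dependencyManagement entry is overriding the compile scope that spark-tags declares. The same mechanism works in a downstream pom.xml if you'd rather manage the scope than add exclusions; a minimal sketch (version taken from the tree above, adjust to match your build):

    <dependencyManagement>
      <dependencies>
        <!-- force scalatest to test scope even when a transitive dependency declares it at compile scope -->
        <dependency>
          <groupId>org.scalatest</groupId>
          <artifactId>scalatest_2.11</artifactId>
          <version>2.2.6</version>
          <scope>test</scope>
        </dependency>
      </dependencies>
    </dependencyManagement>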

On Fri, Oct 28, 2016 at 8:52 PM Jeremy Smith <jeremy.sm...@acorns.com>
wrote:

> spark-core depends on spark-launcher (compile)
> spark-launcher depends on spark-tags (compile)
> spark-tags depends on scalatest (compile)
>
> To be honest I'm not all that familiar with the project structure - should
> I just exclude spark-launcher if I'm not using it?
>
> On Fri, Oct 28, 2016 at 12:27 PM, Sean Owen <so...@cloudera.com> wrote:
>
> It's required because the tags module uses it to define annotations for
> tests. I don't see it in compile scope for anything but the tags module,
> which is then in test scope for other modules. What are you seeing that
> makes you say it's in compile scope?
>
> On Fri, Oct 28, 2016 at 8:19 PM Jeremy Smith <jeremy.sm...@acorns.com>
> wrote:
>
> Hey everybody,
>
> Just a heads up that currently Spark 2.0.1 has a compile dependency on
> Scalatest 2.2.6. It comes from spark-core's dependency on spark-launcher,
> which has a transitive dependency on spark-tags, which has a compile
> dependency on Scalatest.
>
> This makes it impossible to use any other version of Scalatest for testing
> your app if you declare a dependency on any Spark 2.0.1 module; you'll get
> a bunch of runtime errors during testing (unless you figure out the reason
> like I did and explicitly exclude Scalatest from the spark dependency).
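>
> For anyone who wants the same workaround, a minimal sketch of that exclusion in a Maven pom.xml (assuming spark-core_2.11 is the Spark module you declare directly; adjust the artifact and version to match your build):
>
>     <dependency>
>       <groupId>org.apache.spark</groupId>
>       <artifactId>spark-core_2.11</artifactId>
>       <version>2.0.1</version>
>       <exclusions>
>         <!-- keep the transitive ScalaTest out of compile scope so the app can pick its own test framework version -->
>         <exclusion>
>           <groupId>org.scalatest</groupId>
>           <artifactId>scalatest_2.11</artifactId>
>         </exclusion>
>       </exclusions>
>     </dependency>
>
> After adding it, mvn dependency:tree -Dverbose -Dincludes=org.scalatest should show ScalaTest only where you declare it yourself.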
>
> I think that dependency should probably be moved to a test dependency
> instead.
>
> Thanks,
> Jeremy
>
>
>
