On 29 Sep 2016, at 10:37, Olivier Girardot <o.girar...@lateral-thoughts.com> wrote:
> I know that the code itself would not be the same, but it would be useful to
> at least have the pom/build.sbt transitive dependencies different when fetching
> the artifact with a specific classifier, don't you think?
No, I think that's what dependencyManagement (or equivalent) is definitely for.
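
For sbt, the rough equivalent is dependencyOverrides; a minimal build.sbt sketch, assuming the Spark 2.0.0 from the thread and a hypothetical target of Hadoop 2.7.3:

// dependencyOverrides forces the version resolved for a transitive
// dependency, much as Maven's <dependencyManagement> pins versions
// without adding new direct dependencies.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"

// Hadoop 2.7.3 is an assumed target version, not one from the thread.
dependencyOverrides += "org.apache.hadoop" % "hadoop-client" % "2.7.3"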
On Thu, Sep 29, 2016 at 5:37 AM, Olivier Girardot wrote:
> I know that the code itself would not be the same, but it would be useful to
> at least have the pom/build.sbt transitive dependencies different when fetching
> the artifact with a specific classifier, don't you think? For now I've
> overridden them myself using the dependency versions defined in the pom.
I guess I'm claiming the artifacts wouldn't even be different in the first
place, because the Hadoop APIs that are used are all the same across these
versions. Differing artifacts would be the only thing that would make you need
multiple versions of the artifact under multiple classifiers.
On Wed, Sep 28, 2016 at 1:16 PM, Olivier Girardot wrote:
ok, don't you think it could be published with just different classifiers?
- hadoop-2.6
- hadoop-2.4
- hadoop-2.2 (being the current default)
So for now, I should just override Spark 2.0.0's dependencies with the ones
defined in the pom profile?
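
Purely to make the proposal concrete (the classifiers are hypothetical; Spark does not publish hadoop-* classified artifacts), consuming one from build.sbt might look like:

// Hypothetical: a "hadoop-2.6" classifier does not exist on Maven
// Central; this only sketches what the proposal would mean for users.
libraryDependencies +=
  ("org.apache.spark" %% "spark-core" % "2.0.0").classifier("hadoop-2.6")

Note that Maven and Ivy resolve the same pom for every classifier of an artifact, which is the limitation raised above: the transitive dependencies could not differ per classifier.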
On Thu, Sep 22, 2016 at 11:17 AM, Sean Owen <so...@cloudera.com> wrote:
There can be just one published version of the Spark artifacts, and they
have to depend on something, though in truth they'd be binary-compatible
with anything Hadoop 2.2+. So you merely manage the dependency versions up
to the desired version in your build.
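
In sbt, that managing up can be as simple as declaring the desired Hadoop version directly; a sketch, where 2.6.4 is an assumed example rather than a version from the thread:

// Declaring hadoop-client directly bumps the Hadoop 2.2 version that
// spark-core 2.0.0 pulls in transitively: sbt/Ivy's default
// latest-revision conflict manager resolves to the higher version.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.0",
  "org.apache.hadoop" % "hadoop-client" % "2.6.4"  // assumed target
)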
On Thu, Sep 22, 2016 at 7:05 AM, Olivier Girardot wrote: