Hi Patrick,

Thanks for all the explanations, that makes sense. @DeveloperApi
worries me a little bit especially because of the things Colin
mentions - it's sort of hard to make people move off of APIs, or
support different versions of the same API. But maybe if expectations
(or lack thereof) are set up front, there will be fewer issues.

You mentioned something in your shading argument that kinda reminded
me of something. Spark currently depends on slf4j implementations and
log4j with "compile" scope. I'd argue that's the wrong approach if
we're talking about Spark being used embedded inside applications;
Spark should only depend on the slf4j API package, and let the
application provide the underlying implementation.
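Concretely, that would look something like this in the build (an
illustrative pom.xml sketch, not Spark's actual build file; version
numbers are placeholders):

```xml
<!-- Hypothetical sketch: depend only on the SLF4J API at compile
     scope, and mark the concrete binding as provided so embedding
     applications supply their own backend. -->
<dependencies>
  <!-- Spark code compiles against the SLF4J API only. -->
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.x</version>
  </dependency>
  <!-- The concrete binding comes from the embedding application
       (or is added back explicitly when building the assembly). -->
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.7.x</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.x</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
```

With "provided" scope the binding is still available for Spark's own
tests and builds, but doesn't leak transitively into applications that
depend on Spark.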

The assembly jars could include an implementation (since I assume
those are currently targeted at cluster deployment and not embedding).

That way there are fewer sources of conflict at runtime (i.e. the
"multiple implementation jars" messages you can see when running some
Spark programs).

On Fri, May 30, 2014 at 10:54 PM, Patrick Wendell <pwend...@gmail.com> wrote:
> 2. Many libraries like logging subsystems, configuration systems, etc
> rely on static state and initialization. I'm not totally sure how e.g.
> slf4j initializes itself if you have both a shaded and non-shaded copy
> of slf4j present.

-- 
Marcelo
