Hi Aish,

Deprecating %dep was the plan before. %dep loads libraries into the thread
context classloader after the JVM process has been created. However, in a
few places Spark tries to find classes via the system classloader, not the
thread context classloader. That was the reason we introduced the new
dependency loading mechanism (through the interpreter GUI), which puts
libraries on the JVM classpath at process creation, and deprecated %dep.
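For context, the mismatch looks roughly like this (a minimal Java sketch,
not Zeppelin's actual code; the jar URL is omitted as a placeholder):

```java
import java.net.URL;
import java.net.URLClassLoader;

public class DepClassLoaderSketch {
    public static void main(String[] args) {
        // %dep-style loading: wrap the current context classloader in a new
        // URLClassLoader that knows about the extra jar (URLs omitted here).
        ClassLoader parent = Thread.currentThread().getContextClassLoader();
        URLClassLoader depLoader = new URLClassLoader(new URL[0], parent);
        Thread.currentThread().setContextClassLoader(depLoader);

        // Code that resolves classes via the thread context classloader
        // sees the newly added loader:
        System.out.println(
            Thread.currentThread().getContextClassLoader() == depLoader); // true

        // But code that asks the system classloader bypasses it entirely,
        // which is why some Spark code paths could not find %dep-loaded
        // classes:
        System.out.println(
            ClassLoader.getSystemClassLoader() == depLoader); // false
    }
}
```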

But since then, we've received feedback from multiple users that they like
the %dep approach, for a few reasons:

- %dep self-documents a notebook's library requirements.
- %dep allows per-note (and possibly per-user) library loading.

And I agree. So it makes sense to keep %dep unless we have another
alternative.

Thanks,
moon

On Fri, Nov 4, 2016 at 12:53 PM Aish Fenton <afen...@netflix.com> wrote:

> Hi all,
> Can someone clear up if %dep is going to be deprecated? From the docs and
> warning messages it sounds like that's still the plan. But from an in
> person conversation I've had with Moon, it sounded like maybe not (although
> this was a while back)?
>
> If it's deprecated, that'll leave a big gap for us. Our typical workflow
> involves us exchanging notebooks (export/import of the json) fairly
> regularly. And every notebook has different library dependencies. Being
> able to specify these inline, in the notebook, is really useful.
>
> I remember the concern with %dep was with adding jars dynamically? If this
> is still true, then maybe a compromise would be to keep a way of
> documenting the jar dependencies per notebook, but when the user loads that
> notebook prompt them with "Jar X is missing, click yes to add to your Spark
> interpreter dependencies". So have it function more like an automatic way
> to fill in the GUI dependencies settings, but at least it keeps the
> self-documenting functionality too.
>
> Aish
>
