Creating a custom classloader to load classes from those jars?
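A minimal sketch of the idea, in case it helps - a child-first URLClassLoader that looks in the extra jars before delegating to the parent (Spark's) classloader. The jar path and class name in the usage comment are only illustrative:

import java.net.{URL, URLClassLoader}

// Child-first classloader: classes found in the given jars win over the
// versions visible through the parent (Spark's) classloader.
class ChildFirstClassLoader(jars: Array[URL], parent: ClassLoader)
    extends URLClassLoader(jars, parent) {

  override def loadClass(name: String, resolve: Boolean): Class[_] =
    getClassLoadingLock(name).synchronized {
      val alreadyLoaded = findLoadedClass(name)
      val clazz =
        if (alreadyLoaded != null) alreadyLoaded
        else {
          try findClass(name) // look in the extra jars first
          catch {
            case _: ClassNotFoundException =>
              super.loadClass(name, false) // fall back to the parent classpath
          }
        }
      if (resolve) resolveClass(clazz)
      clazz
    }
}

// Usage (hypothetical path and class name):
// val loader = new ChildFirstClassLoader(
//   Array(new java.io.File("/opt/extra-jars/some-connector.jar").toURI.toURL),
//   Thread.currentThread().getContextClassLoader)
// val cls = loader.loadClass("com.example.SomeConflictingClass")

Note that Spark already offers a child-first mode for user jars via spark.driver.userClassPathFirst / spark.executor.userClassPathFirst, which may be enough without any custom code.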
On Thu, Oct 17, 2024, 19:47, Nimrod Ofek wrote:
>
> Hi,
>
> Thanks all for the replies.
>
> I am adding the Spark dev list as well - as I think this might be an issue
> that needs to be addressed.
>
> The options presented here will get the jars - but they don't help us with
> dependency conflicts...
Hi,
It's on you, as the maintainer of the derived image, to ensure that the
dependencies you add do not conflict with Spark's. Speaking from experience,
there are several ways to achieve this:
1. Ensure you're using artifacts that ship shaded and relocated copies of
their dependencies, if possible (see the sketch after this list).
2.
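When a pre-shaded artifact isn't available, you can relocate the conflicting packages yourself while building the jar you add to the image. A minimal build.sbt sketch using sbt-assembly (the relocated package names are only illustrative - relocate whatever actually clashes with Spark):

// project/plugins.sbt needs the sbt-assembly plugin:
// addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "<version>")

// Relocate Guava classes so they cannot clash with the Guava bundled with Spark.
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("com.google.common.**" -> "myshaded.com.google.common.@1").inAll
)

The relocation rewrites both the class files and the references to them, so the bundled copy and Spark's copy can coexist on the same classpath.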
Hi,
Thanks all for the replies.
I am adding the Spark dev list as well - as I think this might be an issue
that needs to be addressed.
The options presented here will get the jars - but they don't help us with
dependency conflicts...
For example - com.google.cloud.bigdataoss:gcs-connector:hado