Thanks, Marco.

The code snippet looks something like this:

// imports needed: java.net.URL, java.util.Enumeration
ClassLoader cl = Thread.currentThread().getContextClassLoader();
// getResources expects a '/'-separated resource path, not a '.'-separated package name
String packagePath = "com.xxx.xxx".replace('.', '/');
final Enumeration<URL> resources = cl.getResources(packagePath);

The resulting resources enumeration is always empty, so none of the classes
under that package can be found.
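
For reference, here is a minimal, self-contained version of the lookup; the
package and class names are illustrative, not our real ones:

import java.net.URL;
import java.util.Enumeration;

public class PackageScan {
    public static void main(String[] args) throws Exception {
        // Same pattern as above: use the context class loader, as inside a Spark executor.
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        // Resource lookups take '/'-separated paths, not '.'-separated package names.
        String packagePath = "com.example.schemas".replace('.', '/');
        Enumeration<URL> resources = cl.getResources(packagePath);
        while (resources.hasMoreElements()) {
            // Prints entries such as jar:file:/path/to/some.jar!/com/example/schemas
            System.out.println(resources.nextElement());
        }
    }
}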

As I mentioned in my original email, it works when I include those classes
in the fat jar. Our use case is that each team creates its own jar
containing its own protobuf schema classes, so I cannot realistically
build one fat jar that bundles every class for every environment.

Chen


On Thu, Jul 7, 2016 at 12:18 PM Marco Mistroni <mmistr...@gmail.com> wrote:

> Hi Chen
>  pls post
> 1. the code snippet
> 2. the exception
>
> any particular reason why you need to load classes in other jars
> programmatically?
>
> Have you tried to build a fat jar with all the dependencies ?
>
> hth
> marco
>
> On Thu, Jul 7, 2016 at 5:05 PM, Chen Song <chen.song...@gmail.com> wrote:
>
>> Sorry to spam people who aren't interested. I'd greatly appreciate it if
>> anyone who is familiar with this could share some insights.
>>
>> On Wed, Jul 6, 2016 at 2:28 PM Chen Song <chen.song...@gmail.com> wrote:
>>
>>> Hi
>>>
>>> I ran into problems using the class loader in Spark. In my code (run within
>>> an executor), I explicitly load classes using the context class loader, as below.
>>>
>>> Thread.currentThread().getContextClassLoader()
>>>
>>> The jar containing the classes to be loaded is added via the --jars
>>> option in spark-shell/spark-submit.
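>>>
>>> A concrete sketch of what we attempt on the executor (class and jar names
>>> are illustrative, not our real ones):
>>>
>>> ClassLoader loader = Thread.currentThread().getContextClassLoader();
>>> Class<?> clazz = Class.forName("com.example.schemas.SomeProto", true, loader);
>>>
>>> and the jar is added along the lines of:
>>>
>>> spark-submit --jars /path/to/team-schemas.jar --class com.example.Main main-job.jar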
>>>
>>> I always get a ClassNotFoundException. However, it seems to work if
>>> I compile these classes into the main jar for the job (the jar containing the
>>> main job class).
>>>
>>> I know Spark implements its own class loaders in a particular way. Is
>>> there a way to work around this? In other words, what is the proper way to
>>> programmatically load classes in other jars added via --jars in Spark?
>>>
>>>
>
