Hi Chen,

please post:
1. the code snippet
2. the exception

Any particular reason why you need to load classes from other jars
programmatically?

Have you tried building a fat jar with all the dependencies?
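fwiw, if you do need to load by name, a common workaround (not from your
thread, just a sketch -- `java.util.ArrayList` below is only a stand-in for
your own class name) is to fall back to the loader that defined your own
class when the context class loader can't resolve the name:

```java
public class LoaderDemo {

    // Try the context class loader first (Spark sets one per task),
    // then fall back to the loader that defined this class.
    static Class<?> loadByName(String name) throws ClassNotFoundException {
        ClassLoader ctx = Thread.currentThread().getContextClassLoader();
        if (ctx != null) {
            try {
                return Class.forName(name, true, ctx);
            } catch (ClassNotFoundException e) {
                // fall through to the defining loader below
            }
        }
        return Class.forName(name, true, LoaderDemo.class.getClassLoader());
    }

    public static void main(String[] args) throws Exception {
        // A JDK class resolves through either loader, so this always works.
        System.out.println(loadByName("java.util.ArrayList").getName());
    }
}
```

This is only a sketch of the fallback pattern; whether the context loader
actually sees your --jars classes depends on how Spark wires its loaders.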

hth
marco

On Thu, Jul 7, 2016 at 5:05 PM, Chen Song <chen.song...@gmail.com> wrote:

> Sorry to spam people who are not interested. I'd greatly appreciate it if
> anyone familiar with this could share some insights.
>
> On Wed, Jul 6, 2016 at 2:28 PM Chen Song <chen.song...@gmail.com> wrote:
>
>> Hi
>>
>> I ran into problems using the class loader in Spark. In my code (which
>> runs within an executor), I explicitly load classes using the
>> ContextClassLoader, as below:
>>
>> Thread.currentThread().getContextClassLoader()
>>
>> The jar containing the classes to be loaded is added via the --jars
>> option in spark-shell/spark-submit.
>>
>> I always get a ClassNotFoundException. However, it seems to work if I
>> compile these classes into the main jar for the job (the jar containing
>> the main job class).
>>
>> I know Spark implements its own class loaders in a particular way. Is
>> there a way to work around this? In other words, what is the proper way to
>> programmatically load classes in other jars added via --jars in Spark?
>>
>>
