Hi,

Thanks for the suggestion -- but those classpath config options only affect 
the driver and executor processes, not the standalone mode daemons (master 
and slave). Incidentally, we already have the extra jars we need set there.

I went through the docs but couldn't find a place to set an extra classpath 
for the daemons.
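
For reference, this is the kind of thing we already have in place for the 
driver and executor (a sketch of conf/spark-defaults.conf; the EmrFS jar 
path below is illustrative, not confirmed):

```properties
# conf/spark-defaults.conf -- illustrative; jar directory is an assumption
spark.driver.extraClassPath    /usr/share/aws/emr/emrfs/lib/*
spark.executor.extraClassPath  /usr/share/aws/emr/emrfs/lib/*
```

These take effect for driver and executor JVMs only, which is exactly the 
limitation described above -- the master and slave daemon JVMs don't read 
them.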

M

> On Nov 18, 2015, at 1:19 AM, "memorypr...@gmail.com" <memorypr...@gmail.com> 
> wrote:
> 
> Have you tried using 
> spark.driver.extraClassPath
> and 
> spark.executor.extraClassPath
> 
> ?
> 
> AFAICT these config options replace SPARK_CLASSPATH. Further info in the 
> docs. I've had good luck with these options, and for ease of use I just set 
> them in the spark defaults config.
> 
> https://spark.apache.org/docs/latest/configuration.html
> 
>> On Tue, 17 Nov 2015 at 21:06 Michal Klos <michal.klo...@gmail.com> wrote:
>> Hi,
>> 
>> We are running a Spark Standalone cluster on EMR (note: not using YARN) and 
>> are trying to use S3 w/ EmrFS as our event logging directory.
>> 
>> We are having difficulties with a ClassNotFoundException on EmrFileSystem 
>> when we navigate to the event log screen. This is to be expected, as the 
>> EmrFS jars are not on the classpath.
>> 
>> But -- I have not been able to figure out a way to add additional classpath 
>> jars to the start-up of the Master daemon. SPARK_CLASSPATH has been 
>> deprecated, and looking around at spark-class, etc., everything seems to be 
>> pretty locked down. 
>> 
>> Do I have to shove everything into the assembly jar?
>> 
>> Am I missing a simple way to add classpath to the daemons?
>> 
>> thanks,
>> Michal
