So we have an extra jar that needs to be loaded in the Spark executor runtime before log4j loads (yes, you've guessed it, it's a custom appender!). We've tried putting it in spark-defaults.conf and restarting our application, but it didn't work.
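
For reference, this is roughly what we had in spark-defaults.conf (the jar path is just illustrative):

    # conf/spark-defaults.conf (illustrative path)
    spark.executor.extraClassPath   /opt/jars/custom-appender.jar
    spark.driver.extraClassPath     /opt/jars/custom-appender.jar
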
I'm kinda leery of putting the user classpath first: is that still very experimental? I can see how it could cause problems if we mask any of the runtime libraries Spark itself depends on.
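
To be concrete, the setting I'm thinking of is something like this (the exact property name may depend on the Spark version):

    # give user jars precedence over Spark's own classpath
    spark.executor.userClassPathFirst   true
    spark.driver.userClassPathFirst     true
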
We could put it in spark-env.sh instead, but does that mean we have to restart Spark itself for it to take effect?
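
I'm imagining something along these lines, assuming the old SPARK_CLASSPATH mechanism still applies (path illustrative):

    # conf/spark-env.sh on each worker node
    export SPARK_CLASSPATH=/opt/jars/custom-appender.jar
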
Finally, we could pass it in the Java options when launching our application, but for our purposes that should be equivalent to spark-defaults.conf, and the conf file is the more architecturally correct place for it.
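
That is, something like the following at submit time (class and jar names are placeholders for our actual app):

    # equivalent to the spark-defaults.conf entries above
    spark-submit \
      --conf spark.executor.extraClassPath=/opt/jars/custom-appender.jar \
      --conf spark.driver.extraClassPath=/opt/jars/custom-appender.jar \
      --class com.example.OurApp our-app.jar
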
Generally, any advice on getting custom appenders into log4j on the workers?

Victor
