How do I configure which files get uploaded to YARN containers? So far, I've
only seen "--conf spark.yarn.jar=hdfs://…", which lets me specify the HDFS
location of the Spark jar, but I'm not sure how to specify other files to
upload (e.g., spark-env.sh).
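
(For concreteness, my submit command so far looks roughly like the sketch
below. I'm guessing spark-submit's --files option is the knob for shipping
arbitrary files into the containers, but I haven't confirmed that, and all
paths and names here are made up.)

    # untested sketch; paths and file names are illustrative
    spark-submit \
      --master yarn-cluster \
      --conf spark.yarn.jar=hdfs:///libs/spark-assembly.jar \
      --files spark-env.sh,log4j.xml \
      my-app.jar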

mn

> On Nov 20, 2014, at 4:08 AM, Sean Owen <so...@cloudera.com> wrote:
> 
> I think the standard practice is to include your log config file among
> the files uploaded to YARN containers, and then set
> -Dlog4j.configuration=yourfile.xml in
> spark.{executor,driver}.extraJavaOptions?
> 
> http://spark.apache.org/docs/latest/running-on-yarn.html
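
Concretely, something like this? (Untested sketch based on the above; assumes
the config file is named log4j.xml and sits in the directory spark-submit is
run from.)

    # untested; file name is illustrative
    spark-submit \
      --master yarn-cluster \
      --files log4j.xml \
      --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.xml" \
      --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.xml" \
      my-app.jar

In yarn-cluster mode the driver runs in a container too, so the uploaded file
should be visible to it under the same name; in yarn-client mode the driver
side presumably needs a local path instead.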
> 
> On Thu, Nov 20, 2014 at 9:20 AM, Tobias Pfeiffer <t...@preferred.jp> wrote:
>> Hi,
>> 
>> I am using spark-submit to submit my application jar to a YARN cluster.  I
>> want to deliver a single jar file to my users, so I would like to avoid
>> telling them "also, please put that log4j.xml file somewhere and add that
>> path to the spark-submit command".
>> 
>> I thought it would be sufficient for my application jar file to contain a
>> log4j.xml file, but that does not seem to be the case.  If I don't add a
>> log4j.xml file to the classpath before launching spark-submit, the one
>> bundled with Spark will be used -- which has a negative influence on my
>> program execution.  Is there any way I can tell spark-submit to use the
>> log4j configuration bundled in my jar file?
>> 
>> Thanks
>> Tobias
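
One possible workaround for shipping only the jar (an untested sketch, and it
would only reconfigure the JVM it runs in, so executors would still need the
--files route): reset log4j at startup and point it at the resource bundled
inside the jar. This assumes log4j 1.x, which is what Spark bundles:

    import org.apache.log4j.LogManager
    import org.apache.log4j.xml.DOMConfigurator

    object Main {
      def main(args: Array[String]): Unit = {
        // Look up the log4j.xml bundled at the root of the application jar.
        val url = getClass.getResource("/log4j.xml")
        if (url != null) {
          LogManager.resetConfiguration() // drop the config picked up from Spark's jar
          DOMConfigurator.configure(url)  // apply the bundled one instead
        }
        // ... create the SparkContext and run the application ...
      }
    }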
> 

