2014-08-16 15:53 GMT+08:00 Sandy Ryza <[email protected]>:
> an occur if the queue do
Thank you for your feedback. My Spark version is 1.0.2, Hadoop 2.4.1.
I found the relevant code:
val queueInfo: QueueInfo = super.getQueueInfo(args.amQueue)
logInfo("""Queue info ... queueName: %s, queueCurrentCapacity: %s,
    queueMaxCapacity: %s, queueApplicationCount = %s,
    queueChildQueueCount = %s""".format(
  queueInfo.getQueueName,
  queueInfo.getCurrentCapacity,
  queueInfo.getMaximumCapacity,
  queueInfo.getApplications.size,
  queueInfo.getChildQueues.size))
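As far as I can tell, every field access above assumes getQueueInfo returned a non-null QueueInfo, which may not hold when the queue name does not exist. A defensive sketch of my own (this is not Spark's actual code; QueueInfo here is a stand-in case class, and describeQueue is a hypothetical helper) would guard the null before dereferencing:

```scala
// Sketch (my own, not Spark's fix): guard against getQueueInfo
// returning null for a non-existent queue before dereferencing it.
object QueueInfoGuard {
  // Stand-in for YARN's QueueInfo; the real client call may return null.
  case class QueueInfo(name: String)

  def describeQueue(queueInfo: QueueInfo): String =
    Option(queueInfo) match {
      case Some(info) => s"Queue info ... queueName: ${info.name}"
      case None       => "Queue not found; check the --queue argument"
    }

  def main(args: Array[String]): Unit = {
    println(describeQueue(QueueInfo("sls_queue_1")))
    println(describeQueue(null))
  }
}
```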
If I specify the queue name explicitly, the errors disappear. For example, with --queue sls_queue_1:
../bin/spark-submit --class org.apache.spark.examples.JavaWordCount \
--master yarn \
--deploy-mode cluster \
--queue sls_queue_1 \
--verbose \
--num-executors 3 \
--driver-memory 6g \
--executor-memory 6g \
--executor-cores 3 \
../lib/spark-examples*.jar \
/user/www/tmp/audi/*
However, setting the queue in spark-env.sh does not take effect:
export HADOOP_CONF_DIR=/usr/local/webserver/hadoop-2.4.1/etc/hadoop/
export HADOOP_HOME=/usr/local/webserver/hadoop-2.4.1/
export JAVA_HOME=/usr/local/webserver/jdk1.7.0_67/
export SPARK_YARN_QUEUE=sls_queue_1
export YARN_CONF_DIR=$HADOOP_CONF_DIR
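In case SPARK_YARN_QUEUE is simply not being read in this code path (my speculation; I have not traced where the env var is consumed in 1.0.2), another way to pin the queue is the spark.yarn.queue property, e.g. in conf/spark-defaults.conf:

```
# Hypothetical workaround, not verified on 1.0.2: set the queue as a
# Spark property instead of an environment variable.
spark.yarn.queue  sls_queue_1
```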
--
[email protected]|齐忠