Eren Avsarogullari created SPARK-17663:
------------------------------------------
Summary: SchedulableBuilder should handle invalid data access via
scheduler.allocation.file
Key: SPARK-17663
URL: https://issues.apache.org/jira/browse/SPARK-17663
Project: Spark
Issue Type: Bug
Components: Scheduler
Affects Versions: 2.0.1
Reporter: Eren Avsarogullari
If spark.scheduler.allocation.file contains an invalid minShare and/or weight, the
call to toInt throws a NumberFormatException and SparkContext cannot be
initialized. Currently, if schedulingMode does not have a valid value, a warning
message is logged and the default value (FIFO) is used. The same pattern can be
applied to minShare (default: 0) and weight (default: 1) as well.
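A minimal sketch of the proposed fallback behavior, assuming a hypothetical
helper (getIntValue is not an existing Spark API; in FairSchedulableBuilder the
warning would go through logWarning rather than println):

```scala
object SchedulableBuilderSketch {
  // Parse an integer property from the allocation file; on invalid input,
  // log a warning and fall back to the default instead of letting
  // NumberFormatException abort SparkContext initialization.
  def getIntValue(propertyName: String, data: String, defaultValue: Int): Int = {
    try {
      data.trim.toInt
    } catch {
      case _: NumberFormatException =>
        println(s"Error while loading allocation file: $propertyName has an " +
          s"invalid value ('$data'), using the default value $defaultValue")
        defaultValue
    }
  }
}
```

With this pattern, getIntValue("weight", "invalid_weight", 1) would warn and
return 1 instead of failing, mirroring the existing schedulingMode handling.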
Reproduction code:
val conf = new SparkConf().setAppName("spark-fairscheduler").setMaster("local")
conf.set("spark.scheduler.mode", "FAIR")
conf.set("spark.scheduler.allocation.file", "src/main/resources/fairscheduler-invalid-data.xml")
val sc = new SparkContext(conf)
fairscheduler-invalid-data.xml:
<allocations>
  <pool name="production">
    <schedulingMode>FIFO</schedulingMode>
    <weight>invalid_weight</weight>
    <minShare>2</minShare>
  </pool>
</allocations>
Stacktrace:
Exception in thread "main" java.lang.NumberFormatException: For input string: "invalid_weight"
	at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
	at java.lang.Integer.parseInt(Integer.java:580)
	at java.lang.Integer.parseInt(Integer.java:615)
	at scala.collection.immutable.StringLike$class.toInt(StringLike.scala:272)
	at scala.collection.immutable.StringOps.toInt(StringOps.scala:29)
	at org.apache.spark.scheduler.FairSchedulableBuilder$$anonfun$org$apache$spark$scheduler$FairSchedulableBuilder$$buildFairSchedulerPool$1.apply(SchedulableBuilder.scala:127)
	at org.apache.spark.scheduler.FairSchedulableBuilder$$anonfun$org$apache$spark$scheduler$FairSchedulableBuilder$$buildFairSchedulerPool$1.apply(SchedulableBuilder.scala:102)
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)