Hi,

My Spark job is failing while pulling the properties file from HDFS. The same code runs fine when I run it on Windows, but it fails when I test it on YARN.

*spark-submit script:*

spark-submit --class com.mcd.sparksql.datahub.DataMarts --master local[*] gdwspark.jar hdfs://10.1.144.79:8020/daas/spark/Daas.properties

*Exception (the place where the code is failing on YARN):*

16/07/26 18:48:11 INFO InputStream$: File SystemDFS[DFSClient[clientName=DFSClient_NONMAPREDUCE_1681311895_1, ugi=hadoop (auth:SIMPLE)]]
16/07/26 18:48:11 INFO InputStream$: File Path --> hdfs://10.1.144.79:8020/daas/spark/Daas.properties
Exception in thread "main" java.io.IOException: Filesystem closed
        at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:808)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:868)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:934)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at java.util.Properties$LineReader.readLine(Properties.java:434)
        at java.util.Properties.load0(Properties.java:353)
        at java.util.Properties.load(Properties.java:341)
        at com.mcd.sparksql.util.DaasUtil$.get(DaasUtil.scala:44)
        at com.mcd.sparksql.datahub.DataMarts$.main(DataMarts.scala:25)
        at com.mcd.sparksql.datahub.DataMarts.main(DataMarts.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

*Code:*

    logger.info("File System" + fs)
    try {
      val p = new Path(fname)
      logger.info("File Path --> " + p)
      // var br = new BufferedReader(new FileReader(fname))
      fs.open(p)   // <-- this is the line that fails on YARN
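
For reference, a workaround sketch I am considering, assuming the problem is that the JVM-wide cached FileSystem instance (what `FileSystem.get` returns) is being closed somewhere else in the job or by Spark's shutdown handling. `FileSystem.newInstance` bypasses that cache, so the reader owns a private handle that nothing else can close. The `loadProps` helper name is just for illustration:

```scala
import java.net.URI
import java.util.Properties
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// Load a properties file from HDFS using a private FileSystem instance.
// FileSystem.get() hands back a cached, shared instance; if any other code
// closes it, later reads fail with "java.io.IOException: Filesystem closed".
// FileSystem.newInstance() skips the cache, so this handle is ours alone
// (and we are responsible for closing it).
def loadProps(fname: String): Properties = {
  val conf = new Configuration()
  val fs = FileSystem.newInstance(new URI(fname), conf)
  val props = new Properties()
  try {
    val in = fs.open(new Path(fname))
    try props.load(in) finally in.close()
  } finally {
    fs.close() // safe: this instance is not shared with Spark
  }
  props
}
```

An alternative would be to set `fs.hdfs.impl.disable.cache=true` on the `Configuration`, which makes `FileSystem.get` return uncached instances as well.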

Any ideas on how to resolve this?

Thanks,
Asmath.
