Does the SparkContext shut itself down by default even if I don't call
sc.stop() explicitly in my code? I ask because I ran the application
without sc.stop(), and I still get the "Filesystem closed" error along
with correct output.
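
In other words, should the shutdown be ordered explicitly like this? A
minimal sketch of what I mean (the ThreadedApp object and its jobs are
made up, not my actual code):

    import org.apache.spark.{SparkConf, SparkContext}

    object ThreadedApp {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("ThreadedApp"))
        // several threads submit jobs on the shared context
        val threads = (1 to 4).map { i =>
          new Thread(new Runnable {
            def run(): Unit = {
              val sum = sc.parallelize(1 to 1000).map(_ * i).reduce(_ + _)
              println(s"thread $i sum = $sum")
            }
          })
        }
        threads.foreach(_.start())
        threads.foreach(_.join())  // wait for all jobs to finish...
        sc.stop()                  // ...before stopping the context
      }
    }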

On Tue, Dec 2, 2014 at 2:20 PM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> It could be because those threads are finishing quickly.
>
> Thanks
> Best Regards
>
> On Tue, Dec 2, 2014 at 2:19 PM, rapelly kartheek <kartheek.m...@gmail.com>
> wrote:
>
>> But, somehow, if I run this application a second time, it executes and
>> produces the results regardless of the same errors in the logs.
>>
>> On Tue, Dec 2, 2014 at 2:08 PM, Akhil Das <ak...@sigmoidanalytics.com>
>> wrote:
>>
>>> Your code seems to have a lot of threads, and I think you might be
>>> invoking sc.stop() before those threads finish.
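>>>
>>> That is, roughly this pattern (a minimal sketch, not your actual code):
>>>
>>>     threads.foreach(_.start())
>>>     // sc.stop()  // too early here: jobs may still be running
>>>     threads.foreach(_.join())  // let every thread's jobs complete
>>>     sc.stop()                  // then stop the context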
>>>
>>> Thanks
>>> Best Regards
>>>
>>> On Tue, Dec 2, 2014 at 12:04 PM, Akhil Das <ak...@sigmoidanalytics.com>
>>> wrote:
>>>
>>>> What is the application that you are submitting? It looks like you might
>>>> have obtained an fs (FileSystem) handle inside the app and then closed it
>>>> within the app.
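>>>>
>>>> For example (a sketch, assuming the Hadoop FileSystem API and a
>>>> hypothetical path): FileSystem.get returns a cached instance shared
>>>> across the JVM, so closing it also closes the handle that Spark's
>>>> event logger writes through:
>>>>
>>>>     import org.apache.hadoop.conf.Configuration
>>>>     import org.apache.hadoop.fs.{FileSystem, Path}
>>>>
>>>>     val fs = FileSystem.get(new Configuration())   // shared, cached instance
>>>>     val found = fs.exists(new Path("/tmp/input"))  // hypothetical path
>>>>     // fs.close()  // avoid: closes the cached instance for everyone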
>>>>
>>>> Thanks
>>>> Best Regards
>>>>
>>>> On Tue, Dec 2, 2014 at 11:59 AM, rapelly kartheek <
>>>> kartheek.m...@gmail.com> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> I get the following exception when I submit a Spark application. The
>>>>> log file shows:
>>>>>
>>>>> 14/12/02 11:52:58 ERROR LiveListenerBus: Listener EventLoggingListener threw an exception
>>>>> java.io.IOException: Filesystem closed
>>>>>     at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:689)
>>>>>     at org.apache.hadoop.hdfs.DFSOutputStream.flushOrSync(DFSOutputStream.java:1668)
>>>>>     at org.apache.hadoop.hdfs.DFSOutputStream.hflush(DFSOutputStream.java:1629)
>>>>>     at org.apache.hadoop.hdfs.DFSOutputStream.sync(DFSOutputStream.java:1614)
>>>>>     at org.apache.hadoop.fs.FSDataOutputStream.sync(FSDataOutputStream.java:120)
>>>>>     at org.apache.spark.util.FileLogger$$anonfun$flush$2.apply(FileLogger.scala:158)
>>>>>     at org.apache.spark.util.FileLogger$$anonfun$flush$2.apply(FileLogger.scala:158)
>>>>>     at scala.Option.foreach(Option.scala:236)
>>>>>     at org.apache.spark.util.FileLogger.flush(FileLogger.scala:158)
>>>>>     at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:87)
>>>>>     at org.apache.spark.scheduler.EventLoggingListener.onJobEnd(EventLoggingListener.scala:112)
>>>>>     at org.apache.spark.scheduler.SparkListenerBus$$anonfun$postToAll$4.apply(SparkListenerBus.scala:52)
>>>>>     at org.apache.spark.scheduler.SparkListenerBus$$anonfun$postToAll$4.apply(SparkListenerBus.scala:52)
>>>>>     at org.apache.spark.scheduler.SparkListenerBus$$anonfun$foreachListener$1.apply(SparkListenerBus.scala:81)
>>>>>     at org.apache.spark.scheduler.SparkListenerBus$$anonfun$foreachListener$1.apply(SparkListenerBus.scala:79)
>>>>>     at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>>>>>     at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>>>>>     at org.apache.spark.scheduler.SparkListenerBus$class.foreachListener(SparkListenerBus.scala:79)
>>>>>     at org.apache.spark.scheduler.SparkListenerBus$class.postToAll(SparkListenerBus.scala:52)
>>>>>     at org.apache.spark.scheduler.LiveListenerBus.postToAll(LiveListenerBus.scala:32)
>>>>>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:56)
>>>>>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:56)
>>>>>     at scala.Option.foreach(Option.scala:236)
>>>>>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:56)
>>>>>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
>>>>>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
>>>>>     at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1160)
>>>>>     at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:46)
>>>>>
>>>>> Could someone please help me resolve this?
>>>>>
>>>>> Thanks
>>>>>
>>>>>
>>>>
>>>
>>
>
