Yes. I am reading thousands of files every hour. Is there any way I can
tell Spark to time out?
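
For reference, the read timeouts live in the Hadoop S3 connector rather
than in Spark itself, so they are set as Hadoop properties. A minimal
sketch, assuming the s3a connector; the exact property names and defaults
depend on your Hadoop version:

    import org.apache.spark.{SparkConf, SparkContext}

    // Hadoop properties can be passed through Spark via the "spark.hadoop."
    // prefix. These s3a settings bound how long a read may sit before the
    // client gives up (values are in milliseconds).
    val conf = new SparkConf()
      .setAppName("s3-reader")
      .set("spark.hadoop.fs.s3a.connection.timeout", "60000")           // socket read timeout
      .set("spark.hadoop.fs.s3a.connection.establish.timeout", "10000") // connect timeout
      .set("spark.hadoop.fs.s3a.attempts.maximum", "5")                 // retries before failing
    val sc = new SparkContext(conf)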
Thanks for your help.

-D

On Mon, Dec 22, 2014 at 4:57 AM, Shuai Zheng <szheng.c...@gmail.com> wrote:

> Is it possible that too many connections are open to S3 from one node? I
> have hit this issue before when I opened a few hundred files on S3 from
> one node. It just blocked without any error until it timed out later.
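>
> One way to bound that (a sketch, again assuming the s3a connector is in
> use): cap the connector's HTTP connection pool and reduce how many files
> are read concurrently per node.
>
>     // Cap the S3A client's connection pool per JVM.
>     sc.hadoopConfiguration.set("fs.s3a.connection.maximum", "50")
>
>     // Concurrent S3 reads roughly track the number of running tasks, so
>     // coalescing to fewer partitions than total executor cores caps how
>     // many streams are open at once. Bucket and path are hypothetical.
>     val lines = sc.textFile("s3a://some-bucket/path/*")
>       .coalesce(200)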
>
> On Monday, December 22, 2014, durga <durgak...@gmail.com> wrote:
>
>> Hi All,
>>
>> I am sporadically facing a strange issue: occasionally my Spark job
>> hangs while reading S3 files. It does not throw an exception or make any
>> progress; it just hangs there.
>>
>> Is this a known issue? Please let me know how I can solve it.
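>>
>> One mitigation I have seen suggested elsewhere (an assumption on my
>> part, not something confirmed in this thread) is speculative execution,
>> so that a task stuck on a hung S3 read is relaunched on another executor:
>>
>>     // Relaunch straggler tasks that run far slower than the median;
>>     // a copy of the hung read then proceeds on a different executor.
>>     val conf = new SparkConf()
>>       .set("spark.speculation", "true")
>>       .set("spark.speculation.multiplier", "2")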
>>
>> Thanks,
>> -D
>>
>>
>>
