OK... but my question is why spark.shuffle.consolidateFiles isn't working (or
is it?). Is this a bug?


On Wed, Jul 30, 2014 at 4:29 PM, Larry Xiao <xia...@sjtu.edu.cn> wrote:

> Hi Jianshi,
>
> I've met similar situation before.
> My solution was 'ulimit'. You can use:
>
>   ulimit -a          # show your current settings
>   ulimit -n <limit>  # set the open-files limit
>
> (other limits can be set the same way). I set -n to 10240.
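>
> One caveat on YARN: a limit raised in your login shell does not necessarily
> propagate to the executor containers. If in doubt, you can check what the
> executor JVM actually sees -- a small sketch using the JDK's
> com.sun.management API:
>
>   import java.lang.management.ManagementFactory
>   import com.sun.management.UnixOperatingSystemMXBean
>
>   // Print the open/max file-descriptor counts for the current JVM.
>   ManagementFactory.getOperatingSystemMXBean match {
>     case os: UnixOperatingSystemMXBean =>
>       println(s"open fds: ${os.getOpenFileDescriptorCount} of " +
>         s"max ${os.getMaxFileDescriptorCount}")
>     case _ => println("fd counts not available on this JVM")
>   }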
>
> I see that spark.shuffle.consolidateFiles helps by reusing open files,
> though I don't know to what extent it helps.
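>
> (Roughly: without consolidation the hash shuffle writes one file per map
> task per reduce partition, so 1000 map tasks x 1000 reducers means about a
> million shuffle files; with consolidation it is closer to cores x reducers.)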
>
> Hope it helps.
>
> Larry
>
>
> On 7/30/14, 4:01 PM, Jianshi Huang wrote:
>
>> I'm using Spark 1.0.1 on Yarn-Client mode.
>>
>> sortByKey always fails with a FileNotFoundException whose message says
>> "too many open files".
>>
>> I already set spark.shuffle.consolidateFiles to true:
>>
>>   conf.set("spark.shuffle.consolidateFiles", "true")
>>
>> But it doesn't seem to be working. What are the other possible causes, and
>> how can I fix it?
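>>
>> For reference, a minimal sketch of what I'm running (app name and data are
>> made up; the flag is set before the SparkContext is created):
>>
>>   import org.apache.spark.{SparkConf, SparkContext}
>>   import org.apache.spark.SparkContext._  // pair-RDD implicits, needed on 1.0.x
>>
>>   val conf = new SparkConf()
>>     .setAppName("SortByKeyRepro")  // hypothetical app name
>>     .set("spark.shuffle.consolidateFiles", "true")
>>   val sc = new SparkContext(conf)
>>
>>   // A wide shuffle like this is where "too many open files" shows up.
>>   val pairs = sc.parallelize(1 to 1000000).map(i => (i % 1000, i))
>>   pairs.sortByKey().take(10).foreach(println)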
>>
>> Jianshi
>>
>> --
>> Jianshi Huang
>>
>> LinkedIn: jianshi
>> Twitter: @jshuang
>> Github & Blog: http://huangjs.github.com/
>>
>
>


-- 
Jianshi Huang

LinkedIn: jianshi
Twitter: @jshuang
Github & Blog: http://huangjs.github.com/
