In article <[EMAIL PROTECTED]> you write:
>On Fri, Apr 06, 2007 at 06:51:24PM -0500, Gabriel Dos Reis wrote:
>> David Daney <[EMAIL PROTECTED]> writes:
>> 
>> | One could argue that issuing some type of diagnostic (either at
>> | compile time or run time) would be helpful for people that don't
>> | remember to write correct code 100% of the time.
>> 
>> I raised this very issue a long time ago; a long-term GCC contributor
>> vocally opposed checking the possible overflow.  I hope something will
>> happen this time.
>
>I don't like slowing programs down, but I like security holes even less.
>
>If a check were to be implemented, the right thing to do would be to throw
>bad_alloc (for the default new) or return 0 (for the nothrow new).  If a
>process's virtual memory limit is 400M, we throw bad_alloc if we do new
>int[2000000], so we might as well do it if we do new int[2000000000].
>There would be no reason to use a different reporting mechanism.
>
>There might be rare cases where the penalty for this check could have
>an impact, like for pool allocators that are otherwise very cheap.
>If so, there could be a flag to suppress the check.
>
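The overflow check being discussed might be sketched as below. This is only an illustration, not GCC's actual implementation: in practice the compiler would emit the check inline for `new T[n]`, but the logic is the same. `checked_array_size` is a hypothetical helper name; it throws `std::bad_alloc` when `n * sizeof(T)` would wrap around `size_t`, using the same reporting path as an ordinary allocation failure.

```cpp
#include <cstddef>
#include <limits>
#include <new>

// Hypothetical sketch of the check the compiler could emit for `new T[n]`:
// if n * sizeof(T) would overflow size_t, throw std::bad_alloc (for the
// default new) rather than silently wrapping and allocating a block that
// is far smaller than the caller asked for.
template <typename T>
std::size_t checked_array_size(std::size_t n) {
    if (n > std::numeric_limits<std::size_t>::max() / sizeof(T))
        throw std::bad_alloc();  // same reporting mechanism as normal failure
    return n * sizeof(T);        // safe: cannot overflow past this point
}
```

Dividing the maximum `size_t` by `sizeof(T)` avoids doing the multiplication before the check, so the test itself cannot overflow.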


Since the issue only affects new [], I'd assume anyone who wants the faster
(but less correct) behavior would simply do the size computation themselves,
deal with the overflow manually, and then override whatever operator or class
they need to make things work.
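Doing it by hand might look like the sketch below. `make_int_array` is a hypothetical helper, not an interface anyone has proposed: the caller performs the overflow check itself and then calls the raw (nothrow) `operator new`, so no compiler-inserted check is needed and the caller decides how overflow and allocation failure are reported.

```cpp
#include <cstddef>
#include <cstdint>
#include <new>

// Hypothetical sketch of handling the array-size computation manually:
// check for overflow up front, then allocate with nothrow operator new
// so both failure modes are reported the same way (a null pointer).
int* make_int_array(std::size_t n) {
    if (n > SIZE_MAX / sizeof(int))
        return nullptr;  // overflow handled by the caller, not the compiler
    void* p = ::operator new(n * sizeof(int), std::nothrow);
    return static_cast<int*>(p);  // release later with ::operator delete
}
```

A class that wants this behavior everywhere could instead override its own `operator new[]` with the same check, which is presumably what "override whatever operator/class they need" refers to.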
