Hi!

> I think there is a confusion about the "servers written in PHP". Those
> applications serve multiple requests within a single (main) PHP request
> using the event loop. Good examples of that are Aerys or ReactPHP. So
> we don't want to kill that main request if one of the handled requests
> is malicious (ideally we just ignore that malicious request and serve
> the others).

That can't work well. PHP makes a lot of assumptions about short-lived
requests, and assuming the same request lives forever is bound to cause
a lot of issues - from stale objects sitting around, to suboptimal
memory use, to eventual overflows in all kinds of counters, etc. Why not
use real servers for server work?
You yourself say you'll have it behind nginx or so - so why not let
nginx do the server part?
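
For reference, a minimal sketch (plain PHP with stream_select(), not
actual ReactPHP or Aerys code) of the kind of single-process event-loop
server the quoted text describes - one long-running "main" PHP request
multiplexing many client requests. Note that a fatal error anywhere in
the handler takes every in-flight connection down with it:

<?php
// Sketch only: one PHP process accepts many clients and serves them all
// from a single event loop, i.e. a single long-running PHP request.

// Stand-in for the application's request handler. If anything in here
// raises a fatal error (E_ERROR) - string overflow, out of memory, ... -
// the whole loop, and every other in-flight client, dies with it.
$handle = function (string $raw): string {
    return "hello\n";
};

$server  = stream_socket_server('tcp://127.0.0.1:8080', $errno, $errstr);
$clients = [];

while (true) {
    $read   = array_merge([$server], $clients);
    $write  = null;
    $except = null;
    if (stream_select($read, $write, $except, null) === false) {
        break;
    }
    foreach ($read as $stream) {
        if ($stream === $server) {
            // New connection: add it to the set watched by the loop.
            $clients[] = stream_socket_accept($server);
            continue;
        }
        $request = (string) fread($stream, 8192);
        $body    = $handle($request);
        fwrite($stream, "HTTP/1.1 200 OK\r\nContent-Length: "
            . strlen($body) . "\r\nConnection: close\r\n\r\n" . $body);
        fclose($stream);
        $key = array_search($stream, $clients, true);
        if ($key !== false) {
            unset($clients[$key]);
        }
    }
}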

> We have several hundred places in PHP where a fatal error (E_ERROR or
> E_CORE_ERROR) can be produced. Of course, some of them are
> compile-time, but not all of them. E.g., various string overflow
> scenarios can be triggered by combining or processing strings (such as
> uncompressing or decoding various formats). If your server relies on a
> proxy to filter out all fatal error scenarios, I'm afraid it's much
> harder than it seems.

I of course do not propose to make nginx filter JSON data. On the
contrary, I propose that when we have a clearly malicious DoS attempt
(like a string overflow or a hash collision overflow) we fail fast
instead of trying to make it right and playing whack-a-mole. We have
enough fails in this area already.
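
To illustrate the kind of input path I mean, here is a sketch of a
handler (in the style of the $handle stand-in in the earlier sketch)
where decompressing and decoding untrusted data can balloon far past
what the request itself suggests, together with a userland analogue of
failing fast - capping the work up front and rejecting, instead of
trying to recover afterwards. The 1 MiB and depth-64 limits are
arbitrary example values:

<?php
// Sketch only. A crafted ("zip bomb"-style) body can expand to more than
// PHP can hold in one string and end in a fatal error that no front-end
// proxy can screen out, killing the long-running process above.
function handle_request(string $raw): string
{
    // Limit how much gzdecode() will decode (1 MiB here) instead of
    // letting a crafted body expand without bound.
    $decoded = gzdecode($raw, 1024 * 1024);
    if ($decoded === false) {
        return "rejected\n";
    }

    // Limit the nesting depth accepted by json_decode(); overly deep
    // documents simply fail to decode (json_decode() returns null).
    $data = json_decode($decoded, true, 64);

    return $data === null ? "rejected\n" : "ok\n";
}

These caps only bound the damage in userland, of course; the question
above is about what the engine itself should do when such a limit is
actually hit.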

-- 
Stas Malyshev
smalys...@gmail.com

