Good point.
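For reference, the behavior Graham relies on below (a duplicate define() raises a notice and keeps the first value) can be checked with a quick sketch; the constant name FOO is just an illustration:

```php
<?php
// The first define() wins; a later define() for the same name
// raises a notice/warning and leaves the original value in place.
define('FOO', 'a');
@define('FOO', 'b'); // ignored; error output suppressed for brevity
echo FOO, "\n";      // prints "a"
```

This is exactly why a per-script constant substitution pass cannot assume that the define() it sees in one file is the one that takes effect at runtime.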

Thanks. Dmitry.


On Fri, Apr 12, 2013 at 3:09 AM, Graham Kelly-Cohn <sgkel...@gmail.com> wrote:

> I don't think this is a safe optimization. In the following case it would
> output 'b' instead of 'a', which is the correct result:
>
> a.php:
> <?php
> define('FOO', 'a');
> include('b.php');
> ?>
>
> b.php:
> <?php
> define('FOO', 'b');
> echo FOO;
> ?>
>
> It is certainly unlikely for a constant to be defined twice, but PHP
> currently just issues a notice and keeps the first constant value.
>
>
> On Thu, Apr 11, 2013 at 3:57 PM, Larry Garfield <la...@garfieldtech.com> wrote:
>
> > Speaking as a userspace developer and site admin, I'd be fine with
> > trading a more expensive compilation for a runtime improvement. Even a 100%
> > increase in compilation time would pay for itself over only a dozen or so
> > requests (assuming the runtime improvements are non-trivial, too).
> >
> > Naturally some optimizations are harder to do than others given PHP's
> > architecture, but trading more expensive compile for cheaper runtime, even
> > if not a 1:1 trade, would be a win IMO.
> >
> > --Larry Garfield
> >
> >
> > On 4/10/13 9:16 AM, Dmitry Stogov wrote:
> >
> >> For now, the optimizations we do are quite cheap.
> >> They may increase the compilation time on the first request by 2x, but on
> >> the following requests we will get it back.
> >> Once we come to really expensive optimizations, we will do them "offline"
> >> (in the context of a separate process).
> >>
> >> Thanks. Dmitry.
> >>
> >>
> >> On Wed, Apr 10, 2013 at 5:16 PM, Florin Patan <florinpa...@gmail.com>
> >> wrote:
> >>
> >>> On Wed, Apr 10, 2013 at 4:07 PM, Arvids Godjuks <arvids.godj...@gmail.com> wrote:
> >>>
> >>>> 2013/4/10 Dmitry Stogov <dmi...@zend.com>
> >>>>
> >>>>  Hi,
> >>>>>
> >>>>> Recently, I've found that the OPcache optimizer misses a lot of
> >>>>> opportunities, because it handles only one op_array at a time. So it
> >>>>> definitely can't perform any inter-function optimizations (e.g. inlining).
> >>>>>
> >>>>> Actually, it was not very difficult to switch to the "script at once"
> >>>>> approach.
> >>>>> The attached patch demonstrates it and adds per-script constant
> >>>>> substitution, explained in the following script:
> >>>>>
> >>>>> <?php
> >>>>> define("FOO", 1);
> >>>>> function foo() {
> >>>>>      echo FOO . "\n"; // optimizer will replace it with: echo "1\n";
> >>>>> }
> >>>>> ?>
> >>>>>
> >>>>> Of course, I ran the PHP test suite and it passed all the same tests.
> >>>>> Personally, I think it's safe to include this patch in 5.5 and give a
> >>>>> green light to some other advanced optimizations in 5.5 (e.g. the
> >>>>> conversion of INIT_FCALL_BY_NAME into DO_FCALL).
> >>>>>
> >>>>> Any thoughts?
> >>>>>
> >>>>> Thanks. Dmitry.
> >>>>>
> >>>>> --
> >>>>> PHP Internals - PHP Runtime Development Mailing List
> >>>>> To unsubscribe, visit: http://www.php.net/unsub.php
> >>>>>
> >>>>>
> >>>>
> >>>> Hi!
> >>>>
> >>>> Many obvious optimizations are not used due to the fact that script
> >>>> translation into opcodes has to be fast. The nature of PHP dictates
> >>>> that, and this has been reiterated countless times on this mailing list
> >>>> by the core developers.
> >>>>
> >>>> To do advanced stuff, you have to create some sort of pre-compile step,
> >>>> or store the compiled code reliably on disk, so that if the memory cache
> >>>> is dropped or a restart happens, there is no significant performance hit
> >>>> while all the code is compiled into optimized opcodes again.
> >>>>
> >>>> I would also imagine that a good part of the optimizations would require
> >>>> multiple files to be processed and optimized together, but due to the
> >>>> dynamic nature of PHP, opcode compilation is done on a per-file basis,
> >>>> and so are the optimizations.
> >>>>
> >>>> It's very commendable that you want to push optimizations, but there are
> >>>> some fundamental things that need to be taken care of before the really
> >>>> good stuff is possible.
> >>>>
> >>>> My $0.02
> >>>>
> >>>
> >>> Hello,
> >>>
> >>>
> >>> If applying all the optimizations at once would be a problem for
> >>> speed, especially on the first request, then maybe a way to solve this
> >>> would be to have a configurable variable like opcache.passes, which is
> >>> between 1 and 10 (let's say), and then have the engine do something like
> >>> this:
> >>> - load the file, compile it, apply a first round of 'quick'
> >>> optimizations, and mark it as having passed once;
> >>> - on the next request, load the compiled version, apply another round of
> >>> optimizations, then mark it as a second pass;
> >>> - repeat the above step until the number of optimization passes for the
> >>> file equals the opcache.passes value.
> >>>
> >>> This way only the initial requests would be affected, but the hit on
> >>> each of those requests would be smaller than applying all the steps at
> >>> once. I'm really not sure if it's that easy to implement, but 'on paper'
> >>> this could be the way to solve it, imho.
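As a sketch of how the proposal above might surface to administrators (note that opcache.passes is hypothetical, not an existing OPcache setting):

```ini
; Hypothetical php.ini fragment for the multi-pass proposal above.
; opcache.passes does not exist in OPcache; shown for illustration only.
opcache.enable=1
opcache.passes=3   ; spread optimization work over the first 3 requests
```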
> >>>
> >>> What do you think, does it make sense?
> >>>
> >>>
> >>>
> >>> Best regards
> >>> ----
> >>> Florin Patan
> >>> https://github.com/dlsniper
> >>> http://www.linkedin.com/in/florinpatan
> >>>
> >>>
> >>>
> >>
> >
> >
>
