> __autoload() works fine if you have exactly one implementation for your entire
> system with exactly one set of logic for how to map a class name to a file
> name.  As soon as you try to mix two different libraries together, fatal
> error.
>
> spl_autoload() does not suffer from this short-sighted problem but
> accomplishes the same goal in a much more robust fashion.
>
> Any new meta-auto-load mechanism should forego the dead-end that is a single
> unique __auto*() function and just use a design that actually works, aka
> spl_autoload() or similar.

I see the problem. At projectpier.org we have the single-code-set
situation. Given the reactions, I will rewrite the RFC to use the spl*
functions, with the information supplied above.
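For the record, this is the kind of stacked registration the spl*
approach allows, as I understand it (the two mappings and directory
layouts below are just placeholders):

<?php
// Two libraries, each registering its own class-to-file mapping; both
// autoloaders coexist on the SPL stack instead of fighting over a
// single __autoload() implementation.
function liba_autoload($class)
{
    $file = '/path/to/libA/' . str_replace('_', '/', $class) . '.php';
    if (is_file($file)) {
        require $file;
    }
}

function libb_autoload($class)
{
    $file = '/path/to/libB/' . strtolower($class) . '.class.php';
    if (is_file($file)) {
        require $file;
    }
}

spl_autoload_register('liba_autoload');
spl_autoload_register('libb_autoload');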

> With that said, I like the idea of generalizing autoload to include functions
> et al.  Autoloading functions would make my life a lot easier. :-)

Yes, the alternative solution is to write a wrapper function:
somefunc($a, $b) becomes ppcall("somefunc", $a, $b)
That works, but if we can get this into PHP itself it will be much better.
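A rough sketch of what such a wrapper could look like (the
lib/<function>.php layout is just an assumption for illustration):

<?php
// Lazily include a file named after the function before dispatching
// the call. Written for PHP 5.2+, hence func_get_args() instead of a
// variadic parameter.
function ppcall($func)
{
    $args = array_slice(func_get_args(), 1);
    if (!function_exists($func)) {
        $file = dirname(__FILE__) . '/lib/' . $func . '.php';
        if (is_file($file)) {
            require_once $file;
        }
    }
    return call_user_func_array($func, $args);
}

// somefunc($a, $b) becomes:
// $result = ppcall('somefunc', $a, $b);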

>To be fair, though, half of the potential benefits the OP listed are already 
>solved
> by using a version control system.  Any version control system.  If you're
> still having collisions at the file level when two people work on the same
> file it's because you're an idiot and are not using version control (or the
> changes touch each other, in which case this proposal wouldn't help you
> anyway).

I disagree (but otoh, I am an idiot :-)). I just think there is no
conceptual need to group functions into a file.
I also don't believe in distributed vcs (probably idiotic again :-)).
The problem of integrating code is simply postponed with a dvcs.
At work we have good experience with a centralized code repository
where people can work together, where backups are taken care of and
where revision history is built in. Integration issues are solved
before the code is touched.
Also, I think it is crazy that I have to grep a filesystem to find out
in which file a function is defined (e.g. grep -i -l "ion
pick_date_widget"). I know, documentation etc. But that argument only
works when a system is in good shape when you take it over. Why can't
I just see from the file name what definition hides inside?

> The main advantage of this proposal would be lazy-loading of functions.
> I don't think autoloading methods makes any sense since classes cannot be 
> split between files anyway.

I think I was not clear with my remark on __call and __callStatic. I
think autodefining of methods is already possible by using __call and
__callStatic. __call is used in this example for dynamically adding
methods to a class:
http://www.php.net/manual/en/function.call-user-func.php#89636. The
same __call function can be used to dynamically add code to the class.
But I think it would be better if the runtime did this in a uniform
way via autodefine.
The minimal definition of a class would in my view be "class
SomeClass {}" or "class SomeClass;". But then classes resemble
namespaces too much (although in my view a class can also be viewed as
a namespace for a set of variables and functions). This is however
getting too theoretical, so let's keep it simple and assume that the
minimal definition of a class should have its variables, like this:
"class SomeClass { var $a; var $b; }", and that methods can be added
dynamically.
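Something in the spirit of the manual comment linked above, very
roughly (the names and the one-file-per-method layout are purely
illustrative, not the proposed mechanism):

<?php
class SomeClass
{
    private $methods = array();

    // Attach an implementation at run time.
    public function addMethod($name, $callable)
    {
        $this->methods[$name] = $callable;
    }

    // Called for undefined methods: look the body up, or lazily load a
    // file that is expected to return a callable taking $self first.
    public function __call($name, $args)
    {
        if (!isset($this->methods[$name])) {
            $file = dirname(__FILE__) . '/methods/' . $name . '.php';
            if (is_file($file)) {
                $this->methods[$name] = include $file;
            } else {
                throw new BadMethodCallException("Undefined method $name");
            }
        }
        array_unshift($args, $this);   // PHP 5.3 closures cannot use $this
        return call_user_func_array($this->methods[$name], $args);
    }
}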

> I'm not entirely sure what else is actually reasonable to autoload.  Classes,
> Interfaces, Functions, and Traits make up the scope of first-class code
> structures, don't they?  Autoloading a variable doesn't even make sense to me,
> and include files are already handled by include[_once].

Autodefining variables does make sense. Suppose you have a variable
called "$language" which is an array with a lot of language items, and
you have put this array somewhere. Current practice forces you to
think about all the possible paths by which the variable is accessed
and then include this array at the proper places. So far so good. But
things change in time, code is added to the system, and suddenly the
variable is referenced via a path which was not foreseen. Current
practice forces you to rethink the include locations. But what happens
in reality? Most people simply include all the files somewhere
upfront, just to be sure. Again, a lot of code is parsed and processed
which may never be executed. Wouldn't it be much more elegant to
define the variable when it is first referenced?
Of course, this could be emulated with a function that would be
autodefined, which is why I chose the minimal solution of not having
variables autodefined.
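For example, a minimal sketch of that emulation (the languages/en.php
path is just illustrative; the included file is expected to return the
array):

<?php
// Lazily load the $language array on first access instead of
// including the definition file upfront on every request.
function get_language()
{
    static $language = null;
    if ($language === null) {
        $language = include dirname(__FILE__) . '/languages/en.php';
    }
    return $language;
}

// Instead of referencing $language directly:
// $lang = get_language();
// echo $lang['greeting'];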

But another interesting option for autodefining variables is that you
are able to track which variables are used in your application. The
fun thing is that you can do this in every environment. For
development it is interesting to see whether you are using variables
which you don't recognize (typos) or don't want to use (e.g.
$HTTP_POST_VARS); for testing it is interesting to see whether your
test cases use the most important variables; and for production it is
interesting to see which variables are used at all and which never.
The paragraph above applies of course to all objects that can be
autodefined. As pointed out in the RFC, autodefine enables you to get
more insight and thus more control over your codebase.

include and include_once are simply mentioned because they are
already identified by the parser. And there is a difference between
the two: maybe you want to know when an include_once is called
multiple times, so you can rethink the location of the include.

Thanks for all the feedback so far. It is much easier for me to react
to arguments and remarks than to come up with answers/ideas when
writing an RFC.

Cheers, Reinier

-- 
PHP Internals - PHP Runtime Development Mailing List
To unsubscribe, visit: http://www.php.net/unsub.php
