Jason Merrill wrote:
> On 08/03/2009 09:36 PM, Adam Butcher wrote:
>> Thanks.  I haven't any copyright assignments on file -- this is my first
>> dabbling with gcc and I've been doing it mostly to experiment with C++
>> lambda support and non-standard extensions such as polymorphic lambda
>> support.
>
> OK.  We'll need an assignment in order to incorporate your changes into
> GCC.  If you want to assign all present and future changes to the
> compiler, you could use the form
>
OK, I've kicked that off.
>> I've been working in my own repository which I rebased earlier today from
>> trunk (r150148).  I've attached the rebase including the fix for generating
>> correct copying of lambda classes (re-laying out after parsing body).
>
> Hmm, having the new changes folded into the old changes is kind of
> inconvenient, but I suppose I can deal.  It's moot anyway until your
> assignment is on file.
>
Yes, sorry about that.  I appreciate the issue.  I had taken a branch of trunk
and applied the lambda changes to it to keep only lambda changes on my working
branch (allowing simpler future rebasing).  There were a number of things I had
to change to get the lambda changes a) building and b) working with test
programs.  Unfortunately I did all this in one commit.  Not very helpful.  I
will mail you my 'fixes' diff against the latest lambda branch head,
independent of the rebase noise, if you want.

> Incidentally, comments like
>
> /*
>  *
>  *
>  */
>
> aren't customary in GCC sources; that's just something John did that
> we'll need to clean up before the code goes into the trunk.  There are
> other formatting irregularities, some of which I've corrected.
>
Ah, I see.  I was following the principle of matching the nearest style!  I
have reworked my changes in this style.

>> Experimenting with a working version and seeing its issues will be useful
>> to me.  To others too, maybe.  With concepts gone from C++0x and being
>> reworked for C++15(?), maybe support for polymorphic lambdas could be
>> reintroduced? -- though I'm sure it's much too late for that and that it's
>> likely been around the buoy many times.  From what I have read I got the
>> idea that the Callable concept was the primary reason for polymorphic
>> lambdas not being accepted.
>
> I don't know what the reasoning was there, but people have been somewhat
> conservative about what usages of lambdas are allowed for fear of
> unforeseen implementation issues.  Certainly having a working
> implementation would go a long way toward convincing people to allow it,
> even if it doesn't make it into C++0x.
>
Hopefully.  From my point of view, the class generated by a lambda expression
should be equivalent to something you could write yourself -- aside from the
single stack-pointer reference optimization, which only a compiler could
achieve.  The class has a name, albeit invisible to the user (except in
errors/warnings), and instances of it should be usable just as if they were
user-defined functors.  I'm sure there are things I've overlooked, but
hopefully this proof-of-concept will help to allay people's fears.

>> Implied template typename arguments via auto are not currently supported.
>> The syntax parses but I haven't yet synthesized the template arguments,
>> and therefore haven't replaced the autos with them, so it doesn't compile.
>
> Since templates work so differently from normal functions, I'm a little
> uncomfortable with the idea of templates that don't involve any template
> syntax, just the use of auto in the parameter list.  But I'm open to
> giving it a try, at least in the lambda context.  Maybe outside of
> lambda it could be used with a small template introducer...
>
I did think about auto as a shortcut for template typename outside of lambdas,
but I figured that was pushing it a bit too far.  It wouldn't be obvious that a
function declared in that way was actually a function template.  The benefit of
using it in lambdas is that the call operator is never 'seen', so whether it's
a template or not shouldn't affect the caller, provided the input arguments are
compatible.
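To put the equivalence concretely, here's a rough sketch of what I have in
mind (the functor name and values below are invented purely for illustration,
and the lambda itself of course needs the extension to compile):

  #include <iostream>

  // Roughly the closure type I'd expect the compiler to synthesize for the
  // lambda in main() below: the copy-captured variables become members and
  // 'auto' makes the call operator a member template.  (The name is invented
  // for illustration; the real closure type is unnameable.)
  struct closure_equiv
  {
    int a, b;
    template <typename X>
    X operator() (X x) const { return x + a + b; }
  };

  int main ()
  {
    int a = 1, b = 2;

    // Needs the proposed extension: 'auto' in the parameter list implies a
    // template typename parameter on the call operator.
    auto f = [=] (auto x) { return x + a + b; };

    closure_equiv g = { a, b };

    std::cout << f (3) << ' ' << g (3.5) << '\n';  // prints "6 6.5"
  }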
>> I'd welcome feedback on the syntax and semantics and of course any
>> fundamental things I've missed by coming up with such a syntax!
>>
>> From a grammar point of view I've just expanded
>> lambda-parameter-declaration to start with:
>>   lambda-template-param-decl [opt]
>> where lambda-template-param-decl is:
>>   < template-parameter-list>
>
> Makes sense.  I suppose to specify explicit template arguments users
> would have to write
>
>   lambdaob.operator()<template-arg>(fn-arg)
>
That would be the case for explicit instantiation, yes.  With my implementation
you could only get such a nullary function template if you explicitly specified
the template parameters and explicitly specified an empty call argument list.
I've made the template parameter list optional within the already optional
lambda-parameter-declaration.  If present, it has to be followed by a function
parameter list.  The intent, of course, is that any template parameters will be
referenced in the function parameter list, though it does not prevent having
unreferenced template parameters that would need to be explicitly bound.

> so we'd really expect them to be only deduced.

Yes.

>> # unspecified typename inferred by use of auto in function parameter list
>> [=]<typename YY> (auto* x, YY y) { return x->Func( y, a, b ); };
>>
>> # is translated into
>> [=]<typename YY, typename X> (X* x, YY y) { return x->Func( y, a, b ); };
>
> A mixed declaration like that seems rather ugly to me; I'd hope that
> people would use either explicit template args or auto in a particular
> lambda, not both.  But I suppose that's not reason enough to forbid it.
>
Yes.  I was just considering the general case.  Note that auto only infers
'typename' template parameters.  If you wanted more complex parameters you
would need to specify them in the template parameter list -- but you may still
want to use auto as well.  I suspect this case would be handled better
elsewhere though.  It is an edge usage that I wouldn't expect to see.  Maybe to
constrain some input -- for instance

  [=] <template <int> class T> (auto* x, T<4> const& t) { ...

would force the second argument to be an instance of a class template
parameterized on the int 4 (for whatever obscure reason), whilst still adding
an implicit typename parameter for use in the type of 'x'.  As you say, I don't
really see any reason to forbid it.  Someone may find a unique and useful use
for it at some point!
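To make that mixed case concrete, something along these lines (all of the
class and member names here are invented for illustration, and it of course
needs the proposed syntax):

  #include <iostream>

  // A class template parameterized on int, to bind to the template-template
  // parameter T in the lambda below.
  template <int N>
  struct tagged
  {
    int value;
  };

  struct widget
  {
    int weight () const { return 42; }
  };

  int main ()
  {
    widget w;
    tagged<4> t4 = { 7 };

    // Needs the proposed syntax: T is constrained explicitly via the
    // template-parameter-list, while the type of 'x' still gets an implicit
    // typename parameter synthesized from the 'auto'.
    auto g = [=] <template <int> class T> (auto* x, T<4> const& t)
      { return x->weight () + t.value; };

    std::cout << g (&w, t4) << '\n';  // prints 49
  }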
John Freeman wrote:
> Jason Merrill wrote:
>>> Experimenting with a working version and seeing its issues will be
>>> useful to me.  To others too, maybe.  With concepts gone from C++0x and
>>> being reworked for C++15(?), maybe support for polymorphic lambdas could
>>> be reintroduced? -- though I'm sure it's much too late for that and that
>>> it's likely been around the buoy many times.  From what I have read I got
>>> the idea that the Callable concept was the primary reason for polymorphic
>>> lambdas not being accepted.
>>
>> I don't know what the reasoning was there, but people have been
>> somewhat conservative about what usages of lambdas are allowed for
>> fear of unforeseen implementation issues.  Certainly having a working
>> implementation would go a long way toward convincing people to allow it,
>> even if it doesn't make it into C++0x.
>
> There were several issues with polymorphic lambdas in the presence of
> concepts that concerned many on the committee.  I've come to accept that
> it's too late to re-introduce polymorphic lambdas into C++0x (now
> C++1x), but there's no stopping GCC from implementing it as an extension.
>
I feared that would be the case.  If it's accepted by the C++ community as a
useful thing in GCC then maybe it could become a de facto standard -- if we
could just get comeau, digital-mars, borland, and microsoft to buy into it ;).
But I'm getting _way_ ahead of myself! -- we don't even know if people want it
yet -- or whether there are any nasties that crop up during experiments.

> Just my opinion, but I don't think there should be any special template
> syntax [for auto to be used to infer template typename parameters in
> polymorphic lambdas].  The whole point of argument deduction was
> terseness.  It doesn't have to be implemented using templates, so I
> don't equate it with templates.
>
That's a good point.  Hopefully others will share that view -- the syntax
certainly seems intuitive to me as a user.

John Freeman wrote:
> Re-laying out the class after the lambda body was parsed is something I
> had attempted to do, but possibly failed at.  It will absolutely need to
> be done.
>
>> The way 'default reference capture' is implemented on the lambda branch
>> seems to be kind of reactive.  I would expect that inheriting the stack
>> somehow (maybe using just a stack pointer) would be better, but without
>> more investigation I don't know if that is possible or how one would go
>> about doing it.
>
> This won't be suitable in the general case, because of the copy default
> capture [=].  Implicit captures will have to be reactive.
>
Yes, I realized this when I was re-basing my changes.  I've altered my
stack-pointer comment to refer only to implicit reference capture in my latest
patch.

> This is preferred for reference default captures, but I was not familiar
> with stack pointer mechanics in GCC.  I just wanted to get something
> working first that could be improved later.
>
Completely agree.  And you've done a great job.  Also, handling the more
general and useful case is clearly a winning strategy.  Clients would not be
affected by an optimization later -- just that their reference lambdas would
drop in size.

> When I last left the discussion, the biggest issue in my opinion was
> concept map dropping.  The workaround made the deduction algorithm
> intimidatingly complex.
>
I am not up to speed with the issues, but they obviously exist.  One thing I
don't understand, and perhaps you could help: whatever issues polymorphic
lambdas have, surely manually written function objects with template call
operators have the same issues.  Is this true?  And if so, does it mean that
future standard library algorithms will be more constraining than existing
ones -- or am I barking up the wrong tree?

Regards,
Adam