Richard,

My _concern_ with inlining is that, since it is designed to short-circuit 
dynamic method lookup, it becomes impossible to call a _different_ 
implementation. That is, you lose the opportunity to have the _receiver_ 
decide how to respond to the message. You may think of it as a message, but 
the caller is deciding how the receiver will respond, which largely defeats 
the point of it being a message. Yes, at the machine-code level you are 
performing a branch instruction, but when comparing OOP to procedural 
programming we typically make a distinction between "messages" and 
"procedure calls." The distinction is that the receiver gets to decide how 
to respond to a message. In C++ this is the distinction between a "virtual" 
and a "non-virtual" function. By inlining, you are converting the function 
from a virtual function to a non-virtual function, and that can make a 
difference (which is why virtual functions exist).
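
To make that concrete in Smalltalk terms, here is a minimal sketch (the 
class name MissingValue is hypothetical):

    nil isNil.               "true"
    42 isNil.                "false"
    MissingValue new isNil.  "true, if MissingValue overrides #isNil to answer true"

With real dynamic dispatch, each receiver answers #isNil for itself. If the 
compiler instead inlines #isNil as an identity comparison with nil, the 
third send silently becomes "MissingValue new == nil", i.e. false: the 
caller, not the receiver, has decided the answer.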

How should a proxy (https://en.wikipedia.org/wiki/Proxy_pattern) to nil 
respond to the #isNil message? How should the Null Object pattern 
(https://en.wikipedia.org/wiki/Null_object_pattern) respond to #isNil?
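
The proxy case is the sharper one. A forwarding proxy is typically built on 
doesNotUnderstand: (a sketch; NilProxy is a hypothetical name, Pharo-style 
class definition):

    ProtoObject subclass: #NilProxy
        instanceVariableNames: 'subject'
        classVariableNames: ''
        package: 'ProxySketch'.

    NilProxy >> doesNotUnderstand: aMessage
        "Forward anything we don't implement to the wrapped subject."
        ^ aMessage sendTo: subject

If #isNil is inlined at the call site, the send never reaches 
doesNotUnderstand:, so the proxy gets no chance to answer on its 
subject's behalf.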

And, yes, I'm sure you can come up with benchmarks that show a measurable 
difference, but what is the impact in realistic code? When someone asked 
about inlining #yourself in GemStone, I believe I measured the cost as 
about 2 nanoseconds per call (on a 2012 machine). A 10% speedup would make 
it 1.8 nanoseconds. Is that worth it? Maybe, maybe not.
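
The kind of measurement I mean looks roughly like this in Pharo (a sketch; 
the iteration count and the use of #yourself are illustrative):

    | n empty withSend |
    n := 100000000.
    empty := [ 1 to: n do: [ :i | ] ] timeToRun.
    withSend := [ 1 to: n do: [ :i | i yourself ] ] timeToRun.
    (withSend - empty) / n   "rough per-send cost, loop overhead subtracted"

Numbers at that scale are swamped by whatever real work the program does 
around the send.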

Note that I described my position as a "concern," not an ideological 
objection. Mostly I'm giving a rationale for something that doesn't seem to 
have been explained very well to you. I accept that there may be a time for 
inlining, but I can "comprehend" another side to the issue.

James

> On Mar 16, 2022, at 9:42 PM, Richard O'Keefe <rao...@gmail.com> wrote:
> 
> We're still not on the same page.
> You seem to have some ideological objection to inlining that
> I am completely failing to comprehend.
> Just because a procedure call (message send) is inlined doesn't
> in the least mean it *isn't* a procedure call (message send),
> just as compiling a procedure call (message send) as a jump
> (last-call optimisation) doesn't mean it *isn't* a procedure
> call (message send).
> By the way, forget about "40 years ago".
> I just did an experiment in Pharo 9, and found that
> using "_ ifNotNil: " instead of "_ izNil ifFalse: "
> -- where izNil is a non-inlined self == nil --
> gave a 10% speedup, in test code where real work was going
> on as well.
> As for turning off all inlining, what do you think that would
> do to #ifFalse:ifTrue: and its relatives?  
> 
> 
> On Thu, 17 Mar 2022 at 08:34, <s...@clipperadams.com> wrote:
> 
> > > To start with, why do you CARE whether a particular method is inlined or not?
> > 
> > I care because it makes "everything is a message" a lie! And I suspect (no
> > proof and could be wrong) it's an optimization that only made sense with the
> > hardware constraints of 40+ years ago. Arguing against premature optimization
> > is hardly something I just made up ;-)
> > 
> > > This makes absolutely no sense to me. What makes you think that the
> > > combination "_ isNil ifFalse: [_]" will NOT be inlined?
> > 
> > I may have been unclear. My intent was to communicate: "I'd like to stop ALL*
> > inlining of messages by default if possible"
> > 
> > *or as many as practical
> > 
> > > The thing that rings loud alarm bells for me is there being "long chains" in
> > > the first place.
> > 
> > I agree that it is in general a smell, but long chains were tangential to the
> > intention above.
> > 
> > > Can you give an example?
> > 
> > I don't know if I can think of one that's not contrived... Wrapping something
> > external? Squeak's AppleScript support used to mirror the underlying AS,
> > which is pretty much exactly that.
> > 
> > > In my own programming, I've generally found that nils turning up in the
> > > middle of a chain indicates a serious design error somewhere.
> > 
> > Agreed. See smell comment above.
> 
