Jesse Zhang <sbje...@gmail.com> writes:
> On Fri, Apr 10, 2020 at 3:59 PM Tom Lane <t...@sss.pgh.pa.us> wrote:
>> They can't be strict because the initial iteration needs to produce
>> something from a null state and non-null input.  nodeAgg's default
>> behavior won't work for those because nodeAgg doesn't know how to
>> copy a value of type "internal".

> Ah, I think I get it.  A copy must happen because the input is likely
> in a shorter-lived memory context than the state, but nodeAgg's
> default behavior of copying a by-value datum won't really copy the
> object pointed to by the pointer wrapped in the datum of "internal"
> type, so we defer to the combine function.  Am I right?  Then it
> follows kinda naturally that those combine functions have been sloppy
> on arrival since commit 11c8669c0cc.

Yeah, they're relying exactly on the assumption that nodeAgg is not
going to try to copy a value declared "internal", and therefore they
can be loosey-goosey about whether the value pointer is null or not.
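For concreteness, here's a minimal sketch of the pattern under
discussion (the names my_combine, my_final, and MyAggState are
hypothetical, not taken from any shipped aggregate).  The combine
function can't be strict, has to copy into the aggregate context
itself, and may hand back a datum wrapping a NULL pointer; the
matching final function treats that null-pointer representation as
an empty result:

#include "postgres.h"
#include "fmgr.h"

/* hypothetical transition state, passed around as type "internal" */
typedef struct MyAggState
{
    int64       count;
    float8      sum;
} MyAggState;

PG_FUNCTION_INFO_V1(my_combine);

Datum
my_combine(PG_FUNCTION_ARGS)
{
    MyAggState *state1 = PG_ARGISNULL(0) ? NULL :
        (MyAggState *) PG_GETARG_POINTER(0);
    MyAggState *state2 = PG_ARGISNULL(1) ? NULL :
        (MyAggState *) PG_GETARG_POINTER(1);
    MemoryContext agg_context;
    MemoryContext old_context;

    if (!AggCheckCallContext(fcinfo, &agg_context))
        elog(ERROR, "my_combine called in non-aggregate context");

    /*
     * Nothing to merge in: pass state1 through.  This is the
     * loosey-goosey part: the result is a non-null datum of type
     * "internal" whose contained pointer may well be NULL.
     */
    if (state2 == NULL)
        PG_RETURN_POINTER(state1);

    if (state1 == NULL)
    {
        /*
         * First nonempty input arriving with a null state: we must copy
         * state2 into the long-lived aggregate context ourselves, since
         * nodeAgg doesn't know how to copy an "internal" value.  This is
         * why the function can't be marked strict.
         */
        old_context = MemoryContextSwitchTo(agg_context);
        state1 = (MyAggState *) palloc(sizeof(MyAggState));
        *state1 = *state2;
        MemoryContextSwitchTo(old_context);
    }
    else
    {
        state1->count += state2->count;
        state1->sum += state2->sum;
    }

    PG_RETURN_POINTER(state1);
}

PG_FUNCTION_INFO_V1(my_final);

Datum
my_final(PG_FUNCTION_ARGS)
{
    MyAggState *state = PG_ARGISNULL(0) ? NULL :
        (MyAggState *) PG_GETARG_POINTER(0);

    /* on board with the null-pointer representation of an empty result */
    if (state == NULL || state->count == 0)
        PG_RETURN_NULL();

    PG_RETURN_FLOAT8(state->sum / state->count);
}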
However, if you want to claim that that's wrong, you have to explain
why it's okay for some other code to be accessing a value that's
declared "internal".  I'd say that the meaning of that is precisely
"keepa u hands off".

In the case at hand, we currently expect the values returned by these
combine functions to be read only by the associated final functions,
which are on board with the null-pointer representation of an empty
result.  Your argument is essentially that it should be possible to
feed the values to the aggregate's associated serialization function
as well.  But the core code never does that, so I'm not convinced we
should add it to the requirements; we'd be unable to test it.

			regards, tom lane