Hello,

On Tue, 14 Oct 2025, Alejandro Colomar wrote:
> Because bool is entirely different from the other regular integer types,
> and has entirely different rules,

Why do you say that?  bool and its operations form a normal finite
algebra on an ordered set; the only peculiarity is that its add/mul are
saturating and have "funny" colloquial names (namely "or" and "and"):

  add == or ; mul == and

Because the set is finite and fully ordered, comparison (a) makes sense
and (b) max/min exist; and because the set is so very small, these two
operations actually coincide with add and mul as well: max == or and
min == and.

The maxof/minof of that type are again trivial, and arguably even more
elegantly defined than on the other integer types (because of the
saturating behaviour of add): maxof is the fixed point of the operation
add-1, and minof is the fixed point of mul-by-anything.

> Also, it's easy to extend features, but not so much to narrow them.
> Thus, I find it better (safer) to exclude it, at least initially.  I
> welcome anyone interested in supporting bool to show a valid use case
> for it.  I prefer to be cautious by default.

Exceptions are the enemy of good language design.  max/min on bool are
well defined, as are maxof/minof.  Excluding bool from these operators
seems fairly unnatural.
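
To make that concrete, here is a small, self-contained sketch in plain
ISO C (the MAX/MIN macros below are just illustrative stand-ins of my
own, not the proposed operators); it enumerates all pairs of bool values
and shows that max coincides with || and min with &&:

  #include <stdbool.h>
  #include <stdio.h>

  /* Illustrative helpers only, not the proposed operators.  */
  #define MAX(a, b)  ((a) > (b) ? (a) : (b))
  #define MIN(a, b)  ((a) < (b) ? (a) : (b))

  int
  main(void)
  {
      static const bool v[] = {false, true};

      for (int i = 0; i < 2; i++) {
          for (int j = 0; j < 2; j++) {
              bool a = v[i], b = v[j];

              /* max coincides with || (saturating add),
                 min coincides with && (mul).  */
              printf("a=%d b=%d  max=%d a||b=%d  min=%d a&&b=%d\n",
                     a, b, MAX(a, b), a || b, MIN(a, b), a && b);
          }
      }
      return 0;
  }

Every row prints identical values for max vs. || and for min vs. &&,
which is all the argument above amounts to.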

My 2 cents :-)


Ciao,
Michael.