On 15.08.2017 11:15, Tony Whyman via Lazarus wrote:
> Why shouldn't there be a single char type that intuitively represents a single character, regardless of how many bytes are used to represent it?
I suppose by "char" you mean "single printable thingy". With Unicode it is rather debatable what such a thingy is.
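For instance, one user-perceived character may already take several code units. A rough Free Pascal sketch (assuming only the standard UnicodeString/WideChar types and SysUtils):

    program GraphemeDemo;
    {$mode objfpc}{$H+}
    uses
      SysUtils;
    var
      S: UnicodeString;
      I: Integer;
    begin
      // 'e' followed by U+0301 COMBINING ACUTE ACCENT renders as one
      // printable thingy, but is stored as two UTF-16 code units.
      S := 'e';
      S := S + WideChar($0301);
      WriteLn('Length in WideChars: ', Length(S));   // prints 2
      for I := 1 to Length(S) do
        WriteLn('  code unit ', I, ': U+', IntToHex(Ord(S[I]), 4));
    end.

The same goes for anything outside the BMP (emoji, for example), which needs a surrogate pair in UTF-16 even before combining marks come into play.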
Hence a Unicode single char would need to just be a Unicode string.

-Michael