Jonas Maebe wrote:

Provided that one calls a Z80 16-bit :-) More to the point: do any current CPUs have e.g. vector operations that suggest a realistic maximum size?

The current x86's bit test/set instructions can address the complete 32/64-bit address space, but the original 8086 had no vector instructions at all. Again: this limitation is unrelated to instruction sets; it's about deciding on the point at which a plain bitmap starts wasting a lot of memory.
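To put numbers on that tradeoff, here is a small sketch (names and the $mode directive are my choice) showing what a plain one-bit-per-element bitmap costs at various set sizes in Free Pascal, where a set is limited to 256 elements:

```pascal
program SetSizes;
{$mode objfpc}
type
  TCharSet = set of Char;   { FPC stores this as a plain 256-bit bitmap }
begin
  { 256 possible elements -> 256 / 8 = 32 bytes: cheap. }
  WriteLn(SizeOf(TCharSet));          { 32 }
  { A hypothetical 16-bit set would need 65536 / 8 bytes per value: }
  WriteLn(65536 div 8);               { 8192 (8 KiB) }
  { and one spanning all of Unicode (U+0000..U+10FFFF): }
  WriteLn($110000 div 8);             { 139264 (136 KiB) }
end.
```

The 32-byte case is why `set of Char` is free to pass around by value; the 8 KiB and 136 KiB cases are where a plain bitmap stops being attractive.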

For larger sets... OK, how /does/ one declare a set of UTF-8 characters?

A UTF-8 character is not an ordinal data type, hence support for "set of <utf-8 character>" is orthogonal to support for larger sets. If you store them in strings or arrays, then you need a hash table of strings or arrays (and/or support for sets of strings or arrays, which would itself probably be implemented using... a hash table).
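As a minimal sketch of that idea: a "set of UTF-8 strings" can be emulated today with a sorted, duplicate-ignoring TStringList from the standard Classes unit (a hash table would serve equally well; the sample strings are arbitrary):

```pascal
program StringSet;
{$mode objfpc}
uses Classes;
var
  Seen: TStringList;
begin
  Seen := TStringList.Create;
  try
    Seen.Sorted := True;
    Seen.Duplicates := dupIgnore;     { re-adding is a no-op, like a set }
    Seen.Add('A');                    { one-byte UTF-8 sequence }
    Seen.Add(#$C3#$A9);               { 'é', a two-byte UTF-8 sequence }
    Seen.Add(#$C3#$A9);               { ignored: already present }
    WriteLn(Seen.Count);              { 2 }
    WriteLn(Seen.IndexOf(#$C3#$A9) >= 0);  { TRUE: membership test }
  finally
    Seen.Free;
  end;
end.
```

Membership here is an O(log n) binary search rather than a single bit test, which is the price of leaving the ordinal world.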

That was pretty much my gist. Since Unicode these days has far more than 65536 codepoints, I don't think there's any advantage in expanding sets from 256 to 65536 elements; efficient operations on sparse arrays of 256-element sets would be far better.
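A sparse arrangement along those lines might look like the following sketch (the names `Pages`, `IncludeCP` and `ContainsCP` are mine): the high bits of a codepoint select a lazily allocated 256-element page, and the low byte is tested with an ordinary set operation.

```pascal
program SparsePages;
{$mode objfpc}
type
  TPage = set of Byte;        { one 256-element bitmap: 32 bytes }
  PPage = ^TPage;
var
  { One potential page per 256 codepoints up to U+10FFFF; pages stay
    nil until used, so a set covering a few scripts costs a few pages. }
  Pages: array[0..$10FF] of PPage;

procedure IncludeCP(CP: Cardinal);
begin
  if Pages[CP shr 8] = nil then
  begin
    New(Pages[CP shr 8]);
    Pages[CP shr 8]^ := [];
  end;
  Include(Pages[CP shr 8]^, Byte(CP and $FF));
end;

function ContainsCP(CP: Cardinal): Boolean;
begin
  Result := (Pages[CP shr 8] <> nil) and
            (Byte(CP and $FF) in Pages[CP shr 8]^);
end;

begin
  IncludeCP($0041);    { 'A' }
  IncludeCP($1F600);   { an emoji, far from the Latin page }
  WriteLn(ContainsCP($0041));   { TRUE }
  WriteLn(ContainsCP($0042));   { FALSE }
end.
```

Union, intersection and difference then reduce to the existing 256-element set operators applied page by page, skipping nil pages entirely.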

A modest expansion to handle something like a bitboard for Go might be attractive, though.
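For scale: a 19x19 Go board has 361 points, just past the current 256-element limit. A hand-rolled 361-bit "set" is only 46 bytes, as this sketch (helper names are mine) shows:

```pascal
program GoBoard;
{$mode objfpc}
const
  N = 19 * 19;                          { 361 points on a Go board }
type
  { A 361-bit mask as an array of bytes: (361+7) div 8 = 46 bytes. }
  TBoardSet = array[0..(N - 1) div 8] of Byte;

procedure SetPoint(var S: TBoardSet; P: Integer);
begin
  S[P shr 3] := S[P shr 3] or (1 shl (P and 7));
end;

function HasPoint(const S: TBoardSet; P: Integer): Boolean;
begin
  Result := (S[P shr 3] and (1 shl (P and 7))) <> 0;
end;

var
  S: TBoardSet;
begin
  FillChar(S, SizeOf(S), 0);
  SetPoint(S, 3 * 19 + 3);              { mark the 4-4 point }
  WriteLn(SizeOf(S));                   { 46 }
  WriteLn(HasPoint(S, 3 * 19 + 3));     { TRUE }
  WriteLn(HasPoint(S, 0));              { FALSE }
end.
```

A built-in set type stretched to, say, 512 elements would cover this case while staying a fixed-size bitmap with cheap union/intersection.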

--
Mark Morgan Lloyd
markMLl .AT. telemetry.co .DOT. uk

[Opinions above are the author's, not those of his employers or colleagues]
_______________________________________________
fpc-pascal maillist  -  fpc-pascal@lists.freepascal.org
http://lists.freepascal.org/mailman/listinfo/fpc-pascal