On Tue, 4 Nov 2014 14:30:54 -0800, Charles Mills wrote:

>I always thought it was that hex just sort of seemed "system-like" and
>decimal numbers were, you know, for those COBOL types. <g>
>
>I always wondered: why did they put two more or less mutually exclusive
>data items in two different 12-bit fields? If they had devoted 11 bits to
>the ABEND
>code and one bit to system versus user, we could have had ABEND codes
>ranging up to S7FFFFF or U8388607. Whether that would have been good or bad
>I will leave as an exercise for the reader.
>
(ITYM "23" -- 12 + 12 bits, less one for the system/user flag, leaves 23.)

For S0Cx, the bottom nybble is the hardware interrupt code.  This provides
some motivation for hex.  But still, why decimal?

In a CDC operating system, octal ruled.  Job time limits were coded in octal
numbers of seconds.  The ROT (rule of thumb) was that 100 (octal) seconds
was about a minute, and 10000 (octal) seconds was about an hour.  At some
point this impelled a naive colleague to ask "Are octal seconds bigger than
decimal seconds?"
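The rule of thumb checks out arithmetically; a quick sketch in Python (just an illustration, nothing CDC-specific):

```python
# The octal time-limit rule of thumb, verified:
# 100 octal = 64 decimal seconds, close to a minute;
# 10000 octal = 4096 decimal seconds, a bit over an hour.
for octal_limit in ("100", "10000"):
    seconds = int(octal_limit, 8)  # parse the string as base 8
    print(f"{octal_limit} octal seconds = {seconds} decimal seconds "
          f"(~{seconds / 60:.1f} minutes)")
```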

Hey, it makes as much sense as kibibytes and mebibytes (KiB and MiB),
abbreviated in JCL as just K and M.  And why not allow a simple unsuffixed
decimal number, e.g. REGION=100000000?  (Or does that actually work?  I
haven't tried it.)  Can "G" (or "Gi") be far behind?
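The suffixes in question are binary multiples.  A sketch of the arithmetic in
Python; the `region_bytes` helper and the unsuffixed-number behavior are
hypothetical (JCL itself may well reject an unsuffixed REGION value, as the
question above notes):

```python
# Binary (kibi/mebi/gibi) multiples behind the K, M, and hypothetical G
# suffixes, versus a plain decimal number of bytes.
SUFFIX = {"K": 2**10, "M": 2**20, "G": 2**30}

def region_bytes(spec: str) -> int:
    """Hypothetical parser: a suffixed value is a binary multiple;
    an unsuffixed number is taken as plain decimal bytes."""
    last = spec[-1].upper()
    if last in SUFFIX:
        return int(spec[:-1]) * SUFFIX[last]
    return int(spec)

print(region_bytes("96M"))        # 96 * 2**20 = 100663296 bytes
print(region_bytes("100000000"))  # exactly 100000000 bytes
```

So REGION=96M asks for slightly more than the decimal 100 million bytes of
the unsuffixed example -- which is exactly the KiB/MiB ambiguity being
grumbled about.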

-- gil

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN