On 04/07/2015 11:05 AM, Grant Edwards wrote:
> On 2015-04-07, Chris Angelico <ros...@gmail.com> wrote:
>> On Wed, Apr 8, 2015 at 12:36 AM, <jonas.thornv...@gmail.com> wrote:
>>> Integers are internally assumed to be base 10, otherwise you could not
>>> calculate without giving the base. All operations on integers (addition,
>>> subtraction, multiplication and division) assume base 10.
>> You misunderstand how computers and programming languages work. What
>> you're seeing there is that *integer literals* are usually in base 10;
>> and actually, I can point to plenty of assembly languages where the
>> default isn't base 10 (it's usually base 16 (hexadecimal) on IBM PCs,
>> and probably base 8 (octal) on big iron).
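Chris's point about literals is easy to demonstrate from Python itself; a
quick sketch (ordinary CPython, nothing version-specific):

    # The base of a literal affects only how the source text is parsed;
    # all four names below end up bound to the very same integer value.
    a = 255         # decimal literal
    b = 0xFF        # hexadecimal literal
    c = 0o377       # octal literal
    d = 0b11111111  # binary literal

    assert a == b == c == d

    # Arithmetic happens on the internal (binary) representation, so a
    # "default base" only exists at parse time and at print time.
    print(a + 1)       # 256 -- rendered in base 10 by default
    print(hex(a + 1))  # 0x100
    print(bin(a + 1))  # 0b100000000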
> I'd be curious to see some of those assemblers. I've used dozens of
> assemblers over the years for everything from microprocessors with a
> few hundred bytes of memory to mini-computers and mainframes. I've
> never seen one that didn't default to base 10 for integer literals.
> I'm not saying they don't exist, just that it would be interesting to
> see an example of one.
I can't "show" it to you, but the assembler used to write microcode on
the Wang labs 200VP and 2200MVP used hex for all its literals. I wrote
the assembler (and matching debugger-assembler), and if we had needed
other bases I would have taken an extra day to add them in.
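Adding another base to a literal parser really is about a day's work. A
toy sketch in Python of what that could look like (the prefix characters
here are invented for illustration and have nothing to do with the actual
Wang assembler syntax):

    # Toy literal parser, hex by default, with made-up prefixes for the
    # other bases -- just to show how small a job "other bases" is.
    def parse_literal(token, default_base=16):
        prefixes = {'%': 2, '@': 8, '#': 10}  # invented conventions
        if token and token[0] in prefixes:
            return int(token[1:], prefixes[token[0]])
        return int(token, default_base)

    assert parse_literal('FF') == 255        # hex by default
    assert parse_literal('#255') == 255      # decimal
    assert parse_literal('@377') == 255      # octal
    assert parse_literal('%11111111') == 255 # binary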
That assembler was not available to our customers, as the machine
shipped with the microcode in read-only form. Not quite as read-only as
the Intel processors of today, of course.
Additionally, the MSDOS DEBUG program expected its literals to be entered
in hex, if I recall correctly. Certainly when it disassembled code, the
output was in hex.
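For flavor, here's a rough Python imitation of that sort of hex display
(the layout is only approximate; real DEBUG output differed in its
details):

    # A rough DEBUG-style hex dump: offset, sixteen bytes in hex, then a
    # printable-ASCII view. It shows why hex is the natural notation for
    # raw machine bytes -- two digits per byte, every time.
    def hexdump(data, base_addr=0):
        for offset in range(0, len(data), 16):
            chunk = data[offset:offset + 16]
            hexpart = ' '.join(f'{byte:02X}' for byte in chunk)
            asciipart = ''.join(
                chr(b) if 32 <= b < 127 else '.' for b in chunk)
            print(f'{base_addr + offset:08X}  {hexpart:<47}  {asciipart}')

    hexdump(b'Hello, world!\x00\x90\x90')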
--
DaveA