Marc Santhoff wrote:

> On Sunday, 21.08.2005, 16:19 -0700, L505 wrote:
> > The first compilers were written in assembly language. This allowed
> > for the next generation of compilers to be written in a high-level
> > language.
> >
> > And the assembly language was just magically inserted into the memory
> > with that magic script?
> >
> > At some point it comes down to a hardware-etching level, I'm guessing.

Yes, "assembling" is the task of translating mnemonic codes to
hexadecimal byte codes that are what is called "machine language". The
programs doing this automatically are called assemblers.

In fact, it's translating mnemonics and syntactical constructs to -binary- codes; hexadecimal is just a notation often used in object files and by monitor programs.
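
To make that concrete, here is a minimal sketch in Free Pascal of what an assembler does at its core: look a mnemonic up in a table and emit the corresponding binary opcode (printed in hex below). The opcodes are one-byte Z80 instructions; the table and the "source" are just illustrative, and a real assembler of course also has to handle operands, addressing modes, labels and relocation.

program ToyAsm;

{$mode objfpc}

type
  TMnemonic = record
    Name: string;
    Opcode: Byte;
  end;

const
  { a tiny, incomplete opcode table: one-byte Z80 instructions }
  Table: array[0..3] of TMnemonic = (
    (Name: 'NOP';  Opcode: $00),
    (Name: 'HALT'; Opcode: $76),
    (Name: 'RET';  Opcode: $C9),
    (Name: 'EI';   Opcode: $FB)
  );

  { the "source program" to be assembled - purely illustrative }
  Source: array[0..2] of string = ('NOP', 'EI', 'RET');

var
  i, j: Integer;
begin
  for i := 0 to High(Source) do
    for j := 0 to High(Table) do
      if Table[j].Name = Source[i] then
        WriteLn(Source[i]:6, ' -> $', HexStr(Table[j].Opcode, 2));
end.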

> The machine language has to be put into the program memory of the machine
> in question. That can be done by burning an EPROM or similar, by poking
> bytes in hex via the system monitor, ...
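
For illustration, a small Free Pascal sketch of that last route, the way an old hex monitor worked: deposit bytes at an address, then dump them back. The 64 KB array only stands in for the machine's RAM, and the two bytes are the Z80 sequence EI / RET from the toy assembler above; all of it is an assumption for illustration, not anybody's real monitor.

program ToyMonitor;

{$mode objfpc}

var
  Mem: array[0..$FFFF] of Byte;   { stands in for 64 KB of RAM }

procedure Poke(Addr: Word; B: Byte);
begin
  Mem[Addr] := B;
end;

procedure Dump(Addr, Count: Word);
var
  i: Integer;
begin
  for i := 0 to Count - 1 do
    WriteLn(HexStr(Addr + i, 4), ': ', HexStr(Mem[Addr + i], 2));
end;

begin
  { what you would have typed at a monitor prompt as "8000: FB C9" }
  Poke($8000, $FB);   { EI  }
  Poke($8001, $C9);   { RET }
  Dump($8000, 2);
end.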

> Often the first compilers (and interpreters) ported to a new machine
> in those ancient times were Forth engines, because the core of such a
> thing is only a few kilobytes big. And it can compile compilers and
> anything else.
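
Since it came up: a very stripped-down sketch in Free Pascal of why such a kernel stays so small. All a Forth core really needs is a data stack, a dictionary of named words, and an outer loop that either executes a word or pushes a number; a real Forth then defines ':' and ';' on top of exactly this to compile new words, and eventually compilers. The handful of words and the example line below are assumptions for illustration only.

program TinyForth;

{$mode objfpc}

var
  Stack: array[0..63] of LongInt;
  SP: Integer = 0;

procedure Push(N: LongInt); begin Stack[SP] := N; Inc(SP); end;
function  Pop: LongInt;     begin Dec(SP); Result := Stack[SP]; end;

{ execute a single word; unknown tokens are tried as numbers }
procedure Execute(const W: string);
var
  A: LongInt;
  Err: Integer;
begin
  if W = '+' then Push(Pop + Pop)
  else if W = '*' then Push(Pop * Pop)
  else if W = 'DUP' then begin A := Pop; Push(A); Push(A); end
  else if W = '.' then Write(Pop, ' ')
  else
  begin
    Val(W, A, Err);
    if Err = 0 then Push(A) else WriteLn('? ', W);
  end;
end;

{ the outer interpreter: split a line on blanks, execute each token }
procedure Interpret(const Line: string);
var
  i: Integer;
  Tok: string;
begin
  Tok := '';
  for i := 1 to Length(Line) do
    if Line[i] = ' ' then
    begin
      if Tok <> '' then Execute(Tok);
      Tok := '';
    end
    else
      Tok := Tok + Line[i];
  if Tok <> '' then Execute(Tok);
end;

begin
  Interpret('2 3 + DUP * .');   { prints 25: (2+3) squared }
  WriteLn;
end.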

Assembler, often considered "low-level", introduces symbolic programming, i. e. assigning names and syntax to binary patterns. Isn't that a bigger invention (or abstraction) than that of high-level languages? I mean, isn't the step from binary programming to assembler larger than the step from assembler to HLLs?

> Is all this forgotten nowadays?
> Marc
Every year, new layers, APIs etc. are built on top of old ones, sinking them into the almost unconscious. It's not yet completely forgotten, but doesn't it seem to be becoming an esoteric science? Most programmers today see some API or platform as their working base, which is IMHO like standing on a cloud - they don't see the transistors etc. Would they be capable of building a computer from scratch? A mechanical cash register? A hydraulic computer?

But you can't make money with that comprehensive knowledge. And the advance in electronics hides the disadvantages of the current software structure with its many layers, wasting time and memory and sacrificing simple, understandable structures. A computer running at 2 GHz that boots in 2 minutes spends 240 000 000 000 cycles (2*10^9 cycles per second times 120 seconds) doing almost nothing, in a terribly complicated way. That's crazy if you look at the whole thing. But who cares?

In a few years there will be a halt to the increase in hardware performance, dictated by quantum theory. How will that change the software development process?

Anton


_______________________________________________
fpc-pascal maillist  -  fpc-pascal@lists.freepascal.org
http://lists.freepascal.org/mailman/listinfo/fpc-pascal
