Hi,

On Sat, Sep 29, 2018 at 5:39 PM dmccunney <dennis.mccun...@gmail.com> wrote:
>
> Those were the days when MS was the outfit who got a start writing a
> version of BASIC for microcomputers, and got asked by IBM to craft
> an OS for the then new IBM PC.

Outsourcing software development was also a way for IBM to avoid
infighting and legal liability (which was a problem for them,
apparently). But even the IBM PC was more of a stopgap measure; at
least the 8086 wasn't meant to be long-term viable. It was more or
less a temporary competitor to the Z80 while Intel was busy working on
more ambitious designs like the (failed) iAPX 432. (As much as people
hate on the 8086 and segmentation, they sure seem to have loved the
386.) MS just played along so that they could keep selling their
compilers and languages. Later on, they also didn't want to be tied
down by following others (e.g. Xenix and AT&T).

> MS bought a product called 86DOS from an outfit called
> Seattle Computer Products that made machines based on an 8086 CPU and
> an S100 bus, and used that as the base for what became MSDOS.

New-fangled 16-bit versus traditional 8-bit, which was more common at
the time. It was supposed to be more powerful and make things easier.
(Loosely similar to all the 64-bit hype nowadays, although I'd say the
overall improvement is smaller. But most people clearly prefer 64-bit
nowadays, for various reasons, fair or not. 32-bit isn't quite "dead"
yet, though, so I guess even they have to admit it's still a market
worth catering to, barely. Or maybe the transition is harder than it
sounds. Such transitions are never easy, even after many years.)

> It looked a lot like Digital Research's CP/M under the hood to make it
> easy to port popular CP/M applications like WordStar and VisiCalc to
> the new architecture.

CP/M-86 wasn't available yet, so they had little choice but to
buy/write their own.

> (And I recall when the OS war was DOS vs CP/M86
> vs UCSD Psystem vs DRDOS on the PC.  MS won.)

DR-DOS didn't come until later (1988??). IIRC, it was basically a
DOS-compatible descendant of CP/M-86, arriving after many people had
already dropped CP/M in favor of MS-DOS (1987?). Even IBM tried to
replace DOS with OS/2 around that time, but RAM shortages hurt. (And
"DOS extenders" noticeably took away one obvious advantage of OS/2.)
Of course, IBM and MS also split up (1990 or 1991), so they went their
separate ways with incompatible projects.

> IBM hasn't been the Evil Empire for quite some time.  MS is in the
> process of trying to mend its ways and *not* be the Evil Empire any
> more.  IBM and MS were what they once were for purposes of account
> control.  That no longer works, and both companies know it.

They've had a new CEO for about five years now, and yes, he's made
many changes. (Honestly, MS already has a reputation for being
internally fragmented: too many projects, too many competing teams.
They gain and lose interest in various projects seemingly at random.
Too many irons in the fire, or whatever.)

(I don't wish to be too cynical, but ... I remember when they
announced the new CEO. His Wikipedia page had more edits in that
single day, even before he actually did anything, than it did in its
entire existence previously. Yeah, people are morbidly obsessed with
symbolic power.)

> > I couldn't care less about .NET, it's pretty much a non-portable,
> > dead-end technology, just years behind the curve. A lot of former Java
> > fanatics (for which .NET became a substitute once M$ could not get to
> > terms with Sun) have jumped that ship already in the past. M$ could take
> > a hint from that...
>
> Sorry, but you're  behind in your understanding.  .NET is core
> technology for Windows, increasingly used by all manner of things.

I don't know if .NET is truly used universally in Windows proper. Some
things may use it, and I know they pushed it heavily, but I don't know
whether it was ever (or still is) used widely enough in Windows 10 for
that to matter much. The OS shipped with some minimal support, but
most of the framework had to be downloaded separately (optionally).

> (Current development is around .NET Core, which is a new flavor of the
> framework.)  Linux already had the Mono project to implement an open
> source equivalent of .NET.  MS engineers are major contributors to
> Mono, and MS has open sourced the whole thing.

They are very proud of it. And many developers say they love C#
(although I've never even pretended to learn it). Anders Hejlsberg was
the chief architect of C#, and he was the guy behind Turbo Pascal
(and later, TypeScript). The TIOBE Index (flawed, I know) lists VB.NET
and C# in fifth and sixth place, respectively. And there have been two
(or more?) "standards" of C#, while Java has never been formally
standardized. Yes, I know, C# is (historically?) Windows-only, but
people love it to death. The comparison to Java comes up because both
are "managed" (garbage-collected) languages. Java is still "numero
uno" overall.

* https://www.tiobe.com/tiobe-index/

> This  means portable applications, because the .NET framework provides
> the underlying runtimes, and you can code in C#, F# or the like and
> expect your code to run under Windows and Linux.  The surface is only
> beginning to be scratched.

In theory, yes. And just for the record, the JVM was directly inspired
by the Pascal P-code series of bytecode compilers. Pascal wasn't
garbage collected, but Oberon was, and there have been various Oberon
compilers, derivatives, and OSes. (I think Java, the language proper,
was more inspired by Objective-C.)

> And .NET isn't really a Java substitute.  You can run both, and I do.

But you don't necessarily need both. At least one (Oberon-based)
compiler, GPCP, targets both.

> What we are seeing now is a side effect of the steady advance of
> hardware, which got progressively smaller, faster, and cheaper.  It's
> now possible to run apps in scripting languages like Python where you
> formerly had to write in something like C and compile to native code,
> because the hardware is fast enough you don't *need* to compile to
> native code to get acceptable performance.

Depends on the hardware. Maybe for an x86 desktop you don't need it.
But there's still plenty of hardware (even ARM-based) that won't run
fast without some heavy optimization. BTW, Python has many
implementations: CPython, PyPy, IronPython, etc. Even Java has
multiple compilers (and other languages target the JVM, e.g. Scala or
Kotlin).
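Since those implementations can behave quite differently (PyPy's JIT in
particular), it can be handy for a script to report which one it's
actually running on. A minimal sketch using only the standard library
(the printed values will of course vary by installation):

```python
# Report which Python implementation and version is running.
# Works on CPython, PyPy, IronPython, etc., since all of them
# provide the standard "platform" and "sys" modules.
import platform
import sys

impl = platform.python_implementation()  # e.g. "CPython" or "PyPy"
version = ".".join(str(n) for n in sys.version_info[:3])
print(f"{impl} {version}")
```

On stock CPython this prints something like "CPython 3.11.2".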

> MS can no longer assume that the whole world runs on X86 architecture,
> and there's an awful lot of ARM based kit out there.  (Think most
> smartphones and tablets.)  It's a multi-platform world and MS must
> work with it.

MS already sells "always on" Windows/ARM64 laptops (with IA-32
emulation). But they also still use Intel in their latest Surface
device. (I think??)

> To make life more interesting, look at compilers.  Compilers like GCC
> are in two parts - a front end parser for supported languages, and
> back end code generator producing object code for the specified
> platform.  Compilers like that need an intermediate architecture
> independent  language representation.  The front end compiles to it,
> and the back end translates it to object code.

I'm no expert, so I forget what the intermediate representation was
called for GCC 3.x (circa 2005), but it changed heavily for 4.x
(GENERIC, and a subset called GIMPLE), and has probably changed even
more since then. Oh, GCC also used to have its own native-code Java
compiler (GCJ), but that's been removed in recent versions. (Other
options are apparently available, so it's less crucial nowadays.)
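The front-end / IR / back-end split described above can be sketched in
a few lines. This is a toy, not how GCC actually works: the function
names and the fake "assembly" are invented for illustration, but the
shape (parse source into a small architecture-independent IR, then
lower that IR separately) is the same idea.

```python
# Toy front-end/back-end split (not GCC!): the front end parses an
# arithmetic expression into a linear three-address-style IR, and a
# back end lowers that IR to a fake assembly listing. All names here
# are invented for illustration.
import ast


def front_end(src):
    """Parse a Python arithmetic expression into a list of
    (op, dst, lhs, rhs) tuples -- the "architecture-independent" IR."""
    tree = ast.parse(src, mode="eval").body
    ir, counter = [], [0]

    def emit(node):
        # Each sub-expression gets a fresh temporary t0, t1, ...
        if isinstance(node, ast.Constant):
            dst = f"t{counter[0]}"; counter[0] += 1
            ir.append(("const", dst, node.value, None))
            return dst
        if isinstance(node, ast.BinOp):
            lhs, rhs = emit(node.left), emit(node.right)
            op = {ast.Add: "add", ast.Mult: "mul"}[type(node.op)]
            dst = f"t{counter[0]}"; counter[0] += 1
            ir.append((op, dst, lhs, rhs))
            return dst
        raise ValueError("unsupported expression")

    emit(tree)
    return ir


def back_end(ir):
    """Lower the IR to a fake assembly listing, one line per tuple."""
    return [f"{op} {dst}, {lhs}, {rhs}" for op, dst, lhs, rhs in ir]


print("\n".join(back_end(front_end("2 + 3 * 4"))))
```

Swapping in a different `back_end` (for a different "target") without
touching `front_end` is exactly the modularity the quoted paragraph is
getting at.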

> In compilers like Clang on top of LLVM, the intermediate language may
> be JavaScript, and there may be no reason to compile to machine code.
> Fast optimizing JIT compilers for JavaScript are available for major
> platforms that compile JS to native code for execution, so just
> compile to JS and drop that directly onto the target machine.

I would be surprised if JS JIT compilers existed much beyond x86 and
ARM. Those are all most people care about (tier one?). But let's not
pretend that other architectures don't exist too, even if they are
less "major". I'm just saying, pretending that every good idea is
universally supported is a bit naive.

> I'd call GCC's days numbered.

It's still heavily used on various architectures, even on some
platforms that LLVM doesn't actively support. It still gets a lot of
work and has improved quite a lot in recent years. (Since 2013 it's
been self-hosted in C++.) It's not going anywhere.

> But there are reasons who folks like Google won't use GPL code.

Some of that may be practical, some philosophical, but mostly I think
they just don't want the hassle of lugging around thousands of huge
source tarballs. Don't forget that the GPL is very popular, by far
(50%?), and even the Linux kernel will probably always be "GPLv2
only".

As long as it's still "free/libre" and has active contributors willing
to refactor, debug, and extend it all, the practical differences
between BSD and GPL are probably minor. (I'm sure you know this, as
ESR has often said similar things. But feel free to correct me; I'm
probably misunderstanding many details.)


_______________________________________________
Freedos-user mailing list
Freedos-user@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/freedos-user