On Mon, Jun 1, 2020 at 5:30 PM Rugxulo <rugx...@gmail.com> wrote:
> On Sun, May 31, 2020 at 11:26 PM dmccunney <dennis.mccun...@gmail.com> wrote:
> > On Sun, May 31, 2020 at 10:35 PM Rugxulo <rugx...@gmail.com> wrote:
> >
> > > So no, I haven't tried rebuilding this (yet?), and I'm no *nix fiend,
> > > but I do think AWK is a cool tool, maybe cooler than GW-BASIC (don't
> > > kill me!).
> >
> > AWK is a cool tool. But it's not a full programming language for
> > building stand alone apps. GWBASIC is.
>
> I don't think this particular BASIC is a compiler, only an
> interpreter. (The very first BASIC was a compiler.)
Doesn't matter. You can create an entire application in an interpreted language, and people did. And BASIC being interpreted on early machines was likely a matter of the hardware available. Kemeny and Kurtz were working on larger multi-user systems. Consider the Commodore 64, which had MS BASIC v2 embedded. When you booted it, you were in the interpreter, talking to BASIC. BASIC on that machine was embedded in an 8KB ROM.

> But there actually are compilers for AWK out there, even REXX! But
> most implementations don't do that. (Why bother? Interpreted is often
> fast enough.)

Awk and REXX are script languages, and were generally interpreted. Awk is used in things like pipelines, where you call awk to query and process a text file and pass the results to something else. (I'll put a couple of quick examples at the end of this message.)

REXX was the next generation of script language on IBM mainframes, intended to provide more power than CLISTs. (I was a dab hand at CLIST programming back when.) But you got REXX as a component of IBM's VM/CMS OS. VM was actually intended as a hypervisor, allowing you to run other IBM mainframe OSes under it. It was popular for cases like taking code written for DOS/VSE and converting it to run under OS/MVS. You could bring each up in a partition, with a production partition running DOS/VSE, and a test partition where you handled conversions and made sure things worked as expected under MVS. Once you had completed conversion and testing, with verification that everything worked as designed, you could take down the DOS/VSE partition and make the MVS partition the production environment. (IIRC, VM/CMS imposed about a 10% overhead, which is remarkably good.)

REXX subsequently got brought up under other architectures. (I have a version that works under Palm OS.)

But hey, there are compilers for DOS batch files... :-p

> (untested by me, but just FYI)
> * http://awka.sourceforge.net/index.html

Convert awk to C, then compile to an executable. I recall hearing back when that AT&T was working on an awk compiler. I have no idea if this bears any relation to that effort.

> I'm actually a bigger fan of Sed, but that's much more limited
> (intentionally?). Also, AWK vaguely reminds me of REXX in
> functionality (although that, too, I only lightly dabbled in).
> Obviously, REXX was more known on IBM mainframes and OS/2.

Intentionally. It has a different use case than awk. SED is a stream editor, explicitly intended to be called in a pipeline to perform *scripted* edits on what is fed to it.

One bit often missed by DOS folks back in the day was that *EDLIN* could be used that way. Advanced batch programmers in environments like corporate installations, where they weren't allowed to install third party code, used EDLIN where they might otherwise have installed SED.

> In recent years, BWK wrote a book on Go. That language has come a long
> way and done a lot. A lot of people from Plan 9 still work on that.
> Oh, one guy did write a compatible implementation of AWK in Go!

The main honcho behind Go is Rob Pike, with Ken Thompson and Robert Griesemer contributing. (Griesemer was a main developer for the V8 JavaScript engine.) Pike and Thompson were colleagues of Kernighan at AT&T Bell Labs in Murray Hill, NJ, back when Thompson and Ritchie were designing Unix and Ritchie was developing the C language.

Go is specifically intended for concurrent programming, and addresses weaknesses in C/C++ (primarily in memory management) that bite when you are trying to create concurrent code.
Go's intent is to handle the memory management for the developer, so they can just develop code and not have to worry about it.

Open source advocate Eric S. Raymond has largely switched to Go these days. A paying project he's technical lead on is updating NTP. The existing code base was a morass of security holes and was being used in DDoS attacks. The first challenge was scraping away decades of accumulated cruft in the form of special case code for various old architectures and environments. The last I knew, his current version of the core NTP code is about 70% smaller than the C code it replaced, and is far more secure. Many recent security bugs filed against NTP don't exist in his code, because it removes the attack surfaces they target. Go turned out to be just the thing to use for the project.

> > It was initially written to perform "one liners", where you invoked awk
> > on a command line with the commands to execute and the data to examine.
>
> I don't think UNIX originally came with a C compiler (except maybe
> add-on?), so all you had was Sh and AWK.

I have an AT&T 3B1. I got it *before* I got a DOS PC. It was a single user Unix workstation, based on a 10MHz Motorola 68010 CPU (the first member of that CPU family with support for HW memory management). It ran AT&T Unix System V Release 2, and could boot and run an instance of SysVR2 in *one* MEGABYTE of RAM. Give it more and it flew. (Mine has 3.5MB RAM.)

It came with AT&T's "cc" C compiler, and I recall it being a built-in, not an add-on. Things were different back then. You used cc to compile individual modules of code. The output from cc was assembler, assembled to object code by as, and the object code modules (*.o files) were linked to form the intended executable by ld. (It was possible to stop at the "output to assembler" stage and hand optimize before handing off to as and then ld. There's a sketch of how that went at the end of this message.)

If memory serves, the C compiler was also a built-in on the AT&T 3B2 machines, which were based on a 32-bit Western Electric CPU and intended to be multi-user systems. (I was Tech Support Manager for a systems house that resold AT&T kit, and logged a lot of time on them.)

(And while the 3B1 was intended as a single user system, AT&T SysV was multi user. I had a client running a 3B1 with four Wyse 75 terminals and a printer attached, running a customized DBMS for distribution management. Performance was adequate, thank you.)

I was able to build Daniel Lawrence's version of MicroEMACS "out of the box" on my 3B1, with no fiddling with the supplied sources needed. Lawrence did a good job of confining non-portable code to architecture specific modules that only got built if the code was being built on that architecture.

I had fun implementing the WordStar command set in MicroEMACS, and on the 3B1 I could use ME's macro language to add support for an assortment of special keys on the 3B1 keyboard and have the editor do something sensible when they were pressed.

(And the 3B1 had an AT&T employee written package that was amusing. The 3B1 and sibling UNIX-PC were attempting to compete with PCs running DOS. So the AT&T folks came up with a package using shell script functions and aliases to provide equivalents of DOS commands, like DIR and FIND, which can be viewed as a special case of Unix grep. I giggled. There's a tiny reconstruction of the idea at the end of this message, too.)

> > Awk is still useful on *nix - various things like build recipes may
> > use it in scripts - but for most purposes, perl has replaced it. (I
> > consider that a pity.
>
> I respect Larry Wall and Perl but never learned it. Honestly, I
> dislike the various versions and non-standard incompatibilities.
> It's a bit too brittle to rely upon (isn't everything? even
> "standards" have many holes, buggy implementations; so, that's not
> really a Perl problem, per se, just "life").

I respect Wall, and perl, but know just enough of it to be dangerous. His "There's more than one way to do it!" dictum was both a blessing and a curse. There's a humorous list floating around comparing languages by how you shoot yourself in the foot. For perl, the entry is "You shoot yourself in the foot, but nobody else can figure out how you did it. 6 weeks later, neither can *you*." :-p

> > Awk is smaller and faster, and perl may be overkill for a lot of what you
> > might need to do.
>
> Sed is much lighter than AWK, so yes, even AWK can be overkill. Maybe
> even Sed is overkill for some things. (Sed came from Ed, so I guess
> Edlin would be loosely comparable in DOS circles.)

Whether Sed is lighter than awk may depend on what version you run on what hardware. But they have completely different use cases. You probably can't replace awk with sed, or vice versa, and you may need to use *both*.

> > Former Busybox maintainer Rob Landley griped elsewhere about sending
> > patches to remove the dependency on Perl from Linux kernel builds,
> > since awk did all that was needed, only to find it reappear again.)
>
> I once told you that he should just use (BSD-licensed) AWK from old
> Minix 2.x for ToyBox. Not sure if that's truly practical advice,
> though.

I suggested he look at the version of awk Brian Kernighan had on his web page. Kernighan was all in favor. But Rob is paranoid about licensing issues. Kernighan was working for AT&T when he wrote it, Bell Labs got spun off to Lucent Technologies, and Lucent merged with French vendor Alcatel to become Alcatel-Lucent. The code is probably abandonware at this point, but Brian suspects Alcatel-Lucent doesn't even recall they own it, and he has no idea who to ask there to get formal clearance. I think Rob could get away with just using it, but he's risk averse and is rolling his own.

He's also working on toysh, a bash compatible shell to include in Toybox. He thinks he can manage it in 3,500 lines of C. Right now, he's in a maze of twisty little corner cases, asking "What is Bash *doing* here?" Chet Ramey, the Bash maintainer, has been providing answers.

(I personally am disgusted when one open source project cannot use code from another because of incompatible licensing, with the GPL being the biggest problem child. Rob won't go anywhere near GPL code. For that matter, neither will Google. These days, if I have an option, I won't either.)

> Yes, Perl is overkill. You know, I rebuilt old NASM 0.98.39 [2005]
> recently for 8086. (Pre-existing 16-bit DOS binaries were 186 only,
> ugh, heheh.)

My old XT clone is on a shelf under my computer desk. I gave it a turbo 10MHz motherboard and an NEC V20 CPU. The V20 had optimized microcode, ran about 5% faster than the Intel 8088, and supported 186 instructions. :-)

(I recall seeing *one* PC clone in the old days, aimed at gamers, that specifically used the 80186 CPU. I saw the 80186 in add-on cards for Unix systems, where it was a controller for attached dumb terminals.)

> 0.98.39 used Perl, which works but is bloated (and our
> only DOS build still is DJGPP's old 5.8.8 from 2007). So I finagled it
> a bit just to use Sed (only), which is much smaller and simpler (thus,
> no Perl required). Oh, I also used AWK behind the scenes a bit to
> help. (I never properly learned Perl but do have a book on it.)
The significant thing for perl is that a DOS build *exists*. Whether anyone might have an actual use for it is another matter.

> Yeah, it's just a mess. Big projects are harder to maintain, and
> unfortunately DOS is not "top tier" for most actively-developed
> projects.

DOS probably isn't even on the radar screen. Why should it be? DOS was designed and implemented in the days when hardware was big, slow, limited, and expensive. Today, hardware is small, fast, and cheap. (Consider the Internet of Things. That *exists* because hardware capable of *running* a full TCP/IP stack and being a peer on the Internet is *cheap* enough that it can be deployed in places where we used to use 8-bit microcontrollers because anything else cost too much.)

There are people working in the embedded space still concerned with hardware limitations, but most folks aren't and don't need to be.

> Sorry for the ramble, it's just a minefield of tools out there. Still fun!

That's true for pretty much everything. Want a minefield? Develop in JavaScript. JavaScript is explicitly a "batteries not included" language. Doing anything serious requires libraries. There are more JS libraries out there than I can count, with Religious Arguments among developers about which to use.

______
Dennis
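
P.S. Since awk and sed in pipelines came up, here are the quick examples I promised. These are minimal sketches, untested as typed here, assuming plain POSIX awk and sed; the file names are just placeholders.

  # awk querying a text file and passing the results to something else:
  # pull the login names out of /etc/passwd and hand them to sort
  awk -F: '{ print $1 }' /etc/passwd | sort

  # sed performing *scripted* edits on what is fed to it, mid-pipeline:
  # strip comments and blank lines, then count what's left
  sed -e 's/#.*//' -e '/^ *$/d' some.conf | wc -l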
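
And the staged compile on the 3B1 went roughly like this. This is from memory, so treat the exact flags and file names as illustrative rather than gospel:

  cc -S hello.c          # stop after the compiler proper; emits hello.s (assembler source)
  vi hello.s             # hand optimize the assembler output, if you were so inclined
  as -o hello.o hello.s  # assemble to an object module
  cc -o hello hello.o    # let cc drive ld to pull in crt0/libc and produce the executable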
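
Finally, the flavor of that AT&T "DOS commands on Unix" package. This is my reconstruction of the idea, not their code, and the names are made up (I'm avoiding "find" so as not to shadow the Unix find command):

  # Bourne shell functions dressing up Unix tools as DOS-ish commands
  dir() { ls -l "$@"; }       # DIR as a dressed-up ls
  findstr() { grep "$@"; }    # DOS FIND is essentially a special case of grep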