On Wed, Aug 29, 2012 at 2:04 AM, erik quanstrom <quans...@quanstro.net> wrote:
>> > the haahr/rakitzis es' if makes more sense, even if it's weirder.)
>>
>> Agreed; es would be an interesting starting point for a new shell.
>
> es is great input.  there are really cool ideas there, but it does
> seem like a lesson learned to me, rather than a starting point.
Starting point conceptually, if not in implementation.

>> I think in order to really answer that question, one would have to
>> step back for a moment and really think about what one wants out of a
>> shell.  There seems to be a natural conflict between a programming
>> language and a command interpreter (e.g., the 'if' vs. 'if not'
>> thing).  On which side does one err?
>
> since the raison d'ĂȘtre of a shell is to be a command interpreter, i'd
> go with that.

Fair enough, but that will color the flavor of the shell when used as a
programming language.  Then again, Inferno's shell was able to
navigate both successfully and comfortably by using clever facilities
available in that environment (module loading and the like).  It's not
clear how well that works in an environment like Unix, let alone Plan 9.

>> I tend to agree.  As a command interpreter, rc is more or less fine as
>> is.  I'd really only feel motivated to change whatever people felt
>> were common nits, and there are fairly few of those.
>
> there are nits of omission, and those can be fixable.  ($x(n-m) was added)

Right.

>> > perhaps (let's hope) someone else has better ideas.
>>
>> Well, something off the top of my head: Unix pipelines are sort of
>> like chains of coroutines.  And they work great for defining linear
>> combinations of filters.  But something that may be interesting would
>> be the ability to allow the stream of computations to branch; instead
>> of pipelines being just a list, make them a tree, or even some kind of
>> dag (if one allows for the possibility of recombining streams).  That
>> would be kind of an interesting thing to play with in a shell
>> language; I don't know how practically useful it would be, though.
>
> rc already has non-linear pipelines.  but they're not very convenient.

And somewhat limited.
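As an aside, a sketch of what exists today: rc's non-linear pipelines let a command read from (or write to) other commands by name, e.g. `cmp <{old} <{new}` in rc syntax. A crude output 'fanout' can be built in a POSIX shell from tee, though it is an ad hoc construction rather than a first-class primitive; all file names below are made up for illustration.

```shell
#!/bin/sh
# rc's non-linear pipelines (rc syntax, for comparison):
#     cmp <{old} <{new}
#
# A crude fanout sketch in POSIX sh: one stream feeds two independent
# consumers by way of tee.  This is the kind of thing a shell with
# first-class branching pipelines would express directly.
tmp=$(mktemp -d)
printf '%s\n' alpha beta gamma | tee "$tmp/a" > "$tmp/b"
# both copies carry the complete stream
cmp "$tmp/a" "$tmp/b" && echo identical
rm -rf "$tmp"
```

The limitation, of course, is that this duplicates the stream into files (or named pipes) rather than composing consumers in the pipeline notation itself.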
There's no real concept of 'fanout' of output, for instance (though
that's a fairly trivial command, so it probably doesn't count), or of
multiplexing input from various sources, which would be needed to
implement something like a shell-level data-flow network.

Muxing input from multiple sources is hard when the data isn't somehow
self-delimited.  For specific applications this is solvable by the
various pieces of the computation simply agreeing on how to represent
the data and having a program that takes that into account do the
muxing, but for a general mechanism it's much more difficult, and the
whole self-delimiting thing breaks the Unix 'data as text' abstraction
by imposing a more rigid structure.

There may be other ways to achieve the same thing; I remember that the
boundaries of individual writes used to be preserved on read, but I
think that behavior changed somewhere along the way; maybe with the
move away from streams?  Or perhaps I'm misremembering?  I do remember
that it led to all sorts of hilarious arguments about what things like
'write(fd, "", 0)' should induce on the reading side, but this was a
long time ago.  Anyway, maybe something along the lines of 'read a
message of length <= SOME_MAX_SIZE from a file descriptor; the message
boundaries are determined by the sending end and preserved by
read/write' could be leveraged here without too much disruption to the
current model.

> i think part of the problem is answering the question, what problem
> would we like to solve.  because "a better shell" just isn't
> well-defined enough.

Agreed.

> my knee-jerk reaction to my own question is that making it easier
> and more natural to parallelize dataflow.  a pipeline is just a really
> low-level way to talk about it.  the standard
>
> 	grep x *.[ch]
>
> forces all the *.[ch] to be generated before 1 instance of grep runs on
> whatever *.[ch] evaluates to be.
>
> but it would be okay for almost every use of this if *.[ch] were generated
> in parallel with any number of grep's being run.
>
> i suppose i'm stepping close to sawzall now.

Actually, I think you're stepping closer to the reducers stuff Rich
Hickey has done recently in Clojure, though there's admittedly a lot of
overlap with the sawzall way of looking at things.

	- Dan C.
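[P.S. The overlapped-glob idea above can be approximated with existing tools: stream file names to grep as the traversal produces them, fanned out over several grep processes. A rough sketch; note that xargs -P is a GNU/BSD extension rather than guaranteed POSIX, and the pattern and paths are illustrative.]

```shell
#!/bin/sh
# Approximate 'generate *.[ch] in parallel with any number of greps':
# find emits names as it discovers them, and xargs dispatches them to
# up to 4 concurrent greps, 16 files per invocation.  -print0/-0 keep
# odd file names intact; -P is a common (non-POSIX) xargs extension.
find . -name '*.[ch]' -print0 |
    xargs -0 -n 16 -P 4 grep -l x
```

This only parallelizes one stage, of course; the thread's point stands that a shell could express such dataflow directly instead of via xargs plumbing.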