On Wed, Nov 14, 2007 at 04:44:25PM -0600, Nick Apperson wrote:
> perhaps this is an obvious statement...  The best language depends on the
> way in which your program works.  Having used C++ extensively, my program
> designs naturally fit easily into that language.  I'm sure a lisp programmer
> would think of better solutions that would only work in lisp.  As far as
> languages about restriction, well.... #&$* those languages.  I make sharper
> turns without training wheels thank you very much.

I agree it depends on how the program works, and I would even grant
that often you have a large amount of design freedom to choose how the
program works. In Go programming today there seems to be an especially
large amount of freedom, because we are deeply uncertain how a good
program should work. That freedom isn't guaranteed, though: sometimes
a problem domain can push you pretty hard into using a feature. E.g.,
who needs more than 16-bit integers? Sometimes nobody does. I think it
should be easy to write a reasonably serious Go program of
conventional design in a language supporting no numeric types other
than 16-bit integers. But if you were in the problem domain of atomic
physics simulations, the first thing you'd want to do is change
languages or, failing that, at least write a floating point library in
terms of the built-in 16-bit integer support. Not only floating point
numbers but also heap allocation, recursion, and inheritance are examples
of abstractions that people working in particular problem domains can
avoid for years with no difficulty, but if you get into a problem
domain that calls for them, it can be really painful to do without
them, regardless of whether you have been trained in a language and
style which encourages them. I think it's pretty likely that someday
we'll learn that the Go problem domain pushes that way for dynamic
memory ("heap") allocation, but YMMV.
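
To make the 16-bit-integers-only scenario above concrete, the
workaround looks roughly like the following. This is only an
illustrative C++ sketch of fixed-point arithmetic rather than a real
floating point library, and the 32-bit intermediate in fix_mul is a
cheat that a genuinely 16-bit-only language would make you do by
hand, but it shows the flavor of building a missing numeric
abstraction out of bare integer operations:

// Q8.8 fixed-point numbers faked up out of 16-bit integers.
#include <cstdint>
#include <cstdio>

typedef int16_t fix8_8;   // interpreted as raw_value / 256.0

fix8_8 fix_from_int(int16_t n) { return static_cast<fix8_8>(n * 256); }
double fix_to_double(fix8_8 x) { return x / 256.0; }

fix8_8 fix_add(fix8_8 a, fix8_8 b) { return static_cast<fix8_8>(a + b); }

// The product of two Q8.8 values needs a 32-bit intermediate; with
// nothing but 16-bit integers you would split this step by hand.
fix8_8 fix_mul(fix8_8 a, fix8_8 b) {
    int32_t wide = static_cast<int32_t>(a) * static_cast<int32_t>(b);
    return static_cast<fix8_8>(wide / 256);
}

int main() {
    fix8_8 three = fix_from_int(3);
    fix8_8 half = 128;   // 0.5 == 128/256
    std::printf("3 * 0.5 = %f\n", fix_to_double(fix_mul(three, half)));
    std::printf("3 + 0.5 = %f\n", fix_to_double(fix_add(three, half)));
    return 0;
}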

I'm in partial agreement with you on the (un)desirability of
restrictions imposed by programming languages. My disagreement is that
some significant fraction of the time, the restrictions of some
programming language can turn out to be a good fit to what you're
doing. When one is doing a lot of work in such a domain, a language
which imposes tasteful restrictions that make it easier to reason
about the program can actually be quite valuable. (Tasteless
restrictions are pretty
useless: I will never use a language which lets me express no numbers
other than primes between 1010 and 1460.)

As an example that you might
agree on (at least agreeing that it's valuable, but perhaps needing to
be convinced that it should be considered a restriction), I bet that
as a happy C++ programmer you value the guarantees provided by the
static type system when you reason about a program. But fundamentally
the C++ static type system (like the fancier ones in ML and Haskell)
is a restriction which limits some valid computations a program could
do in a dynamically-typed language (like Lisp and many scripting
languages). C++ and ML and Haskell are still Turing-complete, of
course, so you can still do those valid computations somehow,
typically by faking up dynamic types on top of the static type system.
But doing that is clumsy enough to make it pretty clear that you're
working around a restriction. (Similarly, Fortran 77 didn't have heap
allocation, but programmers could fake it up by allocating one monster
array and then reusing parts of it for different things. Ew.) The
Turbak and Wells paper "Cycle Therapy" is my current favorite example
of having to fake up a dynamic type system in order to express a
reasonable computation --- the "rose trees" they define are pretty
weakly typed compared to the usual objects in the SML and Haskell type
systems. (Sorry, no C++ in that paper, but I think the effect should
be the same in C++.) I have worked on similar code in CL, and it seems
to just naturally fit into the CL dynamic type system with no hassle.
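
If it helps to see what that faking-up looks like in practice, here
is a minimal C++ sketch (my own illustration, not anything from the
Turbak and Wells paper): a rose-tree-ish node whose payload is
discriminated by a hand-rolled tag, so every use site has to
re-inspect the tag at run time, much as a Lisp program inspects the
runtime type of a value, and the static type checker can no longer
tell you which field is actually meaningful:

#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

// A rose tree whose leaves can hold more than one kind of payload.
// The tag plays the role of the runtime type information that a
// dynamically typed language maintains for us automatically.
struct Node {
    enum Tag { INT_LEAF, STRING_LEAF, BRANCH };
    Tag tag;
    int i;                       // meaningful only when tag == INT_LEAF
    std::string s;               // meaningful only when tag == STRING_LEAF
    std::vector<Node> children;  // meaningful only when tag == BRANCH
};

// Every consumer dispatches on the tag by hand; the static type
// system no longer guarantees that we read the right field.
void print(const Node& n) {
    switch (n.tag) {
    case Node::INT_LEAF:
        std::cout << n.i;
        break;
    case Node::STRING_LEAF:
        std::cout << '"' << n.s << '"';
        break;
    case Node::BRANCH:
        std::cout << '(';
        for (std::size_t k = 0; k < n.children.size(); ++k) {
            if (k > 0) std::cout << ' ';
            print(n.children[k]);
        }
        std::cout << ')';
        break;
    }
}

int main() {
    Node leaf1;
    leaf1.tag = Node::INT_LEAF;
    leaf1.i = 42;
    Node leaf2;
    leaf2.tag = Node::STRING_LEAF;
    leaf2.s = "hello";
    Node root;
    root.tag = Node::BRANCH;
    root.children.push_back(leaf1);
    root.children.push_back(leaf2);
    print(root);   // prints (42 "hello")
    std::cout << std::endl;
    return 0;
}

The hand-written switch is exactly the dispatch that CL does for you
at run time via typecase or generic functions, which is why the same
kind of tree feels frictionless there.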

Also I bet that you welcome --- or take for granted as an obviously
good thing --- C++'s prohibitions on bizarre constructs like a "goto"
statement which leaps out of the body of one function into the body of
another. (C++ doesn't support the Fortran-mythos/Intercal "come from"
construct, http://www.fortran.com/come_from.html, either. :-) Such
restrictions make it so much easier to reason about the vast majority
of the programs we want to write that we'd need an amazingly
compelling reason to want to relax them.

So I think I disagree with your characterization of restrictions as
training wheels. Some restrictions *are* like training wheels. But
some are more like choosing to commute in a car instead of a
motorcycle. Some are like having a highrise balcony which is a
continuous sheet of reinforced concrete with a railing instead of just
a few scattered small pads where the architect anticipates you are
most likely to want to place your feet. :-| And some are controversial:
you might not be happy programming in a language which supports
"continuations" (feeling more or less the same enthusiasm as you'd
feel for "come from," because continuations are about that confusing,
and significantly harder to explain :-) but it turns out that there are
some pretty compelling use cases to justify the confusion created by
continuations, so some people in some problem domains are justifiably
enthusiastic about them.
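
For a C++ reader the least confusing approximation is probably
continuation-passing style, where "the rest of the computation" is
passed around as an explicit function instead of being captured by
the language. A contrived sketch of my own, nothing more:

#include <cstddef>
#include <functional>
#include <iostream>
#include <vector>

// Instead of returning, find_even hands control to one of two
// explicit continuations: "the rest of the computation" for the
// success case and for the failure case.
void find_even(const std::vector<int>& xs,
               std::size_t k,
               const std::function<void(int)>& found,
               const std::function<void()>& not_found) {
    if (k == xs.size())  { not_found(); return; }
    if (xs[k] % 2 == 0)  { found(xs[k]); return; }
    find_even(xs, k + 1, found, not_found);
}

int main() {
    std::vector<int> xs;
    xs.push_back(7);
    xs.push_back(3);
    xs.push_back(10);
    xs.push_back(5);
    find_even(xs, 0,
              [](int x) { std::cout << "first even: " << x << "\n"; },
              []()      { std::cout << "no even element\n"; });
    return 0;
}

First-class continuations let the language capture that "rest of the
computation" for you at any point, which is what makes them useful
for things like backtracking search and coroutines, and also what
makes them hard to explain.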

And so horses for courses again. I hate restrictions about as much as
you do when I am stuck with a language which has restrictions which
seem a poor fit to the problem at hand, as in the Prolog or Haskell
examples in my original post, or in the misfit between SML and "Cycle
Therapy" above. But when the restrictions are a pretty good fit to the
problem at hand --- e.g., for you, probably the C++ static type system
for many problems you work on; for me, the SML type system for the
higher-order function code in Umut Acar's thesis --- they can be very
helpful for the programmer. (And in some cases, like imposing static
type systems or forbidding continuations or allowing only C++-style
single dispatch instead of CLOS-style dynamic multiple
dispatch, they can be very helpful for the efficiency of compiled code
too.)
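
To make that last dispatch point concrete, here is an illustrative
C++ sketch (mine, not anything from CLOS): C++ chooses among
overloads using the static types of the arguments and does runtime
dispatch on at most the receiver, so a call can compile to a direct
jump or a single vtable lookup, whereas a CLOS generic function would
consult the runtime classes of every argument:

#include <iostream>

struct Shape  { virtual ~Shape() {} };
struct Circle : Shape {};
struct Square : Shape {};

// Overload resolution happens at compile time, on the static types.
void collide(Shape&, Shape&)   { std::cout << "shape/shape\n"; }
void collide(Circle&, Square&) { std::cout << "circle/square\n"; }

int main() {
    Circle c;
    Square s;
    Shape& a = c;
    Shape& b = s;
    collide(a, b);   // "shape/shape": runtime classes are never consulted
    collide(c, s);   // "circle/square": chosen statically from declarations
    return 0;
}

That restriction is part of why the C++ call is so cheap; the flip
side is that getting circle/square behavior for the first call means
hand-writing double dispatch.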

-- 
William Harold Newman <[EMAIL PROTECTED]>
PGP key fingerprint 85 CE 1C BA 79 8D 51 8C  B9 25 FB EE E0 C3 E5 7C
"The objective is not to convince someone with your arguments but 
to provide the arguments with which he later convinces himself."
  -- Milton Friedman (probably paraphrased)
_______________________________________________
computer-go mailing list
computer-go@computer-go.org
http://www.computer-go.org/mailman/listinfo/computer-go/
