Joachim Durchholz wrote:
> You can have aliasing without pointers; e.g. arrays are fully sufficient.
> If i = j, then a [i] and a [j] are aliases of the same object.
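(The quoted claim can be sketched in a few lines of Python; the list and index values here are just hypothetical illustrations:)

```python
# Aliasing without pointers: when i == j, a[i] and a[j] name the
# same element, so a write through one "name" is visible through
# the other.
a = [10, 20, 30]
i, j = 1, 1          # i = j, as in the quote

a[i] = 99            # write through one name
print(a[j])          # read through the other name: prints 99
```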
I am having a hard time with this very broad definition of aliasing.
Would we also say that a[1+1] and a[2] are aliases? It seems to me
that, above, we have only a, and with only one variable there can be
no aliasing.

A further question: given a 32-bit integer variable x, and offsets i
and j (equal, as in the above example), would you say that
x |= (1 << i) and x |= (1 << j) are aliased expressions for setting a
particular bit in x? I am not being facetious; I am trying to
understand the limits of your definition of aliasing.

> (After I observed that, I found it no longer a surprise that array
> optimizations are what Fortran compiler teams sink most time into. The
> aliasing problems are *exactly* the same as those with pointers in C -
> though in practice, the Fortranistas have the advantage that the
> compiler will usually know that a [i] and b [j] cannot be aliases of
> each other, so while they have the same problems, the solutions give the
> Fortranistas more leverage.)

I don't understand this paragraph. On the one hand you seem to be
saying that C and Fortran are identically burdened with the troubles
caused by aliasing, yet a moment later you seem to be saying the
situation is distinctly better in Fortran.

> > What term would you use? First-class variables?
>
> I think it's more a quasi-continuous spectrum.
>
> On one side, we have alias-free toy languages (no arrays, no pointers,
> no call-by-reference, just to name the most common sources of aliasing).
>
> SQL is slightly more dangerous: it does have aliases, but they are
> rarely a problem (mostly because views aren't a very common thing, and
> even if they are used, they aren't usually so abstract that they really
> hide the underlying dependencies).
>
> Next are languages with arrays and call-by-reference. "Pascal without
> pointers" would be a candidate.
> Here, aliasing occurs, and it can and does hide behind abstraction
> barriers, but it's not a serious problem unless the software becomes
> *really* large.
>
> The end of the line are languages that use references to mutable data
> everywhere. All OO languages are a typical example of that.

Now with this, it appears you are agreeing that SQL has an advantage
vis-a-vis aliasing compared to OO languages. Yes? If so, we are
agreeing on the part I care about, and the specifics of just what we
call aliasing are not so important to me.


Marshall

-- 
http://mail.python.org/mailman/listinfo/python-list