It is clear that some collections *could* support a more efficient last:
anything with random access, and anything that supports rseq (e.g., sorted
collections), can reach the last element without a linear scan.

There are multiple ways to accomplish this; I presume a protocol would do
the trick.
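To make the idea concrete, here is a minimal sketch of what such a protocol
might look like. The names (FastLast, fast-last) are hypothetical, not part
of clojure.core, and the dispatch targets are just one plausible factoring:

```clojure
;; Hypothetical protocol: dispatch to an efficient implementation
;; where one exists, falling back to the generic linear scan.
(defprotocol FastLast
  (fast-last [coll] "Return the last element, efficiently where possible."))

(extend-protocol FastLast
  clojure.lang.Reversible
  (fast-last [coll] (first (rseq coll))) ; vectors, sorted maps/sets: O(1) or O(log n)

  Object
  (fast-last [coll] (last coll))         ; generic O(n) fallback for seqs

  nil
  (fast-last [_] nil))
```

Because vectors and sorted collections implement clojure.lang.Reversible,
they pick up the fast path automatically, while lists and lazy seqs fall
through to the Object case and behave exactly as last does today.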

Perhaps the original design decision is easily justified in terms of the
original way the collections were factored into various interfaces, but now
that it is so easy to make functions polymorphic over different types via
protocols, I can't imagine this would be a difficult change if anyone cared
to do it.

I don't think anyone is arguing that the current semantics aren't well
documented; the claim is that it violates the "Principle of Least Surprise"
in the sense that most people expect a built-in core function to be
implemented efficiently for the cases when it can be implemented
efficiently.  Everyone knows that a vector, for example, is designed to
provide very fast access to its last element, so it is counterintuitive
that a function called "last" ignores this capability and treats the
vector as a generic sequence, walking through the items one by one to
reach the last element.
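The cost difference is easy to see at the REPL. On a vector, last walks
the whole sequence, while peek (or an explicit nth) gets the same element
in effectively constant time:

```clojure
(def v (vec (range 1000000)))

(last v)                 ; O(n): realizes a seq and walks a million elements
(peek v)                 ; effectively O(1): vectors know their own end
(nth v (dec (count v)))  ; also effectively O(1), via random access
```

All three return the same value; only the first pays a linear cost.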

BTW, I disagree with Warren's comment about how a better last would
eliminate the need for peek.  Even if you have last, peek is a very useful
way of guaranteeing consistent stack semantics for a wide variety of
collection types.
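The distinction is that peek looks at each collection's "natural" end, the
same end that conj adds to and pop removes from, whereas last always means
the final element of the seq. A quick illustration:

```clojure
(peek [1 2 3])   ;=> 3  (vectors conj at the back)
(peek '(1 2 3))  ;=> 1  (lists conj at the front)
(last '(1 2 3))  ;=> 3  (always the final element, regardless of type)
```

So even with an efficient last, peek remains the right tool whenever you
want stack behavior that stays consistent across vectors and lists.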
