On Wed, 25 Nov 2015 06:00 am, Random832 wrote:

> On 2015-11-24, Chris Angelico <ros...@gmail.com> wrote:
>> Probably the grammar. In other words, it's part of the language's very
>> definition.
>
> Then the definition is wrong. I think "literal" is a word whose meaning
> is generally agreed on, rather than something each language's spec can
> invent from whole cloth for itself. It's not a python term, it's a
> programming term.
Well, I don't know about that. According to you, and Ruby, this is a
literal:

    [x+1, y+2*x, func(arg), MyClass(a, b)]

http://ruby-doc.org/core-2.1.1/doc/syntax/literals_rdoc.html#label-Arrays

which seems like an abuse of the term to me. How can it be a *literal*
when it contains non-literal expressions whose values aren't known until
runtime? (See the postscript below for a concrete demonstration.)

Although I note that the actual examples of Array literals and Hash
literals in the Ruby docs punt on the issue by only showing expressions
that a peephole optimizer could replace with constants.

Lua, by contrast, refers to *string literals* but *table constructors*:

http://www.lua.org/manual/5.1/manual.html

Apart from strings, that manual avoids the use of "literal".

> And the documentation doesn't even use it consistently; it calls {} a
> literal.

Citation required.

-- 
Steven
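P.S. To make the runtime-construction point concrete, here is a minimal
sketch, assuming CPython and its standard `dis` module (the exact opcode
names vary between versions). A true literal compiles to a single
constant load, while list and dict displays compile to instructions that
build a new object at run time:

    import dis

    # A genuine literal: the compiler emits a single LOAD_CONST.
    dis.dis(compile('"abc"', '<example>', 'eval'))

    # A list display containing an expression: the compiler emits
    # LOAD_NAME for x, an add instruction, and BUILD_LIST, all of
    # which execute at run time.
    dis.dis(compile('[x+1]', '<example>', 'eval'))

    # Even an empty dict display is built at run time with BUILD_MAP.
    dis.dis(compile('{}', '<example>', 'eval'))

If {} really were a literal in the same sense as "abc", you would expect
a constant load rather than a BUILD_MAP instruction.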