On 13-07-16 at 10:49, Steven D'Aprano wrote:
> On Wednesday 13 July 2016 17:05, Lawrence D’Oliveiro wrote:
>
>> On Wednesday, July 13, 2016 at 6:22:31 PM UTC+12, Ian wrote:
>>
>>> I never claimed it's not useful. I don't really have a problem with
>>> format supporting it, either. But if it does, then don't call it
>>> "precision".
>> Like it or not, that is the accepted term, as used in the printf(3) man page.
>>
>> Feel free not to use common accepted terms if you don’t want. You can use
>> words to mean whatever you avocado, but don’t expect other people to carrot.
> +1 QOTW
But as far as I know, "significant digits" is not the accepted term for
the width of the representation when you print a number with zeroes
padded in front of it. So when we start with a term like "precision" and
people jump from that to "significant digits", maybe we should consider
how confusing the commonly accepted term can be.
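As a minimal sketch of the ambiguity (assuming CPython 3; the exact
wording of the error message may differ between versions): printf-style
%-formatting accepts a "precision" on integers and treats it as a minimum
number of digits, padding with leading zeroes, while format() rejects a
precision for integers and spells zero-padding as a width instead.

    # printf-style formatting: "precision" on an integer means
    # "minimum number of digits", filled out with leading zeroes --
    # nothing to do with significant digits.
    print('%.5d' % 42)        # -> 00042

    # On a float, the same "precision" means digits after the decimal point.
    print('%.5f' % 3.14)      # -> 3.14000

    # format() / str.format() refuse a precision for integers altogether:
    try:
        format(42, '.5d')
    except ValueError as e:
        print(e)              # e.g. "Precision not allowed in integer format specifier"

    # Zero-padding to a given width is spelled separately there:
    print(format(42, '05d'))  # -> 00042

-- 
Antoon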