Steven D'Aprano <st...@pearwood.info>:

> I think that we can equally choose the natural numbers to be
> axiomatic, or sets to be axiomatic and derive natural numbers from
> them. Neither is more correct than the other.
Mathematicians quit trying to define what natural numbers *mean* and
simply chose a standard enumerable sequence as *the* set of natural
numbers. That's analogous to physicists defining the meter as a
particular rod in a vault in Paris: not even the length of the rod, but
the rod itself. To modern mathematicians, the concept "three" simply
means the fourth element in the standard enumeration.

When mathematicians need to use natural numbers to count, they have to
escape to predicate logic. For example, to express that a natural
number n has precisely two divisors, you have to say

   ∃x∈N ∃y∈N (¬x=y ∧ x|n ∧ y|n ∧ ∀z∈N (z|n → z=x ∨ z=y))

(which avoids counting) or:

   ∃B∈P({x∈N : x|n}×2)
     (∀x∈{x∈N : x|n} ∃y∈2 ((x,y)∈B ∧ ∀z∈2 ((x,z)∈B → y=z))
      ∧ ∀x∈2 ∃y∈{x∈N : x|n} ((y,x)∈B ∧ ∀z∈{x∈N : x|n} ((z,x)∈B → y=z)))

This latter clumsy expression captures the notion of counting: it
asserts that there is a bijection B between the set of divisors of n
and the set 2 = {0, 1}. However, in programming terms, you could say
counting is not first-class in mathematics. Counting is done as a
"macro", if you will.

If the logicians had managed to define natural numbers the way they
wanted, counting would be first-class and simple:

   {x∈N : x|n}∈2

(with 2 being the class of all two-element sets, membership in 2 would
simply mean having two elements).

Marko

--
https://mail.python.org/mailman/listinfo/python-list
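Since this is python-list, here is a sketch (my own illustration, not
part of the formalism above) of the standard enumeration in question:
the von Neumann construction, where each natural number is the set of
all smaller naturals, so 2 really is {0, 1}. The "macro" nature of
counting then shows up as checking for a bijection with a numeral:

```python
def von_neumann(n):
    """Von Neumann encoding of n: n = {0, 1, ..., n-1}.

    Frozensets stand in for sets so that they can nest.
    """
    result = frozenset()
    for _ in range(n):
        result = result | {result}   # successor: S(k) = k ∪ {k}
    return result

def divisors(n):
    """The set {x∈N : x|n} from the formulas above."""
    return {d for d in range(1, n + 1) if n % d == 0}

def has_bijection(a, b):
    # For finite sets, a bijection exists iff the sizes agree; this
    # one-liner is what the clumsy ∃B∈P(...×2) formula spells out.
    return len(a) == len(b)

two = von_neumann(2)   # frozenset({frozenset(), frozenset({frozenset()})})

# n is prime iff its divisor set is in bijection with 2 = {0, 1}:
print(has_bijection(divisors(7), two))   # True: 7 has exactly two divisors
print(has_bijection(divisors(8), two))   # False: 8 has four divisors
```

The point stands, of course: len() is doing the counting for us, which
is exactly the first-class operation the formalism lacks.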