Is $0$ a natural number?

$\begingroup$

Is there a consensus in the mathematical community, or some accepted authority, to determine whether zero should be classified as a natural number?

It seems as though formerly $0$ was considered in the set of natural numbers, but now it seems more common to see definitions saying that the natural numbers are precisely the positive integers.

$\endgroup$

9 Answers

$\begingroup$

Simple answer: sometimes yes, sometimes no; it's usually stated explicitly (or implied by notation). From the Wikipedia article:

In mathematics, there are two conventions for the set of natural numbers: it is either the set of positive integers $\{1, 2, 3, \dots\}$ according to the traditional definition; or the set of non-negative integers $\{0, 1, 2,\dots\}$ according to a definition first appearing in the nineteenth century.

That said, more often than not I've seen the natural numbers represent only the 'counting numbers' (i.e. excluding zero). This was the traditional historical definition, and it makes more sense to me. Zero is in many ways the 'odd one out': indeed, historically it was not discovered (described?) until some time after the natural numbers.

$\endgroup$ $\begingroup$

There is no "official rule"; it depends on what you want to do with the natural numbers. Originally they started from $1$, because $0$ was not given the status of a number.

Nowadays if you see $\mathbb{N}^+$ you may be assured that it means the numbers from $1$ upward; $\mathbb{N}$ usually means the numbers from $0$ upward.

[EDIT: the original definition of the Peano axioms may be found in Arithmetices principia: nova methodo.]

$\endgroup$ $\begingroup$

I think that modern definitions include zero as a natural number. But sometimes, especially in analysis courses, it can be more convenient to exclude it.

Pros of considering $0$ not to be a natural number:

  • generally speaking $0$ is not natural at all. It is special in so many respects;

  • people naturally start counting from $1$;

  • the harmonic sequence $1/n$ is defined for every natural number $n$;

  • the $1$st number is $1$;

  • in making limits, $0$ plays a role which is symmetric to $\infty$, and the latter is not a natural number.

Pros of considering $0$ a natural number:

  • the starting point for set theory is the empty set, which can be used to represent $0$ in the construction of the natural numbers; the number $n$ can then be identified with the set of the first $n$ natural numbers;

  • computers start counting from $0$ (see Dijkstra's note "Why numbering should start at zero");

  • the remainders in integer division by $n$ are the $n$ different numbers from $0$ to $n-1$;

  • it is easy to exclude a single distinguished element when we need the naturals without zero, whereas it is awkward to adjoin a new element when we do not already have one;

  • the integers, the reals, and the complex numbers include zero, which seems far more important than $1$ in those sets (they are symmetric with respect to $0$);

  • there is notation for sets without $0$ (for example $\mathbb R_0$ or $\mathbb R_*$) and for the positive numbers ($\mathbb R_+$), but no clear notation for a set with $0$ adjoined;

  • the degree of a polynomial can be zero, as can the order of a derivative.
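The set-theoretic bullet above can be sketched in a few lines of Python (a hypothetical illustration, not part of the answer): in the von Neumann construction, $0$ is the empty set and the successor of $n$ is $n \cup \{n\}$, so the number $n$ is literally the set of the first $n$ natural numbers.

```python
# Von Neumann construction of the naturals: 0 is the empty set,
# and the successor of n is n ∪ {n}.  frozenset is used so that
# sets can be elements of other sets.

def von_neumann(n):
    """Return the von Neumann ordinal representing the natural number n."""
    current = frozenset()              # 0 = {}
    for _ in range(n):
        current = current | {current}  # successor: n ∪ {n}
    return current

three = von_neumann(3)
print(len(three))                # the cardinality of n is n itself: 3
print(von_neumann(0) in three)   # 0 is one of the first 3 naturals: True

# The remainder bullet: remainders modulo n are exactly 0, ..., n-1.
print(sorted({k % 3 for k in range(10)}))   # [0, 1, 2]
```

Identifying $n$ with $\{0, \ldots, n-1\}$ is also one way to see why zero-based indexing feels natural to computer scientists: the valid indices into a sequence of length $n$ are exactly the elements of $n$.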

I have seen children measure things with a ruler by aligning the $1$ mark instead of the $0$ mark. It is difficult to explain to them why you have to start from $0$ when they are used to starting counting from $1$. The marks on the ruler identify the end of each centimeter, not the start, since the first centimeter goes from $0$ to $1$.

An example where counting from $1$ leads to somewhat misleading names is the naming of intervals between musical notes: the interval between C and F is called a fourth because there are four notes involved: C, D, E, F. However, the distance between C and F is actually three scale steps. This has the ugly consequence that a fifth above a fourth ($3+4$ steps) is an octave ($7$ steps), not a ninth! On the other hand, if you put your first finger on the C note of a piano, your fourth finger lands on the F note.

I would say that in natural language the correspondence between cardinal and ordinal numbers is off by one, thus distinguishing two sets of natural numbers, one starting from $0$ and one starting from 1st. The 1st of January was day number $0$ of the new year, and "zeroth" has no meaning in everyday language...

$\endgroup$ $\begingroup$

According to ISO 80000-2:2009: Quantities and Units - Part 2: Mathematical signs and symbols to be used in the natural sciences and technology, page 6:

$$\mathbb{N}=\{0,1,2,3,\ldots\}$$ $$\mathbb{N^*}=\{1,2,3,\ldots\}$$


$\endgroup$ $\begingroup$

These lecture notes from a combinatorics course given for many years by N.G. de Bruijn suggest a helpful alternative:

Due to the confusion caused by N. Bourbaki about the natural numbers, we feel obliged to define: $$\begin{align}\Bbb N_0 & = \{0,1,2,\ldots\}\quad \text{ and } \\ \Bbb N_1 & = \{1,2,3,\ldots\}. \end{align}$$

(Page 4)

$\endgroup$ $\begingroup$

There are the two definitions, as you say. However, the definition of the natural numbers as the strictly positive integers is actually the older one; including $0$ as a natural number first appeared in the nineteenth century.

Modern formulations of the Peano axioms take $0$ to be one, though, so if you are working with these axioms (and a lot of natural-number theory does) then you take $0$ to be a natural number.
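As a sketch (the names `MyNat` and `add` are invented for illustration, not from any standard library), the inductive presentation with $0$ as the base case can be written in Lean 4:

```lean
-- A Peano-style definition with zero as the first natural number.
inductive MyNat where
  | zero : MyNat                 -- 0 is a natural number
  | succ : MyNat → MyNat         -- every natural number has a successor

-- Addition defined by recursion on the second argument,
-- with zero acting as the identity element.
def MyNat.add : MyNat → MyNat → MyNat
  | m, .zero   => m
  | m, .succ n => .succ (m.add n)
```

Starting from $1$ instead would cost you the additive identity: with `zero` as the base case, `m.add .zero = m` holds by definition.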

$\endgroup$ $\begingroup$

I remember all of my courses at university using only the positive integers (not including $0$) for the natural numbers. It's possible the Maths Faculty had come to an agreement, but in at least two courses we generated the set of natural numbers in ways that wouldn't make sense if $0$ were included.

One involved the cardinality of sets of sets; the other defined the natural numbers in terms of the number $1$ and addition only ($0$ and the negative integers come into the picture later, when you define an inverse to addition).

As a result, when teaching the difference between integers and natural numbers, I always present $0$ as an integer that is not a natural number.

$\endgroup$ $\begingroup$

The Peano-Dedekind axioms (as used in proving propositions by the Principle of Mathematical Induction) define $\mathbb{N}$ as either $\mathbb{N} = \mathbb{Z}^+ \cup \{0\} = \{0, 1, 2, \ldots\}$ or $\mathbb{N} = \mathbb{Z}^+ = \{1, 2, 3, \ldots\}$; that is, it depends on the context (usually this "context" can be seen from the given proposition to be proved, at least when using PMI).

$\endgroup$ $\begingroup$

Peano used $1$ as the first natural number in his arithmetic theory. Geometrically: if a line is breadthless with infinitely many points, then the first line you can create is a $1$; written as $1$, as if on a map, you could scale it. This $1$ can then be used as a metric to index the length of each successive natural number. On a Cartesian graph, $0$ is used to define one end point of the $1$, but without the second end point at $1$ there is no length, and no number according to some. Also consider: if you were creating a set of line segments, each representing a natural number, what would $0$ look like? There would be no length to $0$, and hence it is not a member of the set of lines.

$\endgroup$
