Today, we begin a historical journey from actual infinity to potential infinity and back again. We will part ways with ultrafilters for this first leg; they will get a needed rest before making a spectacular reappearance on the return trip.

In a previous post, we discussed the use of infinitesimals by Galileo and some of his contemporaries and the controversy that they caused in the scientific and religious circles of Europe during the 17th century. At the end of the 17th century, the infinitesimal took an even larger role in mathematics as one of the central players in the calculus of Newton and Leibniz. To illustrate how infinitesimals were used, let us consider the derivative, one of the fundamental concepts of calculus.

Suppose f:\mathbb{R} \rightarrow \mathbb{R} is a function. The derivative of f (which exists provided f is a sufficiently well-behaved function) measures the rate of change of the value of f with respect to changes in its input. For example, suppose the variable x denotes time and f(x) denotes the position of a car along a road at time x. Then the derivative of f at a time x_0 (denoted, in Leibniz’s notation, by \frac{df}{dx}(x_0)) measures the velocity of the car at time x_0.

If f is a linear function (so its graph is a straight line), then the derivative of f is simply the slope of that line. In our example, this would correspond to a situation in which the car was traveling at a constant velocity. But what if f is a more complicated function? In this case, a fruitful approach is to try to approximate the graph of f by straight lines. (This is a common tactic in mathematics: given a complicated structure that you don’t know how to deal with, try to approximate it with simpler things that you do know how to deal with.)

In practice, this might look as follows. To find the derivative of f at time x, choose a time x_1 close to (but different from) x, draw the line connecting the points (x, f(x)) and (x_1, f(x_1)), and compute the slope of this line. Doing the algebra, this slope turns out to be:

\frac{f(x_1) - f(x)}{x_1 - x}

If we rewrite x_1 as x + h, then this becomes:

\frac{f(x + h) - f(x)}{h}

The situation is illustrated in the following picture.

[Figure: the derivative of f at x is approximated by the slope of the line connecting (x, f(x)) and (x+h, f(x+h)).]
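To put a rough number on this picture, here is a small illustrative computation (the function and values are my own choice, not taken from the figure): let f(x) = x^2 and x = 1. With h = 0.1, the slope of the line through (1, f(1)) and (1.1, f(1.1)) is

\frac{f(1.1) - f(1)}{0.1} = \frac{1.21 - 1}{0.1} = 2.1

and with h = 0.01 the same computation gives 2.01, already quite close to the slope of the tangent line at (1, 1), which is 2.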

The smaller h becomes, the closer the slope of the line between (x, f(x)) and (x+h, f(x+h)) gets to the actual derivative of f at x (which is the slope of the line tangent to the curve at the point (x, f(x))). Therefore, one might think, in order to find the true derivative of f, we just have to take h to be really small. But how small is small enough? Well, h certainly needs to be smaller in magnitude than every positive real number, since, as the picture above illustrates, taking h to be a non-zero real number gives only an approximation (though an arbitrarily good one) to the actual derivative. However, h cannot be equal to 0, since, if h = 0, then

\frac{f(x + h) - f(x)}{h} = \frac{f(x)-f(x)}{0} = \frac{0}{0}

which is of course undefined. The obvious answer, at least to those who accept the use of such things, is to let h be a non-zero infinitesimal. This is essentially the approach taken by Newton and Leibniz. (I am of course simplifying things, but this is the general idea.)
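To see what such a computation might have looked like, take the illustrative function f(x) = x^2 (the example is mine and is not meant to reproduce any particular argument of Newton or Leibniz). With h a non-zero infinitesimal, we have

\frac{f(x+h) - f(x)}{h} = \frac{(x+h)^2 - x^2}{h} = \frac{2xh + h^2}{h} = 2x + h

Since h is non-zero, the division is legitimate; and since h is infinitesimally small, it is then discarded, leaving 2x as the derivative. It is precisely this “use it, then throw it away” treatment of h that drew the criticisms described below.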

The formulation of calculus by Newton and Leibniz was a great success and led to many further advances in science and mathematics, but the controversy surrounding the use of infinitesimals did not go away. As we mentioned in the earlier post, indiscriminate use of infinitesimals often led to paradox. There were also attacks on ontological grounds. Opponents of the use of infinitesimals claimed that their proponents wanted things both ways: on the one hand, infinitesimals should have positive magnitude to avoid undefined expressions such as \frac{0}{0}; on the other hand, this positive magnitude should be smaller than any real positive magnitude. These two demands seemed contradictory. In addition, there was the observation that, if h is a positive infinitesimal, then \frac{1}{h} is a positive infinite number (since h < \frac{1}{n} for every natural number n, we have \frac{1}{h} > n for every natural number n), clashing with the orthodoxy of the time that actual infinity should not be present in mathematics.

These objections essentially amounted to an appeal to what came to be known as the Archimedean property as applied to the set of real numbers. Briefly, the Archimedean property states that there are no infinite or infinitesimal elements in a given structure. More precisely, in the context of the real numbers (the statement is the same for any ordered field), the Archimedean property states that, if x and y are any two positive numbers, then there is a natural number n such that nx > y. The Archimedean property appeared in Euclid’s Elements and was given its name by the mathematician Otto Stolz in the 1880s. Stolz did much work on extensions of the real numbers that do not satisfy the Archimedean property. Interestingly and rather surprisingly, Cantor called this work an “abomination” and published a sketch of a purported proof of the non-existence of infinitesimals.

[Figure: an illustration of the Archimedean property, which essentially states that any positive magnitude can be covered by finitely many copies of any other. Here, 4 copies of A are sufficient to cover B.]
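In symbols, here is a concrete instance of the property (the values are chosen purely for illustration): with x = 0.003 and y = 7, the natural number n = 2334 suffices, since

2334 \cdot 0.003 = 7.002 > 7

More generally, any natural number n greater than \frac{y}{x} works.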

The standard real numbers, \mathbb{R}, satisfy the Archimedean property. However, if \mathbb{R} is extended to include infinitesimals, then the Archimedean property fails. To see this, let x be a positive infinitesimal, and let y = 1. Since x is an infinitesimal, we must have x < \frac{1}{n} for every natural number n, so nx < 1 for all natural n and thus x and y witness the failure of the Archimedean property.

One of the most prominent opponents of infinitesimals was Bishop Berkeley, who, in 1734, published The Analyst, an attack on the foundations of infinitesimal calculus with the wonderful subtitle, A DISCOURSE Addressed to an Infidel MATHEMATICIAN. WHEREIN It is examined whether the Object, Principles, and Inferences of the modern Analysis are more distinctly conceived, or more evidently deduced, than Religious Mysteries and Points of Faith. In the following memorable passage, Berkeley calls into question the nature and existence of infinitesimals (in what follows, ‘fluxions’ refers to a notion of Newton’s closely related to the derivative):

And what are these Fluxions? The Velocities of evanescent Increments? And what are these same evanescent Increments? They are neither finite Quantities nor Quantities infinitely small, nor yet nothing. May we not call them the ghosts of departed quantities?

[Image: Bishop Berkeley]

Newton, Leibniz, and like-minded mathematicians of course defended their use of infinitesimals. Leibniz explicitly justified their use in his Law of Continuity, expressed in a manuscript in 1701:

In any supposed continuous transition, ending in any terminus, it is permissible to institute a general reasoning, in which the final terminus may also be included.

And more plainly in a letter from 1702:

The rules of the finite are found to succeed in the infinite.

The debate over infinitesimals was undoubtedly a good thing for mathematics, as it inspired mathematicians to work to put calculus on rigorous foundations. This was largely done in the 19th century, as the actual infinities of infinitesimals were replaced by the potential infinities of limits, and the definition of the derivative settled into the form familiar to students of calculus ever since:

\lim_{h \rightarrow 0} \frac{f(x+h) - f(x)}{h}

In calculating such a limit, one considers the behavior of f(x+h) as h becomes arbitrarily close, but not equal, to 0. Importantly, these values of h are all standard real numbers and not infinitesimal. Infinitesimals themselves came to be seen as something like a useful but foundationally suspect fiction, a heuristic that helped lead to the great achievements of calculus but which had properly been discarded upon the rigorous founding of calculus in the language of limits.
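With the same illustrative f(x) = x^2 as above, the limit computation runs along the same algebraic lines as the infinitesimal one, but its final step is justified differently (again, the example is mine):

\lim_{h \rightarrow 0} \frac{(x+h)^2 - x^2}{h} = \lim_{h \rightarrow 0} (2x + h) = 2x

Here h is never treated as an infinitely small quantity; the last equality simply records that 2x + h can be made as close to 2x as we like by taking the real number h sufficiently small (and non-zero).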

This would be a natural ending point for our story. Indeed, this is how the story went for about 100 years, and calculus as taught in most schools today is hardly changed from its 19th century formulation. However, surprises awaited in the mid-20th century, when the use of infinitesimals in calculus was finally put on a rigorous foundation and Newton, Leibniz, and other proponents were, in a sense, vindicated. We’ll have that story in our next installment.

 
