Cauchy wrote Cours d'Analyse (1821) based on his lecture course at the École Polytechnique. In it he attempted to make calculus rigorous, and to do this he felt that he had to reject the "generality of algebra" as a foundation for the subject.
Cauchy's approach to the calculus:
As for my methods, I have sought to give them all the rigour which is demanded in geometry, in such a way as never to fall back on reasons drawn from the generality of algebra. Reasons of this kind, however widely they are accepted, above all in passing from convergent to divergent series and from real to imaginary quantities, can only be considered, it seems to me, as inductions, apt enough sometimes to set forth the truth, but ill founded according to the exactitude which is required in the mathematical sciences. We must even note that they suggest that algebraic formulas have an unlimited generality, whereas in fact the majority of these formulas are valid only under certain conditions and for certain values of the quantities they contain. By determining these conditions and these values, and by fixing precisely the sense of all the notations I use, I make all uncertainty disappear.
Cauchy's definition of a limit:
When the values successively attributed to the same variable approach indefinitely a fixed value, eventually differing from it by as little as one could wish, that fixed value is called the limit of all the others.
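Cauchy phrases the limit in terms of the successive values of a variable, in modern terms a sequence. A small numerical sketch in Python (not from the original article; the sequence xn = 1/n and the helper function are illustrative choices) shows the successive values "eventually differing from [the fixed value] by as little as one could wish":

```python
# Cauchy's limit read as a statement about a sequence: the values
# x_n = 1/n approach the fixed value 0, and the difference |x_n - 0|
# eventually falls below any tolerance one could wish.

def eventually_within(seq, limit, tol, n_max=10**6):
    """Return the first index n at which |seq(n) - limit| < tol, if any."""
    for n in range(1, n_max):
        if abs(seq(n) - limit) < tol:
            return n
    return None

x = lambda n: 1 / n

for tol in (1e-1, 1e-3, 1e-6):
    n = eventually_within(x, 0, tol)
    print(f"|x_n - 0| < {tol} from n = {n} onwards")
```

However small the chosen tolerance, some index exists beyond which every value lies within it; that is the content of "approach indefinitely".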
Cauchy's definition of an infinitesimal:
When the successive absolute values of a variable decrease indefinitely in such a way as to become less than any given quantity, that variable becomes what is called an infinitesimal. Such a variable has zero for its limit.
Cauchy's definition of continuity:
Let f (x) be a function of a variable x, and let us suppose that, for every value of x between two given limits, this function always has a unique and finite value. If, beginning from one value of x lying between these limits, we assign to the variable x an infinitely small increment a, the function itself increases by the difference f (x + a) - f (x), which depends simultaneously on the new variable a and on the value of x. Given this, the function f (x) will be a continuous function of this variable within the two limits assigned to the variable x if, for every value of x between these limits, the absolute value of the difference f (x + a) - f (x) decreases indefinitely with that of a. In other words, the function f (x) will remain continuous with respect to x between the given limits if, between these limits, an infinitely small increment of the variable always produces an infinitely small increment of the function itself.
After giving this definition, Cauchy used it to prove that sin x was a continuous function.
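Cauchy's argument for sin x rests on the identity sin(x + a) - sin x = 2 sin(a/2) cos(x + a/2), which bounds the increment of the function by the increment of the variable. The inequality can be checked numerically; this Python sketch (not from the original article; the sample value x = 1.3 is arbitrary) illustrates it:

```python
import math

# Continuity of sin x in Cauchy's sense: the identity
#   sin(x + a) - sin(x) = 2 sin(a/2) cos(x + a/2)
# gives |sin(x + a) - sin(x)| <= 2|sin(a/2)| <= |a|, so the increment
# of the function decreases indefinitely with that of the variable.

x = 1.3  # any fixed value of the variable
for a in (1e-1, 1e-3, 1e-6):
    diff = abs(math.sin(x + a) - math.sin(x))
    assert diff <= abs(a)
    print(f"a = {a:g}: |sin(x + a) - sin(x)| = {diff:.3e}")
```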
Cauchy's definition of convergence and sum of a series:
Let Sn = u0 + u1 + u2 + ... + un-1 be the sum of the first n terms, n being any integer. If, for increasing values of n, the sum Sn approaches a certain limit S, the series will be called convergent and the limit in question will be called the sum of the series.
Cauchy then notes that

Sn+1 - Sn = un.

For the series to converge, Cauchy says that it is necessary, but not sufficient, that un has limit zero. He then considers the further differences

Sn+2 - Sn = un + un+1
Sn+3 - Sn = un + un+1 + un+2
...

and states:

It is necessary also, for increasing values of n, that the different sums un + un+1, un + un+1 + un+2, ..., that is, the sums of the quantities un, un+1, un+2, ..., taken, from the first, in whatever number we wish, finish by constantly having absolute values less than any assignable limit. Conversely, when these various conditions are satisfied, the convergence of the series is assured.
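This is what is now called the Cauchy criterion: every block of consecutive terms un + un+1 + ... + un+k, however long, must eventually be arbitrarily small. A Python sketch (not from the original article; the two example series are illustrative) contrasts the harmonic series, where single terms vanish but blocks do not, with the convergent series of 1/n²:

```python
def tail_block(u, n, k):
    """Sum u(n) + u(n+1) + ... + u(n+k-1): one of Cauchy's 'different sums'."""
    return sum(u(i) for i in range(n, n + k))

harmonic = lambda n: 1 / n       # terms -> 0, yet the series diverges
squares  = lambda n: 1 / n**2    # terms -> 0 and the series converges

n = 1000
# For the harmonic series the block of n consecutive terms starting at n
# stays near log 2 ~ 0.69 however large n is, so the criterion fails.
print(tail_block(harmonic, n, n))

# For 1/n^2 the same block is below 1/(n - 1) and shrinks as n grows.
print(tail_block(squares, n, n))
```

This is why un having limit zero is necessary but not sufficient: the harmonic series satisfies the first condition and fails the second.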
Cauchy's definition of the derivative:
When a function y = f (x) remains continuous between two given limits of the variable x, and when one assigns to such a variable a value enclosed between the two limits at issue, then an infinitely small increment assigned to the variable produces an infinitely small increment in the function itself. Consequently, if one puts Δx = i, the two terms of the ratio of differences [f (x + i) - f (x)] / i will be infinitesimal quantities.
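Both terms of the ratio become infinitely small, yet the ratio itself can tend to a finite limit: the derivative. A Python sketch (not from the original article; f (x) = x², where the limit is 2x, is an illustrative choice):

```python
def difference_quotient(f, x, i):
    """Cauchy's ratio of differences: [f(x + i) - f(x)] / i."""
    return (f(x + i) - f(x)) / i

f = lambda x: x * x
x = 3.0

# Both f(x + i) - f(x) and i shrink with i, but their ratio
# approaches the derivative f'(3) = 6.
for i in (1e-1, 1e-3, 1e-6):
    print(f"i = {i:g}: quotient = {difference_quotient(f, x, i):.6f}")
```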
Cauchy's version of the mean value theorem:
If f (x) is continuous between the limits x = a and x = b, and we designate by A the smallest and by B the largest value that the derived function f '(x) attains in the interval, the ratio of the finite differences

[f (b) - f (a)] / (b - a)

will necessarily be included between A and B.
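The statement can be checked numerically: sampling f '(x) on a fine grid over [a, b] approximates A and B, and the difference ratio falls between them. A Python sketch (not from the original article; f = sin on [0, π/2], where f ' = cos runs from 0 to 1, is an illustrative choice):

```python
import math

f, fprime = math.sin, math.cos
a, b = 0.0, math.pi / 2

# Cauchy's ratio of finite differences: here (1 - 0) / (pi/2) = 2/pi.
ratio = (f(b) - f(a)) / (b - a)

# Approximate A (smallest) and B (largest) value of f'(x) on [a, b]
# by sampling the derivative on a fine grid.
samples = [fprime(a + (b - a) * k / 1000) for k in range(1001)]
A, B = min(samples), max(samples)

assert A <= ratio <= B
print(f"A = {A:.4f} <= {ratio:.4f} <= B = {B:.4f}")
```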
Cauchy's approach to rigour did not save him from errors, however. He "proved", incorrectly, that the sum of a convergent series of continuous functions is itself continuous; Abel produced a counterexample.
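Abel's counterexample (1826) was the series sin x - (sin 2x)/2 + (sin 3x)/3 - ..., whose partial sums are all continuous but whose sum is x/2 on (-π, π) and 0 at x = π, so the limit function jumps at odd multiples of π. A Python sketch (not from the original article) shows the jump emerging numerically:

```python
import math

def partial_sum(x, N):
    """Partial sum of Abel's series sin x - (sin 2x)/2 + (sin 3x)/3 - ..."""
    return sum((-1) ** (n + 1) * math.sin(n * x) / n for n in range(1, N + 1))

# Every partial sum is a continuous function, but the limit is x/2 on
# (-pi, pi) and 0 at x = pi: the limit function is discontinuous there,
# contradicting Cauchy's "theorem".
for x in (3.0, math.pi):
    print(f"x = {x:.4f}: S_10000 = {partial_sum(x, 10000):.4f}, x/2 = {x / 2:.4f}")
```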