"The main duty of the historian of mathematics, as well as his fondest privilege, is to explain the humanity of mathematics, to illustrate its greatness, beauty, and dignity, and to describe how the incessant efforts and accumulated genius of many generations have built up that magnificent monument, the object of our most legitimate pride as men, and of our wonder, humility and thankfulness, as individuals. The study of the history of mathematics will not make better mathematicians but gentler ones, it will enrich their minds, mellow their hearts, and bring out their finer qualities."
-- George Sarton, The Study of the History of Mathematics
[Portraits: Sir Isaac Newton and Gottfried Wilhelm von Leibniz]
It is interesting to note that Leibniz was very conscious of the importance of good notation and put a lot of thought into the symbols he used. Newton, on the other hand, wrote more for himself than for anyone else, and consequently tended to use whatever notation occurred to him on a given day. This difference turned out to be important in later developments: Leibniz's notation was better suited to generalizing calculus to several variables, and it also highlighted the operator aspect of the derivative and the integral. As a result, much of the notation used in Calculus today is due to Leibniz.
The development of Calculus can roughly be described along a timeline with three periods: Anticipation, Development, and Rigorization. In the Anticipation stage, mathematicians were using techniques that involved infinite processes to find areas under curves or to maximize certain quantities. In the Development stage, Newton and Leibniz created the foundations of Calculus and brought all of these techniques together under the umbrella of the derivative and the integral. However, their methods were not always logically sound, and it took mathematicians a long time during the Rigorization stage to justify them and put Calculus on a sound mathematical foundation.
In their development of the calculus, both Newton and Leibniz used "infinitesimals": quantities that are infinitely small and yet nonzero. Of course, such infinitesimals do not really exist, but Newton and Leibniz found it convenient to use them in their computations and derivations of results. Although one could not argue with the success of calculus, the concept of infinitesimals bothered mathematicians. Bishop George Berkeley made serious criticisms of the calculus, referring to infinitesimals as "the ghosts of departed quantities".
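To see what troubled Berkeley, here is a standard infinitesimal-style computation of the derivative of y = x^2 (an illustrative example in the spirit of the period, not a quotation from Newton or Leibniz):

```latex
\begin{align*}
dy &= (x + dx)^2 - x^2 \\
   &= 2x\,dx + (dx)^2 \\
   &= 2x\,dx \quad \text{(discarding the ``negligible'' term } (dx)^2\text{)} \\
\frac{dy}{dx} &= 2x.
\end{align*}
```

The difficulty is that dividing by dx requires dx to be nonzero, while discarding (dx)^2 treats dx as if it were zero: the same quantity is used as nonzero at one step and as zero at the next.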
Berkeley's criticisms were well founded and important in that they focused the attention of mathematicians on a logical clarification of the calculus. It was over 100 years, however, before Calculus was made rigorous. Ultimately, Cauchy, Weierstrass, and Riemann reformulated Calculus in terms of limits rather than infinitesimals. Thus the need for these infinitely small (and nonexistent) quantities was removed and replaced by a notion of quantities being "close" to others. The derivative and the integral were both reformulated in terms of limits. While it may seem like a lot of work to create rigorous justifications of computations that seemed to work fine in the first place, this was an important development. By putting Calculus on a logical footing, mathematicians were better able to understand and extend its results, as well as to come to terms with some of the more subtle aspects of the theory.
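In the limit formulation, the same computation goes through without infinitesimals. The modern definition of the derivative, applied to f(x) = x^2, reads:

```latex
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
      = \lim_{h \to 0} \frac{2xh + h^2}{h}
      = \lim_{h \to 0} (2x + h)
      = 2x.
```

Here h is always an ordinary nonzero number; nothing is ever treated as both zero and nonzero. The integral was similarly recast as a limit of finite sums (Riemann sums) rather than a sum of infinitely many infinitely thin slices.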
When we first study Calculus we often learn its concepts in an order that is roughly the reverse of their historical development. We wish to take advantage of the hundreds of years of thought that have gone into the subject. As a result, we often begin by learning about limits. We then define the derivative and the integral developed by Newton and Leibniz, but, unlike Newton and Leibniz, we define them in the modern way, in terms of limits. Finally, we see how the derivative and the integral can be used to solve many of the problems that precipitated the development of Calculus.
To learn more about the historical development of Calculus, check out these sites: