1685 – 1731 · The Calculus of Approximation
The mathematician who taught us to approximate the infinite with polynomials, transforming analysis and laying foundations for modern computation.
Brook Taylor was born on 18 August 1685 in Edmonton, Middlesex, into a prosperous and well-connected English family. His father, John Taylor, was a man of means who ensured his son received an excellent education.
From an early age, Brook showed prodigious talent in both mathematics and music. He entered St John's College, Cambridge in 1701, at age 16, receiving his LL.B. in 1709 and LL.D. in 1714.
At Cambridge, Taylor studied under the influence of the Newtonian school, absorbing the methods of fluxions that would shape his entire mathematical career.
Son of John Taylor of Bifrons House, Kent. The family had connections to the minor nobility and valued education.
Taylor was an accomplished musician who would later apply mathematics to the vibrating string problem, linking his two passions.
Studied alongside figures who would become key players in British mathematics during the Newton-Leibniz controversy.
Taylor's mathematical career was remarkably concentrated. Elected a Fellow of the Royal Society in 1712 at just 27, he quickly rose to prominence in the English mathematical establishment.
He served as Secretary of the Royal Society from 1714 to 1718, during a period when the society was deeply embroiled in the priority dispute between Newton and Leibniz. Taylor was firmly in Newton's camp.
His masterwork, Methodus Incrementorum Directa et Inversa (1715), contained both the Taylor series theorem and pioneering work on finite differences, all published in a single landmark year.
Personal tragedies — the deaths of both his first and second wives in childbirth — and declining health curtailed his mathematical output. He died on 29 December 1731, aged just 46.
FRS (1712), Secretary of the Royal Society (1714–1718)
Methodus Incrementorum (1715), Linear Perspective (1715), New Principles (1719)
Much of his work was not fully appreciated until Euler and later Lagrange extended and formalized his results.
Taylor worked during the golden age of British Newtonian mathematics, but also during the bitter calculus priority war.
Newton was still alive (d. 1727) and dominated British mathematics. Taylor's work used Newton's fluxional notation and methods exclusively.
The Royal Society's 1712 Commercium Epistolicum sided with Newton over Leibniz. Taylor, as a Royal Society insider, was deeply involved.
Johann Bernoulli and Leibniz were developing calculus in parallel. The ideological split between British and Continental methods would last a century.
The early 18th century saw explosive growth in infinite series methods. Newton, Gregory, and Mercator had all used series expansions before Taylor.
The reign of Queen Anne and George I — a period of political stability, scientific patronage, and the flowering of the Royal Society.
The early 1700s saw growing interest in the mathematical principles underlying art, architecture, and perspective — a topic Taylor would pioneer.
Taylor's theorem states that any sufficiently smooth function can be represented as an infinite sum of terms calculated from its derivatives at a single point:
f(x) = f(a) + f'(a)(x−a) + f''(a)(x−a)²/2! + f'''(a)(x−a)³/3! + …
Or in compact sigma notation:
f(x) = Σ f⁽ⁿ⁾(a)/n! · (x − a)ⁿ, n = 0 to ∞
Published in Methodus Incrementorum (1715), this result unified many earlier special-case series expansions known to Newton, Gregory, and Mercator into a single, general theorem.
Taylor polynomial approximations of sin(x): as degree increases, the approximation converges to the true curve over a wider interval.
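This convergence is easy to see numerically. A minimal Python sketch (the helper name `sin_taylor` and the sample point x = 1 are our illustrative choices, not from the text):

```python
import math

def sin_taylor(x, degree):
    """Taylor polynomial of sin about a = 0, keeping terms up to `degree`."""
    total = 0.0
    for n in range(degree + 1):
        if n % 2 == 1:  # only odd powers appear in sin's expansion
            total += (-1) ** (n // 2) * x ** n / math.factorial(n)
    return total

# Error against the true value shrinks rapidly as the degree grows
x = 1.0
for d in (1, 3, 5, 7):
    print(d, abs(sin_taylor(x, d) - math.sin(x)))
```

Each extra pair of terms widens the interval over which the polynomial tracks the true curve.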
Taylor's theorem transforms local information (derivatives at a single point) into global knowledge (the function's behavior everywhere within the radius of convergence).
The remainder term Rₙ(x) = f⁽ⁿ⁺¹⁾(c)/(n+1)! · (x−a)ⁿ⁺¹, for some c between a and x, quantifies the error of truncation — crucial for numerical analysis.
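The bound is simple to verify for sin, whose derivatives are all bounded by 1 in magnitude. A hedged Python sketch (helper names are ours):

```python
import math

def taylor_sin(x, n):
    """Degree-n Taylor polynomial of sin about a = 0."""
    return sum((-1) ** (k // 2) * x ** k / math.factorial(k)
               for k in range(n + 1) if k % 2 == 1)

x, n = 2.0, 5
actual_error = abs(math.sin(x) - taylor_sin(x, n))
# Lagrange bound: |f^(n+1)(c)| <= 1 for sin, so |R_n| <= |x|^(n+1)/(n+1)!
bound = abs(x) ** (n + 1) / math.factorial(n + 1)
print(actual_error, "<=", bound)
```

The actual truncation error always sits below the Lagrange bound, which is what makes the remainder term usable in practice.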
Before Taylor, individual series expansions were known (Newton's binomial series, Mercator's log series, Gregory's arctangent series), but no one had stated the general principle connecting derivatives to power series coefficients.
The series converges within a radius R determined by the nearest singularity in the complex plane. Outside this radius, the series diverges.
If a function has a power series representation, it must be the Taylor series. This uniqueness is fundamental to complex analysis.
Functions equal to their Taylor series are called analytic. Not all smooth functions are analytic — the function e^(−1/x²) is smooth everywhere but not analytic at x=0.
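A brief Python illustration of this classic counterexample (the helper name `f` is our own):

```python
import math

def f(x):
    """Smooth everywhere, but its Taylor series at 0 is identically zero."""
    return math.exp(-1.0 / x ** 2) if x != 0 else 0.0

# Every derivative of f at 0 vanishes, so the Taylor series at 0 is the
# zero function; yet f itself is strictly positive away from 0:
print(f(0.5))   # ~0.018, clearly nonzero
print(f(0.1))   # ~3.7e-44, tiny but still nonzero
```

No truncated Taylor polynomial at 0 can approximate this function anywhere except at the single point x = 0.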
The very title of Taylor's masterwork — Methodus Incrementorum Directa et Inversa — translates as "Direct and Inverse Methods of Increments," referring to finite differences.
The forward difference operator Δf(x) = f(x+h) − f(x) is the discrete analogue of the derivative. Taylor showed how to build a complete parallel calculus using this operator.
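The analogy can be sketched in a few lines of Python: dividing Δf(x) by the step h recovers the derivative as h shrinks (names here are our own):

```python
def forward_diff(f, x, h):
    """Discrete analogue of the derivative: Δf(x)/h -> f'(x) as h -> 0."""
    return (f(x + h) - f(x)) / h

g = lambda x: x ** 2                   # g'(x) = 2x
print(forward_diff(g, 3.0, 0.1))       # ≈ 6.1 (exact derivative: 6)
print(forward_diff(g, 3.0, 0.001))     # ≈ 6.001; error shrinks like h
```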
He derived the Taylor/Gregory-Newton interpolation formula:
f(a+nh) = Σ C(n,k) · Δᵏf(a), k = 0 to n
This is the finite-difference analogue of the Taylor series, replacing derivatives with difference operators and factorials with binomial coefficients.
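A small Python check of the formula on f(x) = x², sampled at unit steps (helper names are ours):

```python
from math import comb

def forward_differences(values):
    """Column of k-th forward differences Δ^k f(a) from equally spaced samples."""
    diffs = [values[0]]
    row = list(values)
    while len(row) > 1:
        row = [b - a for a, b in zip(row, row[1:])]
        diffs.append(row[0])
    return diffs

# Sample f(x) = x^2 at a = 0 with step h = 1
values = [x ** 2 for x in range(5)]     # [0, 1, 4, 9, 16]
d = forward_differences(values)         # [0, 1, 2, 0, 0]

def gregory_newton(n):
    """f(a + n*h) = sum over k of C(n, k) * Δ^k f(a)."""
    return sum(comb(n, k) * d[k] for k in range(len(d)))

print([gregory_newton(n) for n in range(5)])  # [0, 1, 4, 9, 16]
```

The formula reproduces the sampled values exactly, with binomial coefficients playing the role that 1/n! plays in the continuous series.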
Finite difference methods are the backbone of numerical computing: from solving differential equations to computer graphics to financial modeling.
Taylor organized finite differences into a systematic table structure, revealing hidden patterns in discrete data.
Δ⁰f = f(x)
Δ¹f = f(x+h) − f(x)
Δ²f = Δf(x+h) − Δf(x)
Each level captures a higher-order discrete rate of change, analogous to successive derivatives.
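Such a table takes only a few lines of Python to build (an illustrative sketch; the function name is ours):

```python
def difference_table(values):
    """Successive rows of forward differences, as Taylor tabulated them."""
    table = [list(values)]
    while len(table[-1]) > 1:
        row = table[-1]
        table.append([b - a for a, b in zip(row, row[1:])])
    return table

# Cubic data: third differences are constant, fourth differences vanish
for row in difference_table([x ** 3 for x in range(6)]):
    print(row)
```

For data sampled from a degree-n polynomial, the n-th differences are constant and all higher differences are zero, the discrete counterpart of a vanishing (n+1)-th derivative.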
Taylor recognized that the shift operator E (where Ef(x) = f(x+h)) relates to Δ by E = 1 + Δ. This mirrors the relation e^(hD) = 1 + hD + h²D²/2! + … for derivatives.
Just as integration is the inverse of differentiation, Taylor developed the inverse method of increments — what we now call indefinite summation, the discrete analogue of the antiderivative.
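A minimal Python sketch of indefinite summation as a running total, showing that applying Δ recovers the original sequence (names and example are ours):

```python
from itertools import accumulate

def indefinite_sum(f, upto):
    """Discrete antiderivative: F(n) = sum of f(k) for k < n, so ΔF(n) = f(n)."""
    return [0] + list(accumulate(f(k) for k in range(upto)))

g = lambda k: 2 * k + 1            # Δ(n²) = (n+1)² − n² = 2n + 1
F = indefinite_sum(g, 6)
print(F)                           # [0, 1, 4, 9, 16, 25, 36], i.e. n²
print([F[n + 1] - F[n] for n in range(6)])  # recovers g: [1, 3, 5, 7, 9, 11]
```

Just as ∫2x dx = x², summing 2k + 1 yields n², the discrete counterpart of the same antiderivative.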
Taylor applied his methods to the vibrating string problem, deriving the fundamental frequency formula f = (1/2L)√(T/μ), one of the first correct dynamical solutions in mathematical physics.
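Taylor's frequency formula is straightforward to evaluate; the sketch below uses hypothetical string parameters of our own choosing, not figures from Taylor:

```python
import math

def fundamental_frequency(L, T, mu):
    """Taylor's formula f = (1/2L) * sqrt(T/mu) for a stretched string.
    L: length (m), T: tension (N), mu: linear mass density (kg/m)."""
    return math.sqrt(T / mu) / (2 * L)

# Illustrative values: a 0.65 m string under 60 N tension,
# with linear density 0.001 kg/m
print(round(fundamental_frequency(0.65, 60.0, 1e-3), 1), "Hz")
```

Doubling the tension raises the pitch by a factor of √2; halving the length doubles it, exactly as string players observe.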
Also in 1715, Taylor published Linear Perspective: or, a New Method of Representing Justly All Manner of Objects as They Appear to the Eye, the first rigorous mathematical treatment of perspective drawing.
Previous treatments by artists (Alberti, Dürer) were practical but lacked mathematical precision. Taylor supplied that precision, most notably through a general theory of vanishing points.
The work was so abstract and concise that it was poorly received at first. Taylor published a revised, more accessible edition, New Principles of Linear Perspective, in 1719.
Taylor's work influenced 18th-century treatises on perspective by Joshua Kirby and others. His vanishing point theory became standard in both art and engineering drawing.
Taylor's perspective theory anticipated concepts later formalized by Poncelet and other projective geometers in the 19th century.
Because the 1715 book was so terse, many of Taylor's ideas were rediscovered independently by later mathematicians, without credit to him.
Taylor's approach exemplified the Newtonian style: moving fluidly between continuous and discrete, between geometry and algebra.
Start with discrete finite differences Δx, Δy
Let increments become infinitely small — Newton's fluxional calculus
Expand in power series to obtain general formulas
Apply to physics, geometry, or numerical problems
Taylor wrote in an extremely compressed style, packing major results into brief passages. This made his work hard to read but intellectually dense. The Methodus Incrementorum contains results that later mathematicians expanded into entire treatises.
Taylor used Newton's dot notation for fluxions exclusively, contributing to the isolation of British mathematics from the more powerful Leibnizian notation used on the Continent. This methodological divide hampered British mathematics for a century.
Johann Bernoulli accused Taylor of plagiarizing his work on the oscillating string and the isoperimetric problem. Their dispute, conducted through published letters and papers, became one of the bitterest in 18th-century mathematics. Bernoulli claimed priority for results in the Methodus Incrementorum.
Taylor's theorem was, in special cases, known before Taylor. James Gregory (1668), Newton (1691), and Leibniz all used specific series expansions. Taylor's contribution was the general statement, but some historians question how much was truly new.
As Secretary of the Royal Society, Taylor was involved in the institutional side of the priority dispute. He served on committees investigating Leibniz's claims and was seen by Continental mathematicians as a partisan agent of Newton.
Taylor's original proof of his theorem lacked the rigor demanded by later standards. He assumed that every function could be expanded in a power series without proving convergence — a gap that would not be fully addressed until Cauchy in the 1820s.
Taylor's theorem became one of the most important results in all of mathematics. Every student of calculus encounters it; every numerical method relies on it.
Taylor series, Taylor expansion, Taylor polynomial, Taylor's theorem, Taylor remainder — his name is among the most frequently cited in mathematics.
Dying at 46, Taylor produced a remarkably compact body of work. Yet the density and originality of his contributions earned him a permanent place in the mathematical canon.
Euler took Taylor's series and ran with it, using it as the backbone of his analytic approach to mathematics. Without Taylor's general theorem, Euler's Introductio in analysin infinitorum would not have been possible.
Gradient descent uses first-order Taylor approximations. Second-order methods (Newton's method) use the quadratic Taylor approximation. Backpropagation is essentially Taylor expansion applied to computational graphs.
Small-angle approximation (sin θ ≈ θ), perturbation theory in quantum mechanics, post-Newtonian expansions in general relativity — all are Taylor expansions at their core.
Duration and convexity in bond pricing are first and second Taylor coefficients. The Greeks in options pricing (Δ, Γ, Θ) are Taylor expansion terms of the Black-Scholes formula.
Polynomial approximations of trigonometric and exponential functions in shader code use truncated Taylor series for fast evaluation on GPUs.
Linearization of nonlinear systems around operating points is a first-order Taylor expansion. Extended Kalman filters use Taylor expansion for state estimation.
Runge-Kutta methods, the Euler method, and all classical ODE solvers are derived from Taylor expansion of the solution. Higher-order methods match more Taylor terms.
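As an instance of that last point, here is a minimal Python sketch of the Euler method for y′ = y, whose single step y(t+h) ≈ y(t) + h·f(t, y) is exactly the first-order Taylor truncation (the example problem is our illustrative choice):

```python
import math

def euler(f, y0, t0, t1, steps):
    """Euler's method: each step keeps only the first-order Taylor term."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)
        t += h
    return y

# y' = y, y(0) = 1 has exact solution e^t; compare at t = 1
for steps in (10, 100, 1000):
    approx = euler(lambda t, y: y, 1.0, 0.0, 1.0, steps)
    print(steps, abs(approx - math.e))
```

The global error shrinks roughly in proportion to h, the signature of a method that matches only the first Taylor term; Runge-Kutta schemes gain their higher order by matching more terms per step.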
Methodus Incrementorum Directa et Inversa (1715) — Taylor's masterwork containing the theorem, finite differences, and the vibrating string problem.
Linear Perspective (1715) and New Principles of Linear Perspective (1719) — his treatises on projective geometry and art.
L. Feigenbaum, "Brook Taylor and the Method of Increments," Archive for History of Exact Sciences (1985) — the definitive modern study of Taylor's mathematics.
N. Guicciardini, The Development of Newtonian Calculus in Britain, 1700–1800 (1989) — essential context for Taylor's milieu.
W. Dunham, The Calculus Gallery (2005) — accessible chapter on Taylor's contributions to analysis.
V. Katz, A History of Mathematics (3rd ed., 2009) — excellent coverage of Taylor within 18th-century analysis.
K. Andersen, The Geometry of an Art (2007) — comprehensive study of the mathematics of perspective, with substantial coverage of Taylor's pivotal role.
"The method of series is of the greatest use in all parts of the Mathematics, both pure and mixed; it being the most universal, and indeed almost the only method which, in the present state of science, can make us masters of the more difficult problems."
— Brook Taylor, preface to Methodus Incrementorum Directa et Inversa, 1715