1646 – 1716
Universal genius — co-inventor of the calculus, architect of modern notation, pioneer of binary arithmetic, formal logic, and the dream of a universal calculus of reason
Leibniz made significant contributions to mathematics, philosophy, physics, engineering, linguistics, geology, law, theology, and library science. He is often described as the last person to have commanded the whole of the learning of his time.
His mathematical education was initially weak compared to Newton's. He learned much from Huygens during a visit to Paris (1672–76), where he was exposed to the latest work on infinite series and tangent problems.
Sent on a diplomatic mission, Leibniz used his time in Paris to study mathematics intensively under Huygens. By 1675 he had independently developed the calculus, with notation far superior to Newton's fluxional dots.
"Nova Methodus" (1684) presented the differential calculus; a 1686 paper presented the integral calculus. These publications established Leibniz's notation (dx, dy, ∫) as the standard.
Served the House of Hanover for 40 years as librarian, historian, and counsellor. Tasked with writing a history of the Guelph dynasty — a project he never finished, distracted by mathematics and philosophy.
Published the Théodicée (1710) and left the Monadology (1714). Developed the philosophical doctrine that we live in "the best of all possible worlds" — later satirized by Voltaire in Candide.
Leibniz's greatest mathematical legacy may be his notation — the symbols dx, dy, dy/dx, and ∫ that made calculus operational.
Leibniz's notation makes the fundamental rules of calculus appear as natural algebraic operations:
d(uv) = u dv + v du
The differential of a product, transparent and memorable.
dy/dx = (dy/du)(du/dx)
Fractions "cancel" — suggesting the correct formula by formal algebra.
∫u dv = uv - ∫v du
Follows immediately from integrating the product rule.
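The three rules above can be spot-checked numerically. A minimal sketch, using the illustrative choices u(x) = x² and v(x) = sin x (any smooth pair would do): the product rule is compared against a finite difference, and integration by parts against midpoint-rule integrals on [0, 1].

```python
import math

# Hypothetical example functions for the check: u(x) = x^2, v(x) = sin(x).
u = lambda x: x**2
v = lambda x: math.sin(x)
du = lambda x: 2 * x          # u'
dv = lambda x: math.cos(x)    # v'

h = 1e-6
x = 1.3

# Product rule: d(uv) = u dv + v du, checked via a forward difference.
lhs = (u(x + h) * v(x + h) - u(x) * v(x)) / h
rhs = u(x) * dv(x) + v(x) * du(x)
assert abs(lhs - rhs) < 1e-4

def integrate(f, a, b, n=10_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    w = (b - a) / n
    return sum(f(a + (i + 0.5) * w) for i in range(n)) * w

# Integration by parts: ∫u dv = uv - ∫v du, checked on [0, 1].
a, b = 0.0, 1.0
left = integrate(lambda t: u(t) * dv(t), a, b)
right = u(b) * v(b) - u(a) * v(a) - integrate(lambda t: v(t) * du(t), a, b)
assert abs(left - right) < 1e-6
print("product rule and integration by parts check out")
```

The point of the exercise is Leibniz's: once the rules are written in his notation, applying them is mechanical.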
In 1679, Leibniz developed the binary (base-2) number system, showing that all arithmetic could be performed using only 0 and 1.
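Leibniz's observation can be sketched in a few lines: addition of numbers written with only the digits 0 and 1 reduces to a carry rule. A minimal illustration (bit lists stored least-significant digit first; the helper names are made up for this sketch):

```python
def add_binary(a, b):
    """Add two little-endian bit lists using only 0, 1, and a carry."""
    out, carry = [], 0
    for i in range(max(len(a), len(b))):
        x = a[i] if i < len(a) else 0
        y = b[i] if i < len(b) else 0
        s = x + y + carry
        out.append(s % 2)   # the digit written down
        carry = s // 2      # the digit carried
    if carry:
        out.append(carry)
    return out

def to_bits(n):
    """Little-endian bit list of a non-negative integer."""
    return [int(d) for d in bin(n)[2:]][::-1] if n else [0]

def from_bits(bits):
    return sum(b << i for i, b in enumerate(bits))

# 13 + 19 = 32, computed entirely in 0s and 1s.
assert from_bits(add_binary(to_bits(13), to_bits(19))) == 32
```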
Leibniz designed the Stepped Reckoner, a mechanical calculator that could add, subtract, multiply, and divide; he demonstrated an early model to the Royal Society in 1673 and completed a working machine in 1694.
Leibniz envisioned a "calculus of reasoning" (calculus ratiocinator) combined with a universal symbolic language (characteristica universalis) that would reduce all human disputes to calculation: "Let us calculate, gentlemen!"
Binary arithmetic + mechanical calculation + formal logic + universal notation = the conceptual foundation of modern computing. Leibniz saw 300 years into the future, even if the technology of his time couldn't realize the vision.
Leibniz discovered determinants while studying systems of linear equations, more than half a century before Cramer's 1750 treatise. He showed that a system of three linear equations in two unknowns has a solution only if a certain expression (the determinant) vanishes.
Attempted to reduce logical reasoning to symbolic manipulation. His work on syllogisms, combinations, and the algebra of concepts anticipated Boole's mathematical logic by 150 years.
π/4 = 1 - 1/3 + 1/5 - 1/7 + ... Discovered independently by James Gregory and Leibniz. This beautiful infinite series connects π to odd numbers, though it converges extremely slowly.
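The slow convergence is easy to see by computing partial sums; after n terms the error is roughly 1/(2n), so each extra decimal digit of π costs about ten times as many terms. A minimal sketch:

```python
import math

def leibniz_pi(n):
    """Approximate pi with n terms of 4*(1 - 1/3 + 1/5 - 1/7 + ...)."""
    return 4 * sum((-1) ** k / (2 * k + 1) for k in range(n))

# Error shrinks roughly like 1/(2n): painfully slow.
for n in (10, 1_000, 100_000):
    print(n, leibniz_pi(n), abs(math.pi - leibniz_pi(n)))

assert abs(math.pi - leibniz_pi(100_000)) < 1e-4
```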
Leibniz coined the term "analysis situs" for what we now call topology — the study of properties preserved under continuous deformation. He envisioned this as a qualitative geometry, though he didn't develop it far.
Create notation that captures the essence of the concept
Reduce reasoning to rule-governed manipulation of symbols
Let the notation do the thinking; follow the rules mechanically
Read the result back as a meaningful statement about the world
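The four steps above can be sketched as a toy symbolic differentiator: expressions are nested tuples, and differentiation is pure rule-following on symbols, with no understanding required. (The representation and function name here are invented for this illustration.)

```python
def d(expr, var):
    """Differentiate a tuple expression ('op', left, right) by rule-following."""
    if expr == var:                     # d(x)/dx = 1
        return 1
    if isinstance(expr, (int, float, str)):
        return 0                        # constants and other variables
    op, u, v = expr
    if op == '+':                       # sum rule
        return ('+', d(u, var), d(v, var))
    if op == '*':                       # Leibniz's product rule
        return ('+', ('*', u, d(v, var)), ('*', v, d(u, var)))
    raise ValueError(f"no rule for {op}")

# d(x*x)/dx, a purely mechanical rewrite:
print(d(('*', 'x', 'x'), 'x'))  # → ('+', ('*', 'x', 1), ('*', 'x', 1))
```

Reading the result back, ('+', ('*', 'x', 1), ('*', 'x', 1)) is x·1 + x·1 = 2x, exactly as the fourth step prescribes.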
"In symbols one observes an advantage in discovery which is greatest when they express the exact nature of a thing briefly... then indeed the labor of thought is wonderfully diminished." — Leibniz understood that good notation doesn't just record mathematics; it enables it.
Leibniz believed nothing happens without a reason. In mathematics, this led him to seek the most general, most elegant formulation of every result. His notation reflects this: dy/dx is more general than Newton's ẏ, which always means a rate of change with respect to time, because it names the variable of differentiation explicitly and so works for any pair of variables.
The calculus priority dispute dominated Leibniz's final years and overshadowed his immense contributions.
Leibniz died on November 14, 1716, in Hanover. His employer, Elector George (by then King George I of Britain), did not attend the funeral. Neither did any member of the Royal Society. Only his secretary was present. His grave went unmarked for 50 years.
"It is unworthy of excellent men to lose hours like slaves in the labour of calculation which could safely be relegated to anyone else if machines were used."
— Leibniz, on the Stepped Reckoner, 1685

Every calculus textbook in the world uses Leibniz's notation: dy/dx, ∫, d²y/dx². Newton's fluxions are a historical curiosity. Leibniz's notation remains the language of analysis.
Binary arithmetic, formal logic, and the vision of mechanical reasoning are the conceptual DNA of computer science. Leibniz anticipated Boole, Turing, and von Neumann.
His discovery of determinants and work on systems of equations anticipated linear algebra, a field that didn't fully develop until the 19th century.
His characteristica universalis inspired Frege, Russell, and Gödel. The dream of formalizing all reasoning continues in automated theorem proving and AI.
Every digital device uses Leibniz's binary system. Every bit is a Leibnizian 0 or 1.
Leibniz's notation makes differential equations writable, readable, and solvable. Every physics model uses them.
Automated reasoning systems descend from Leibniz's dream of mechanizing thought via symbolic calculus.
Boolean logic (Leibniz's symbolic successor) underpins all modern encryption algorithms.
Shannon's information theory measures information in bits — the base-2 units Leibniz introduced.
Relational databases and SQL are grounded in formal logic whose lineage traces to Leibniz's combinatorial calculus.
Maria Rosa Antognazza, Leibniz: An Intellectual Biography (2009). The definitive modern biography, covering all aspects of Leibniz's extraordinary intellectual life.
William Dunham, The Calculus Gallery: Masterpieces from Newton to Lebesgue (2005). Leibniz's contributions beautifully contextualized.
Joseph Hofmann, Leibniz in Paris 1672–1676 (1974). Detailed study of the crucial years when Leibniz developed the calculus.
J.M. Child, trans., The Early Mathematical Manuscripts of Leibniz (1920). Primary sources showing Leibniz's calculus taking shape in real time.
G.W. Leibniz, various translations. Essential philosophical texts that reveal the worldview driving his mathematics.
Steven Nadler, The Best of All Possible Worlds (2008). The mathematics and philosophy of Leibniz and his contemporaries, accessibly presented.
"It is unworthy of excellent men to lose hours like slaves in the labour of calculation which could safely be relegated to anyone else if machines were used."
— Gottfried Wilhelm Leibniz, 1685Gottfried Wilhelm Leibniz · 1646–1716 · The Notation of Calculus