The British were deep in the throes of the Battle of the Atlantic. In six short months of 1940, German U-boats had sunk three million tons of Allied shipping. The U.S. Navy joined the quickly growing British forces in the region to help push back the German attack on convoys of Allied supplies and imported goods. The Germans had one goal: keep Britain isolated and vulnerable to attack.

But the Germans were also too confident in the system they used to send encrypted messages from land to their U-boats in the Atlantic. See, they were depending on a remarkable, typewriter-like encryption machine called Enigma. Messages could be scrambled using a collection of wheels that offered billions of combinations. The complexity — and simplicity — of the Enigma machine was both its greatest strength and its greatest weakness.

By 1939, the Poles had managed to get their hands on an Enigma machine, and just before the country was invaded, the Polish intelligence organization sent the machine to the British. A code-breaking headquarters was set up at Bletchley Park, an estate in Buckinghamshire.

And this is where math — and Alan Turing — comes into the story.

Born in London in 1912, Turing showed great aptitude for mathematics at a young age — but like many of the great mathematicians before him, he was much more interested in following his own instincts and interests. As a result, his performance in school was checkered. In 1931, he enrolled in King’s College, Cambridge, to study mathematics, and after graduating in 1935, he became a fellow of the college.

Turing was fascinated by a variety of mathematical concepts, including logic and probability theory. He independently discovered the Central Limit Theorem, which explains why many distributions are close to the normal distribution (or bell curve). (Trust me, this is a really big deal.) He also began experimenting with algorithms, designing the Turing machine. This led him to Princeton, where he studied with Alonzo Church, before returning to England in 1938.
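The Central Limit Theorem is easy to see in action. Here's a minimal Python sketch (the die-rolling setup and sample sizes are my own illustration, nothing from Turing's work): average enough independent rolls of a fair die, and the averages pile up around the true mean of 3.5, even though a single roll is uniformly distributed.

```python
import random

# By the Central Limit Theorem, averages of many independent samples
# cluster in a bell shape around the true mean, no matter what the
# underlying distribution looks like. A fair die has mean 3.5.
random.seed(0)

def sample_mean(n_rolls):
    return sum(random.randint(1, 6) for _ in range(n_rolls)) / n_rolls

means = [sample_mean(100) for _ in range(5_000)]
grand_mean = sum(means) / len(means)
spread = (sum((m - grand_mean) ** 2 for m in means) / len(means)) ** 0.5
print(round(grand_mean, 2), round(spread, 3))
```

The sample means land near 3.5, with a spread roughly one tenth the spread of a single roll, which is the theorem's other prediction: averaging n samples shrinks the spread by a factor of about the square root of n.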

At first, Turing considered his “machine” to be an abstract concept — in those days, a “computer” was a person doing a computation. But over time, he began considering the possibility that an actual machine could be built that would follow algorithms to solve problems. Once back in England, he began developing this invention.

But in 1939, war was declared, and Turing was asked to be a part of the Bletchley Park team. Using the Enigma machine provided by the Poles, he and mathematician Gordon Welchman developed the “bombe,” the British WWII code-breaking machine, which collected the top-secret intelligence the team called ULTRA. By the end of the war, Turing and his colleagues had developed 49 such bombes, which were instrumental in decoding German Navy U-boat messages during the long Battle of the Atlantic.

While Turing’s inventions did not end World War II, historians estimate that his contributions shortened it by several years and helped save thousands of lives.

This work propelled Turing into the burgeoning field of computer science. Employed by the National Physical Laboratory, he set his mind to developing the first digital computer, but his colleagues dismissed his ideas. In 1949, he joined Manchester University, where he laid the groundwork for the field of artificial intelligence.

But something was simmering under the surface: Turing’s sexuality. He didn’t particularly hide his attraction to men, and in 1952, he was arrested and convicted of “gross indecency,” the charge then applied to homosexuality. His choice was to go to prison or accept chemical castration, a hormone treatment designed to reduce the libido and thus sexual activity. He chose the latter. He had continued to work in secret for the Government Communications Headquarters (GCHQ, the British intelligence agency), but after his conviction, because he was an out, gay man, his security clearance was revoked. Still, Turing went back to his research in computers and in applying mathematics to biology and medicine.

In the summer of 1954, his housekeeper found Turing dead in his bedroom, a half-eaten apple near his body. The coroner found that he had died of cyanide poisoning, and the subsequent inquest ruled his death a suicide. However, his mother asserted that his death was accidental, a result of cyanide residue on his fingers.

In 2009, the British government issued a posthumous apology to Turing for his arrest, conviction and chemical castration. Prime Minister Gordon Brown called his treatment “appalling”:

While Turing was dealt with under the law of the time and we can’t put the clock back, his treatment was of course utterly unfair and I am pleased to have the chance to say how deeply sorry I and we all are for what happened to him … So on behalf of the British government, and all those who live freely thanks to Alan’s work I am very proud to say: we’re sorry, you deserved so much better.

This year, marking the 100-year anniversary of his birth, much of the math and science community around the world has celebrated Alan Turing Year, designed to elevate Turing’s contributions to the fields. (And in fact, Google introduced one of the most challenging of its Doodles on Turing’s 100th birthday. Check it out!)

What did you already know about Alan Turing? And what more could he have accomplished had his life not been cut so short? Share your reactions in the comments section.

I have had a wonderful, wonderful time exploring these stories of math history this month. Let’s do it again sometime! If you’d like to learn something more about math history, drop me a line.

Ada Lovelace was probably bound for greatness. The product of the brief marriage between Lord Byron (yes, that Lord Byron) and Anne Isabella (“Annabella”) Milbanke, she was born in 1815. But in true Romantic tragedy, her parents separated soon after her birth, and she never knew her father. Her mother, whom Lord Byron called “the Princess of Parallelograms,” was pretty quick with the calculations, and so Lovelace got a good education in math and science. This approach also served to protect Lovelace from the fiery passions of poetry (according to her prudish mother).

Seems Ada got the best of both parents. At age 13, she developed a design for a flying machine — quite a feat in 1828, a full 75 years before and an ocean away from the Wright Brothers at Kitty Hawk. But over time, her approach to mathematics was decidedly verbal. She called herself the poetic scientist, and her writings were imaginative and rich in metaphor.

When she was 17 years old, Lovelace met Mary Somerville, the self-taught mathematician and scientist. The two became fast friends, attending lectures, demonstrations and concerts together. And it was Somerville who introduced Lovelace to the man who would help cement her name in history.

Charles Babbage was the inventor of the Difference Engine, a rudimentary calculator that wasn’t fully built until more than 100 years after his death. He and Lovelace met in 1834, when he was working out the design of his next invention, the Analytical Engine.

Unlike his Difference Engine, this new design was programmable, an idea that completely enthralled Lovelace. She and Babbage became good friends and colleagues, and in 1843, Babbage asked Lovelace to translate into English a French summary of a presentation he had given describing the Analytical Engine. And by the way, could she also expand upon the ideas, since she was so familiar with the design?

What Lovelace wrote was nothing less than prescient:

The distinctive characteristic of the Analytical Engine, and that which has rendered it possible to endow mechanism with such extensive faculties as bid fair to make this engine the executive right-hand of abstract algebra, is the introduction into it of the principle which Jacquard devised for regulating, by means of punched cards, the most complicated patterns in the fabrication of brocaded stuffs. It is in this that the distinction between the two engines lies. Nothing of the sort exists in the Difference Engine. We may say most aptly that the Analytical Engine weaves algebraical patterns just as the Jacquard loom weaves flowers and leaves.

(I said she wrote in metaphors!)

Again, Babbage’s machine was not built in his lifetime, but the design — featuring the punched cards later used by early mechanical computers — is still acknowledged as the precursor to the modern-day computer. And Lovelace is considered the first computer programmer because of what she suggested the machine could do: compute the Bernoulli numbers.

What the heck are they? Well, first off, Bernoulli numbers are a pretty big deal in number theory and analysis. Basically, they’re a sequence (or list) of rational numbers (or decimals that either repeat or terminate). These numbers show up in a variety of places that won’t matter to you. The important thing here is that they are darned difficult to compute. In the 19th century, folks who needed them typically depended on tables that listed these numbers. But Lovelace developed a program that would generate them automatically.
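To give a feel for what Lovelace's program had to do, here is a short modern sketch in Python (not her actual program, which was written for the never-built Analytical Engine), using a standard recurrence for the Bernoulli numbers:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    # Standard recurrence: B_0 = 1 and, for m >= 1,
    # B_m = -1/(m+1) * sum over k < m of C(m+1, k) * B_k
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / Fraction(m + 1))
    return B

for i, b in enumerate(bernoulli(8)):
    print(f"B_{i} = {b}")
```

The odd-indexed values beyond B_1 all come out zero, while the even ones (1/6, -1/30, 1/42, ...) are exactly the numbers 19th-century tables listed. (This sketch uses the modern B_1 = -1/2 convention; other conventions differ in sign.)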

Thus, the first computer program was born.

Unfortunately for all of us, Lovelace would never see her invention realized. She died of cancer in 1852, before publishing anything more. Still, her contribution is so great that computer geeks around the world still revere her. The U.S. Department of Defense later named its high-level computer programming language Ada in her honor. Heck, the IT guy at my last regular job named his first daughter Ada.

I wonder what Lord Byron would have written about his daughter, the poetic scientist?

Had you heard of Ada Lovelace? What do you think Lord Byron would have thought of her contributions? Share your feedback below.

Before beginning this story, a little background. There are two really basic ways to think of calculus:

1. The study of the infinite (extremely large) and the infinitesimal (extremely small).


2. The study of limits. Imagine a gnat that is flying from the middle of a room to the doorway. The gnat first moves halfway to the door. Then he takes a little breather and moves half of the remaining distance. Another breather, another jaunt half of the remaining distance. And so on and so on. Will he ever get to the door?

(Okay, most mathematicians might hate me for boiling things down to this very basic level, but for the average Joe or Jane, these explanations will do the trick. And due to space issues, I need for you to just trust me on why these things matter. Some day, I’ll write about the applications of calculus and other higher-level math.)
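If you want to watch the gnat's progress for yourself, here is a tiny Python sketch (my own illustration): each hop covers half the remaining distance, and the position creeps toward 1 (the doorway) without ever quite reaching it. That value the sequence approaches but never reaches is the limit.

```python
# Each hop covers half the distance that's left. The gnat's position
# after n hops is 1 - (1/2)**n: always short of the doorway at 1,
# but as close as you like if n is big enough. That's a limit.
position = 0.0
for hop in range(1, 21):
    position += (1 - position) / 2
    if hop in (1, 2, 3, 10, 20):
        print(f"after hop {hop}: position = {position}")
```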

If you think math history isn’t very exciting — in a Batman meets Joker or Clint Eastwood make-my-day kind of way — you’re pretty much right. There are a few life-and-death situations, like Galileo’s (okay, it was his soul in peril, not his physical body), but for the most part, mathematicians were either revered or went unnoticed. Except for Sir Isaac Newton and Gottfried Wilhelm Leibniz.

I wish I could say that this was an actual duel, not because I love violence or wish ill on one of these fine mathematicians, but because it would make this story even more interesting — especially to high school students or grown ups who think math is BOR-ring. But in the end this story is still pretty fascinating, especially given the fact that these men never met or spoke on the phone or Skyped (because cell phones and the internet didn’t exist).

It was 1666, around the time of the Apple Incident (you know, when a falling apple prompted Newton to develop his theory of gravity), that The Sir thought up his idea of fluxions. Don’t worry, you’re not expected to know that word, as it’s never used in modern mathematics. Instead we call his development differential calculus.

Leibniz was just 20 years old at that time. Sure, he was a genius — he had already earned degrees in philosophy and law, and that year he published his first book, De Arte Combinatoria or On the Art of Combinations. While this expansion of his philosophy dissertation is obliquely related to mathematics, it was well before Leibniz began formally dabbling in the Queen of the Sciences.

This timing is pretty darned important. Trust me.

So Newton farts around with this idea of fluxions, finally getting around to publishing Method of Fluxions in 1736. But along the way he circulated a few manuscripts on the subject, sending early copies to some colleagues. Meanwhile, in Germany, Leibniz was jotting down his own discoveries in his journal. In 1675, he noodled around, finding the area under the graph of y = f(x) using integral calculus.

In other words, the two men were discovering calculus at the same time and in completely different parts of the world. (Okay, Germany and England weren’t too distant from one another, but in the 17th century, they may as well have been on different planets.)

I’d bet that given Newton’s stereotypical absent-minded-professor approach to the world around him, he might never have even noticed Leibniz’s publications, which came in 1684 and 1686. Or at the very least, he might have simply acknowledged the great coincidence and moved on. (Apparently, the man could barely be trusted to keep a dinner date, much less worry about a rival in a different country.)

In fact, it was neither Newton nor Leibniz who lit the fire of the great calculus war. In 1704, an anonymous review of Newton’s fluxions suggested that he borrowed [i.e., stole] the idea from Leibniz, which of course infuriated Newton. Letters flew back and forth between the two mathematicians and their surrogates. Newton was behind the publication of these letters, called Commercium Epistolicum Collinii & aliorum, De Analysi promota. (I am not kidding.) A summary of this publication appeared anonymously in 1714 in the Philosophical Transactions of the Royal Society of London. But everyone knew that Newton wrote it.

The Swiss mathematician Johann Bernoulli — who later made his own contributions to infinitesimal calculus — attempted to defend Leibniz, but Newton pretty much took him down. In the end Leibniz meekly defended himself, refusing to look through his “great heap of papers” to prove that he had independently discovered calculus at the same time as Newton. When he died in 1716, Leibniz had been pretty well beaten up by Newton and his buddies (metaphorically speaking, of course).

It wasn’t until much later that everyone came around to the accepted and logical — though really coincidental — truth of the whole ordeal. Both Newton and Leibniz discovered calculus at the same time, using slightly different approaches. To many of us math folks, this is a truly wondrous event.

But there’s more. Even though Newton enjoys (and did enjoy) a bit of celebrity for his genius, he largely wrote for himself, while Leibniz was a bit obsessive about notation, wanting to be sure that his discoveries could actually be used. This is one of the big reasons that today’s calculus is pretty much Leibniz’s discovery. Newton’s approach turns out to be a bit too clunky for everyday use.

So whether or not you possess a general (or specific) understanding of calculus, you can certainly appreciate the 17th-century-style drama surrounding the discovery of this critical field of mathematics, right? At the very least, we can thank Newton and Leibniz for that.

Did you know about the great calculus controversy? What questions does it bring up for you? Ask them in the comments section!

It seems to me that the Greek philosopher and scientist Archimedes was the original forgetful scientist. And a few tales of his life support this theory.

Born in 287 B.C. in Syracuse, on the island of Sicily, he had the good fortune — for him and us — to have a wealthy astronomer for a father. He enrolled in an Alexandrian school based on the principles of Euclid, the father of plane geometry. (You know: points, lines, planes, the idea that if the corresponding sides of two triangles are congruent then so are the triangles, and the fact that vertical angles are always congruent.)

He must have gotten a good education, because Archimedes went on to apply mathematics to building tools, like the Archimedes screw, which is used to efficiently pump water from one place to another. (Contraptions based on his design are still being used today.) He also explained how levers and pulleys work, developing new ways to move even heavier objects. (“Give me a lever long enough and a fulcrum on which to place it, and I shall move the world,” he said.)

And speaking of heavy objects, my very favorite math story is about our dear, old, absent-minded Archimedes. Apparently his good buddy King Hiero hired a goldsmith to make him a crown of the shiny stuff. But the king was suspicious that the goldsmith was cheating him — giving him a crown made of a composite of gold and another (cheaper) metal.

So Hiero took his crown to the smartest man he knew, Archimedes, who gave the problem some deep thought. But it wasn’t until he lowered himself into one of the city’s public baths that the solution hit him like a ton of bricks (or a crown of gold). He got so excited that he ran through the streets naked and shouting, “Eureka! Eureka!” or “I’ve got it! I’ve got it!”

Nobody knows for sure if this is a true story, but it sure got the attention of my high school math students back in the day. And Archimedes’ discovery has certainly stood the test of time. See, when he got into his bath, Archimedes noticed that his body caused some of the water to spill over the side. That got him thinking about the relationship between the volume of his body and the amount of water that was displaced. By replicating the experiment with gold and silver, he realized he had discovered the principle of displacement — if an object sinks in water, the amount of water that is displaced (or overflows) is equal to the volume of that object.

P.S. Apparently the goldsmith was trying to pull one over on the king. The crown was made of iron and covered in gold.
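Here's the arithmetic behind the test, as a small Python sketch. The crown's mass is made up for illustration, and the densities are modern approximate values (gold about 19.3 g/cm³, iron about 7.9 g/cm³); the point is that equal masses of different metals displace very different volumes of water.

```python
# Illustrative numbers only: densities in g/cm^3 (modern values).
GOLD = 19.3
IRON = 7.9

def displaced_volume(mass_g, density):
    # An object that sinks displaces water equal to its own volume,
    # and volume = mass / density.
    return mass_g / density

crown_mass = 1000.0  # suppose the crown weighs 1 kg
pure_gold_volume = displaced_volume(crown_mass, GOLD)  # ~51.8 cm^3
iron_volume = displaced_volume(crown_mass, IRON)       # ~126.6 cm^3

# A crown that displaces noticeably more water than an equal mass
# of pure gold cannot be pure gold.
print(round(pure_gold_volume, 1), round(iron_volume, 1))
```

So even a partly iron crown would overflow the tub measurably more than the same weight of solid gold, which is all Archimedes needed.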

But when it came to mathematics, geometry was his thing. (Duh. It was ancient Greece, after all.) The man had an obsession with circles. In order to better estimate the value of π (the ratio of the circumference of a circle to its diameter), he drew a 96-sided regular polygon, showing that π lies between 3 10/71 and 3 1/7. He used the same “method of exhaustion” to find the area of a circle and the volume of a sphere.
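You can replay Archimedes' polygon squeeze in a few lines of Python. This is a modernized version of his side-doubling scheme (he worked the square roots out by hand, of course): start with a hexagon drawn inside and outside a circle of diameter 1, then double the number of sides until you reach 96.

```python
from math import sqrt

# For a circle of diameter 1, `inside` is the perimeter of the
# inscribed polygon (too small) and `outside` the circumscribed one
# (too big); pi is squeezed between them. Doubling the sides uses
# the classic recurrences: new outside is the harmonic mean of the
# old pair, new inside the geometric mean of old inside and new outside.
inside, outside = 3.0, 2 * sqrt(3)  # regular hexagons
sides = 6
while sides < 96:
    outside = 2 * inside * outside / (inside + outside)
    inside = sqrt(inside * outside)
    sides *= 2

print(f"{sides}-gon: {inside:.5f} < pi < {outside:.5f}")
```

The 96-gon bounds land inside Archimedes' published interval of 3 10/71 to 3 1/7.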

Archimedes’ death is a testament to his ability to focus on his studies with no regard to the world around him. Stories say that his last words were, “Do not disturb my circles.” These were said to the Roman soldier who killed him, as Archimedes studied.

Did anything about Archimedes surprise you? Which of his discoveries have you counted on at home or work? Share your responses in the comments section.

Things were moving right along in the invention and use of number systems. The Sumerians started things off sometime during the 3rd millennium B.C., when their budding commerce system helped them invent the first set of written numbers. The Egyptians systematically engineered a formal base-ten system that morphed from hieroglyphics to the much-easier-to-write hieratic numbers.

But something was missing. Something really important — and really, really small.

The Greeks advanced geometry considerably. (More on that next week.) But in the Roman Empire, mathematical invention and discovery virtually stopped — with the exception of Roman numerals. These were widely used throughout Europe in the 1st millennium A.D., but like the number systems that came before, they did not use place value.

But why weren’t these systems using place value? It all comes down to zero. Up to this point, this seemingly inconsequential number was absent.

There is some debate about this, of course. Some historians assert that sometime around 350 B.C., Babylonian scribes used a modified symbol to represent zero, a placeholder that astronomers found useful in their notations. And on the other side of the world, the Mayans used a symbol for zero in their “Long Count” calendar. But there is no evidence that zero was used for calculations.

Along came the Indian mathematician and astronomer Brahmagupta, the first person in recorded history to use a symbol for zero in calculations. But India’s relationship with zero started well before that.

In ancient and medieval India, mathematical works were composed in Sanskrit verse, which made them easy to memorize. (I am not kidding.) These beautiful sutras were passed down orally and in written form through the centuries. Thus the idea of zero — śūnya (void), kha (sky), ākāśa (space) and bindu (dot) — was first introduced with words. Eventually, an actual dot or open circle replaced these words, as Indians began using symbols to represent numbers.

Brahmagupta used zero in arithmetic — adding, subtracting, multiplying and even dividing using the all-important number. All of that was well and good, except for division. It wasn’t until Sir Isaac Newton and his German counterpart Gottfried Wilhelm Leibniz came along that it was established that dividing by zero is undefined.

But really, the big deal here was not doing arithmetic. Nope, it was place value. This is so important that we all take it for granted. It’s the difference between $65 and $605 or the difference between 0.02% and 2%. See, zero isn’t just a placeholder — in our number system it holds a place value. You think math is hard now? Imagine doing calculations with Roman numerals! Without place value and our humble zero, this work is exceedingly difficult.
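Place value is easy to demonstrate in code. In this little Python sketch (my own illustration), each digit is worth digit times base raised to its position, which is exactly why the zero in 605 matters:

```python
# Place value: each digit contributes digit * base**position.
# The zero in 605 isn't nothing; it holds the tens place open,
# which is what separates 605 from 65.
def from_digits(digits, base=10):
    value = 0
    for d in digits:  # most significant digit first
        value = value * base + d
    return value

print(from_digits([6, 0, 5]))  # 605
print(from_digits([6, 5]))     # 65
```

Drop the zero and the 6 slides from the hundreds place into the tens place; the whole number changes. No placeholder, no positional arithmetic.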

This is a relatively new idea in the scheme of things. Almost 3,000 years had passed since the Sumerians developed the first written numbers. Zero was introduced in India sometime around 400 A.D., though it didn’t show up in a text until around 600 A.D. Through trade routes, zero began showing up in the Middle East and China, but it took a very long time — until the middle of the 12th century! — for Europeans to begin using zero and place value.

And that’s pretty much it — the very long history of our current number system, without which most other major discoveries, like calculus and trigonometry, could not have been developed.

Of course there is much, much more to say about numbers themselves. For example, they’re arranged in a system based on their particular characteristics, kind of like the way we categorize animals or plants. Positive whole numbers are called natural numbers; whole numbers and their negatives are called integers; fractions and repeating or terminating decimals are rational numbers, and so on. This is connected to a fascinating (to me) branch of mathematics, called abstract algebra. But that’s a story for another day.

What surprised you about the history of numbers? And how about that zero? Ask your questions or make comments here.

So the Sumerian system of numbers — as far as we know, the first in the world — came into being rather naturally and out of necessity. But the Egyptians took things one step further, and they did it very systematically. Priests and scribes invented a system of numbers that included tally marks and hieroglyphics. In doing so, they developed a base-ten system featuring different symbols for different numbers.

The Egyptian people were very fortunate. With few neighbors, they didn’t have to spend time worrying about war or defending themselves from attack. They also lived in a very fertile area, making agriculture less troublesome than it might have been. All of this freed up their time to do things like develop a numerical system and make big advances in mathematics. (You know, the ordinary stuff we do when we live in peace and have lots of food and water.)

Hieroglyphics could be used to express a wide variety of numerical values — all the way to one million! The symbol for one was a tally mark, so four tally marks represented 4, and so on. But 10 was expressed as a horseshoe shape and 100 as a coiled rope. A little tiny prisoner begging for forgiveness was the hieroglyphic for 1,000,000. (I’d love to know the story behind that one.)

Yes, I drew these myself. No, I am not an artist or an ancient Egyptian. But you probably knew that.

While these characters could be arranged to represent an almost endless set of whole numbers and even fractions, the Egyptians were missing a critical numeral: zero. This meant that with all of their advances, Egyptian numbers had no place value system.

All of this allowed the Egyptians to take huge steps in the development of arithmetic, including the four basic operations — addition, subtraction, multiplication and division — and using numbers for measurement. Without these advances, we would have no great pyramids.

As the ancient society moved to the much more portable and easier-to-use papyrus and ink to record words and numerals, hieroglyphs gave way to hieratic numerals. These are more akin to brush strokes, and allowed the Egyptians to write larger numbers with fewer symbols. It’s pretty easy to see that this sped things up quite a bit.

On Friday, we’ll visit ancient India, where the most amazing creation/discovery revolutionized the system of numbers. (Seriously, this was a big, big deal!)

Can you imagine having to use hieroglyphics to balance your checkbook? If you have questions about the Egyptian system of numbers, ask them in the comments section.

When the world began 4.54 billion years ago, it didn’t come with numbers. They didn’t appear with the dinosaurs or first mammals or even the first Homo sapiens. That’s because numbers were created as a way to describe the world. And that is a big-honkin’ deal.

Think about it: Numbers make our daily lives much, much easier — from knowing how much time you have before you must get out of bed to setting the table with the correct number of plates at dinner time. You simply cannot get through your day without encountering numbers — not just once, or twice or a dozen times, but thousands and thousands of times. (Do you see what I did there?)

So if numbers haven’t been with us since the beginning of time, where the heck did they come from? Well, that history is pretty challenging to tell, but this week I’ll give you a little overview, starting with the Sumerians.

Sumer was a region of Mesopotamia, roughly where Iraq is today. The Sumerians made so many discoveries and inventions that the region is often called the Cradle of Civilization. Before this time, people used tallies to count things and geometric figures showed up in art and decoration. But these representations were not really mathematical, and they weren’t used widely and systematically.

It was the rise of cities that really set things in motion. As Sumerians developed commerce, they developed one of the world’s first systems of numbers. To keep things fair, people needed a way to keep track of sales and barters. First, they counted on tallies. But there were no numerals associated with the hatch marks they were using to show the number of sheep in a herd or eggs in a basket.

(Here is a good time to underscore the difference between a number and a numeral. It’s a teeny-tiny distinction, but an important one. A numeral is a character or symbol that represents a number. A number is the actual value the numeral stands for. So 3 is a numeral. But if I say I have three kittens, well, I’m talking about the number of sweet, little, purring balls of fur curled up on my lap.)

The Sumerians took things a little further with their whole commerce thing — they started systematically subtracting. See, if I had five goats, I’d be given five special tokens. If I sold off one of them, I’d have to give back one of my tokens. To keep track of this natural back-and-forth of trading and selling, merchants began to keep clay tablets of tallies that showed not only the number of baskets or cows or whatever they had at any moment, but a sales history.

And so, arithmetic was born. Oh, and writing. Ta-da! (Those Sumerians were smart and resourceful.)

Now, as this process developed over time, the Sumerians settled on a base-60 system of numbers. We have a base-ten system, which in very, very basic terms means two things: we have ten basic numerals (0-9) that are used to write all other numbers, and our numbers are grouped in tens and powers of ten.

But not the Sumerians. They liked 60, a number that should be very familiar to us, since it’s the basis of our system of time. That’s probably no accident, right?
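To see how a base-60 system carves up a number, here is a small Python sketch (my own illustration, using seconds and our base-60 clock as the example):

```python
# Write a number the Sumerian way: as "digits" in base 60.
# 4000 seconds, for instance, breaks down as 1*3600 + 6*60 + 40,
# i.e., 1 hour, 6 minutes, 40 seconds.
def to_base_60(n):
    digits = []
    while n > 0:
        digits.append(n % 60)
        n //= 60
    return digits[::-1] or [0]

print(to_base_60(4000))  # [1, 6, 40]
```

Every time you read a clock, you're doing base-60 place-value arithmetic without thinking about it.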

Eventually, the Sumerians developed their own set of numerals, called cuneiform numbers. They looked like the inscription in the photo above.

So there you have it. The world’s first numerals — near as we can tell. Next up: The Egyptians.

(Disclaimer: I’ll be the first to admit that this history is a lot more complex than can be described here. And I’d bet my last dollar that there are a few historians out there who disagree with the generally accepted history of Sumerians and mathematics. There’s so much we don’t know about this ancient history.)

Got questions about the Sumerians or the development of numbers? Ask them below. Was anything in this story surprising or particularly interesting? I’d love to hear what you think.