Things were moving right along in the invention and use of number systems. The Sumerians started things off sometime during the 3rd millennium B.C., when their budding commerce system helped them invent the first set of written numbers. The Egyptians systematically engineered a formal base-ten system that morphed from hieroglyphics to the much-easier-to-write hieratic numbers.
But something was missing. Something really important — and really, really small.
The Greeks advanced geometry considerably. (More on that next week.) But in the Roman Empire, mathematical invention and discovery virtually stopped, with the exception of Roman numerals. These were widely used throughout Europe in the 1st millennium but, like most of the number systems that came before them, they were additive rather than positional and did not use place value.
But why weren’t these systems using place value? It all comes down to zero. Up to this point, this seemingly inconsequential number was absent.
There is some debate about this, of course. Some historians assert that sometime around 350 B.C., Babylonian scribes used a modified symbol to represent zero, a placeholder that astronomers found useful in their notations. And on the other side of the world, the Mayans used a symbol for zero in their “Long Count” calendar. But there is no evidence that either culture used zero in calculations.
Along came the Indian mathematician and astronomer Brahmagupta, the first person in recorded history to use a symbol for zero in calculations. But India’s relationship with zero started well before that.
In ancient and medieval India, mathematical works were composed in Sanskrit verse, which made them easy to memorize. (I am not kidding.) These beautiful sutras were passed down orally and in written form through the centuries. Thus the idea of zero, known in words as śūnya (void), kha (sky), ākāśa (space) and bindu (dot), was first introduced with language. Eventually, an actual dot or open circle replaced these words, as Indians began using symbols to represent numbers.
Brahmagupta used zero in arithmetic: adding, subtracting, multiplying and even dividing with the all-important number. All of that was well and good, except for division. Brahmagupta claimed, for instance, that zero divided by zero equals zero. It wasn’t until Sir Isaac Newton and his German counterpart Gottfried Wilhelm Leibniz came along that it was established that dividing by zero is undefined.
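To make those rules concrete, here is a minimal sketch in Python (my framing, of course, not Brahmagupta’s; he wrote in Sanskrit verse, not code):

```python
a = 5

print(a + 0)  # 5: adding zero leaves a number unchanged
print(a - 0)  # 5: so does subtracting zero
print(a * 0)  # 0: multiplying by zero always gives zero

# Division is the troublesome case. If a / 0 were some number b,
# then b * 0 would have to equal a, and no such b exists when a isn't zero.
try:
    print(a / 0)
except ZeroDivisionError:
    print("undefined")  # modern arithmetic (and Python) refuses to divide by zero
```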
But really, the big deal here was not doing arithmetic. Nope, it was place value. This is so important that we all take it for granted. It’s the difference between $65 and $605, or the difference between 0.02% and 2%. See, zero isn’t just a placeholder; in our number system it holds a position open so that each digit’s place carries value. You think math is hard now? Imagine doing calculations with Roman numerals! Without place value and our humble zero, that work is exceedingly difficult.
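To see exactly what the humble zero is doing in a number like 605, here is a minimal sketch in Python (the function name `place_values` is my own, purely for illustration):

```python
def place_values(n: int) -> list[int]:
    """Return the value each digit of n contributes, most significant first."""
    digits = [int(d) for d in str(n)]
    return [d * 10 ** (len(digits) - i - 1) for i, d in enumerate(digits)]

print(place_values(605))  # [600, 0, 5]: the zero holds the tens place open
print(place_values(65))   # [60, 5]: drop the zero and the 6 means sixty, not six hundred
```

The zero in 605 contributes nothing by itself; its whole job is to keep the 6 in the hundreds place. Roman numerals have no equivalent trick, which is why column-style arithmetic with them is such a slog.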
This is a relatively new idea in the scheme of things. Almost 3,000 years had passed since the Sumerians developed the first written numbers. Zero was introduced in India sometime around 400 A.D., though it didn’t show up in a text until around 600 A.D. Through trade routes, zero began showing up in the Middle East and China, but it took a very long time (until the middle of the 12th century!) for Europeans to begin using zero and place value.
And that’s pretty much it: the very long history of our current number system, without which later major developments, like calculus and trigonometry, could not have been developed.
Of course there is much, much more to say about numbers themselves. For example, they’re arranged in a system based on their particular characteristics, kind of like the way we categorize animals or plants. Positive whole numbers are called natural numbers; whole numbers, positive, negative and zero, are called integers; fractions and terminating decimals are rational numbers; and so on. This is connected to a fascinating (to me) branch of mathematics called abstract algebra. But that’s a story for another day.
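In the meantime, here is a tiny sketch of that nesting in Python, using the built-in fractions module (the variable names are mine, just for illustration):

```python
from fractions import Fraction

n = 7               # a natural number: a positive whole number
i = -7              # an integer, but not a natural number
q = Fraction(1, 3)  # a rational number, but not an integer

# Each category sits inside the next: naturals are integers, integers are rationals.
print(Fraction(n) == n)    # True: 7 is also the rational 7/1
print(Fraction(i) == i)    # True: -7 is also the rational -7/1
print(q.denominator != 1)  # True: 1/3 is a genuine fraction, so not an integer
```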
What surprised you about the history of numbers? And how about that zero? Ask your questions or make comments here.