How Zero Works


Zero in the West and on Calendars
Because there wasn't a Thursday, April 0 the day before, the Gregorian calendar has a built-in flaw.
Creatas Images/Thinkstock

The Western idea of zero as more than just a placeholder came from India in the fifth century A.D. It was there that zero began to take shape as a number and spread throughout the Arab world. It was Leonardo of Pisa, better known as Fibonacci, who introduced the number zero to the West. The son of a customs officer stationed in Algeria, Leonardo was tutored by Arabs, who taught him mathematics based on the Arabic numerals we use today, including zero. Fibonacci used what he learned and in 1202 wrote a book in Latin on the use of zero and the Hindu-Arabic numeral system for the West [source: O'Connor and Robertson]. Eventually, this idea took root and became the standard counting system we use today.

Interestingly, zero also developed among the Maya of Central America, independently of and at roughly the same time as its discovery in India. To the Maya, zero was the starting point for accurate counting, and this was reflected in the Mayan calendars: the first day of the month was day zero, followed by 1 and so on.

This concept makes for much more accurate counting, especially in tracking dates, and in fact makes the Mayan calendar technically superior to the one we use today. The Gregorian calendar that predominates in the West (despite being introduced about 400 years after Fibonacci's book) is based on the Roman system of counting, which did not include zero in any form. As a result, there is no year 0 A.D. or 0 B.C. on the Gregorian calendar. By skipping zero when numbering the years following B.C., a small but noteworthy mathematical time bomb was set: because no year zero is counted, new decades, centuries and millennia actually begin a year after their commonly celebrated date. The new millennium, for example, didn't actually begin until Jan. 1, 2001, even though the Western world celebrated it on Jan. 1, 2000.
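The off-by-one arithmetic behind that surprise can be sketched in a couple of lines of Python (the function name `nth_millennium_start` is purely illustrative):

```python
def nth_millennium_start(n):
    """Year in which the nth millennium begins on a calendar with no year 0.

    Because counting starts at A.D. 1, the first millennium spans years
    1 through 1000, so the nth millennium begins at (n - 1) * 1000 + 1.
    """
    return (n - 1) * 1000 + 1

print(nth_millennium_start(3))  # 2001 -- not 2000
```

Had the calendar included a year zero, the first millennium would have run from 0 through 999, and each millennium would begin on the round number everyone celebrates.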

This misunderstanding arises from the fact that since there is no year zero, a decade doesn't end after its ninth year, as it would if counting started at zero. Instead, counting begins at 1, which means the count reaches the tens column before a new decade begins (or the hundreds column for a century, and so on). Beginning a count from zero is at the heart of zero as a number, though it can seem foreign to Westerners. If you're having trouble conceiving of this, just remember that there are 10 single-digit numbers, zero through nine; anything after that falls in the tens place or higher. But what about what's below zero? It's here that we begin to reach zero's rightful place in mathematics.
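The difference between counting from zero and counting from one can be illustrated with a short sketch (the function names are hypothetical, chosen just for this example):

```python
def decade_from_zero(year):
    # On a calendar counted from 0, years 0-9 form the first decade,
    # so every decade boundary falls on a round number.
    return year // 10

def decade_from_one(year):
    # On the Gregorian calendar, counting starts at 1, so years 1-10
    # form the first decade and every boundary shifts by one year.
    return (year - 1) // 10

print(decade_from_zero(2000))  # 200: with a year zero, 2000 opens a new decade
print(decade_from_one(2000))   # 199: without one, 2000 closes the old decade
```

Integer division by 10 groups years into decades; subtracting 1 first models a calendar whose count begins at 1, which is exactly why the year 2000 still belongs to the second millennium.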