When does the new millennium begin — in 2000 or 2001? The public has voted with its checkbooks for the former. The Fairmont Hotel in San Francisco has fielded numerous calls to reserve rooms for the evening of December 31, 1999 — and precious few for December 31, 2000. The story is the same at the Palmer House in Chicago, and at the Plaza in New York City.
Travel bookings are brisk for the tail end of 1999. Favorite destinations include the pyramids at Giza; the Taj Mahal, in India; Tanzania's Ngorongoro Crater; and Machu Picchu, in Peru. Civic leaders in Nazareth are building 2,000 new hotel rooms in anticipation of the crush. Islands located a hair west of the International Date Line are vying for the honor of being the first to usher in the millennium — and for the tourist dollars that accompany it — even though it will be hurricane season. A marketing representative for the island nation of Tonga offers the following scenario: "I'd like you to imagine thousands of school children lining the shoreline, perhaps spaced no more than two meters apart, all simultaneously lighting their coconut-sheath torches on the stroke of midnight."
And yet the idea that centuries begin in years ending in 01 — not 00 — has been the consensus of historians, newspaper editors, calendarists, bureaucrats, and other arbiters of the culture for at least three centuries.
Their opinion is not dispassionately held. The Times of London wrote on December 26, 1799, "The present century will not terminate till January 1, 1801 ... We shall not pursue this question further.... It is a silly, childish discussion, and only exposes the want of brains of those who maintain a contrary opinion to that we have stated." Those afflicted with a "want of brains" included Goethe, Schiller, and Victor Hugo, all of whom made the error of coming to the defense of 1800 as the beginning of the nineteenth century, and were summarily censured by the press. The New York Times, The Washington Post, Scientific American, and The Nation put their editorial might behind 1901 as the first year of the twentieth century. The New York Times has endorsed 2001 for the twenty-first. I have not found a publication that has broken ranks, though Science News went astray briefly in 1986 when it stated in an obituary that Admiral Hyman Rickover, whose birth date was January 27, 1900, had been born in the first year of the century. The magazine quickly ran a letter to the editor pointing out its mistake.
The United States has no official calendar, and no legally prescribed method for numbering years. In general, though, we use the Gregorian calendar, established by an act of Parliament in 1751 as the official calendar of England. This act also bound the American colonies to the Gregorian, but that obligation was canceled by the Revolutionary War. In the United States we have the inalienable right to number our days and years as we please. One could run a company, for example, according to the Mayan 584-day Venerean calendar (based on Venus years) without fear of prosecution, though not without practical difficulties. Or one could choose from among the forty or so extant calendars worldwide.
Despite this calendrical freedom at least two federal agencies, for reasons best known to themselves, have taken a stand on the beginning of the next millennium. Both the National Institute of Standards and Technology (formerly the Bureau of Standards) and the Library of Congress have declared 2001 to be the first year of the twenty-first century. Ruth S. Freitag, of the library's science and technology division, has compiled a 232-source bibliography on the topic, some of the sources dating back to the close of the seventeenth century. Virtually all the sources support the library's choice of 2001. The Royal Greenwich Observatory, in England, the internationally recognized authority on timekeeping and the self-proclaimed last word on calendrics, posits January 1, 2001, as the start of the new era. (Timekeeping is defined as the "measurement of fractions of a day," whereas calendrics is "the reckoning of time over extended periods," the day being the smallest calendrical unit of time.)
Every proponent of 2001 makes the same argument: Although the idea that a century begins on the 00 year may stem from an intuitive, odometric logic, it betrays the public's lack of mathematical sophistication. The first A.D. year was 1. Because there was no A.D. 0, one cannot begin the second century with A.D. 100, since that would leave only ninety-nine years in the first century. The second century must begin in 101, the third in 201, and so on. The crux of the 2000 versus 2001 debate lies in the controversial nature of the number zero.
The issue touches on history, number theory, religion, politics, and economics — all of it colored by Western chauvinism. The public (pro-2000) is seemingly overmatched by historians and newspaper editors (pro-2001). But the public has a few quiet allies: some astronomers, number theorists, a fourth-grade class in western Massachusetts, and people who count for a living. There are some mute allies as well: the sun, the moon, the stars — all the celestial bodies in the universe, which move to their own beat, caring little for the pronouncements of the Library of Congress or the editorial board of The Times of London.
In the late fifth century A.D. Pope Gelasius, about to take over a Church of Rome that was pathetically ignorant of the Greek language, imported a Scythian monk whom he knew from Constantinople to translate documents in the papal archive. The monk was Dionysius Exiguus (roughly translated, "Denny the Runt"), who chose his name out of humility rather than because of small physical stature. A few decades later, working under Pope John I, Dionysius was translating from Greek into Latin the Easter tables drawn up by Saint Theophilus, of the Church of Alexandria, and his successor Saint Cyril. Easter is Christendom's most important movable feast, and its date is one of the most difficult to calculate. Dionysius was using the Alexandrian method to calculate his own Easter tables when the thought struck him that he was living in the 525th year since the birth of Christ. The monk saw the opportunity to dispense with the old numbering system — Anno Diocletiani, in which years were counted from the beginning of the reign of that Roman Emperor. Diocletian was an infamous persecutor of Christians, and Dionysius did not want to memorialize him further. He began his new era with 1 Anno Domini, which he calculated as the year of Jesus' birth, and never looked back. Dionysius didn't bother to number the years B.C., "before Christ." He gave Christianity a fresh start.
Dionysius has been accused of two blunders — or a blunder and a half. He was wrong about Jesus' birth date, now set somewhere in the range of 7 to 3 B.C., the consensus being 4 B.C. (Glen Bowersock, a professor of ancient history at the Institute for Advanced Study, in Princeton, New Jersey, says, "Just remember the jingle 'Hark the herald angels roar, / Christ was born in B.C. four.'") Regardless, Dionysius made an important contribution. "From a Christian point of view," says William Klingshirn, the chair of the Department of Greek and Latin at Catholic University, "time started all over again with the birth of Christ. Getting out from under the Roman calendar was very significant." He points out that Muslims begin their calendar with the flight of Mohammed. Year 1 of the Muslim calendar is the year of the Hegira (A.D. 622). Judaism has Western civilization's most ancient calendar, more than 5,000 years old, dating back to when, according to interpreters of Scripture, God created the world. "It's no accident that these great monotheistic religions have starting points that they now celebrate," Klingshirn says. "It's really central to them that time ought to start anew, since there's one religion, one God."
Arguably, Dionysius's other "blunder" is that he began with A.D. 1 rather than 0. The Scythian monk is often defended as having been merely a product of his times: the Romans possessed no zero in their cumbersome numbering system. In our modern base-ten system zeroes abound: in 10, 20, 100, 1,000, and so on. So to say that a culture had no zero seems absurd. But the Romans in A.D. 525 had only Roman numerals: X (10), XX (20), C (100), M (1,000), and so on. There was no Roman numeral for zero. Making the year of Jesus' birth 0 would have solved many problems, but it is mathematically acceptable to begin a series with 1 — or 2, or 3, or any number — as long as one is consistent and no numbers are skipped.
In any case, the A.D. system was not an instant success. Pope John died a year later, and anti-Greek sentiment in Rome swept Dionysius, his calendar, and all his Eastern colleagues out of office. Anno Domini languished for 200 years, until Bede, the Northumbrian Anglo-Saxon monk, picked up Dionysius's system and popularized it in his classic work, the Ecclesiastical History of the English People, completed in 731. The A.D. system spread when the Emperor Charlemagne adopted it for dating acts of government throughout Europe.
The Venerable Bede made a blunder of his own. Attempting to improve upon Dionysius's system by completing it, he invented the B.C. system to extend backward before Jesus. Like Dionysius, Bede was hampered by the lack of a zero. He stuck 1 B.C. immediately before A.D. 1. This made no more sense than counting backward from 2001 directly to 1999, skipping 2000. The result is a year-numbering system with a year missing in the middle, a mathematical oddity that has endured for more than twelve centuries.
If Dionysius and Bede skipped a number, then we must either admit the mistake and correct it or extend the mistake into the next millennium by beginning centuries in the 01 year. Our experts have chosen the latter course, thus overlooking the salient mathematical shortcomings of Western culture in the first millennium A.D. Neither Dionysius nor Bede had zero, though the Indians and the Maya had been using it since approximately the first century.
The we-don't-need-a-year-0 argument takes two forms. The first is that no one ever referred to a year 0 while living in that year. True enough. But no one ever knowingly lived in 1 B.C. either. In fact, A.D. 1 wasn't labeled until the sixth century, because Dionysius hadn't yet invented his system. Let's forget about that argument. Numbering schemes often take effect after the fact. Who knows what year 1997 will become when the calendar is revised by, say, the Church of Scientology in the next century?
The second argument is that zero isn't a real number — that people just don't count with it. It can be skipped with impunity. The Library of Congress states categorically, "In fact, there has never been a system of recording reigns, dynasties, or eras that did not designate its first year as the year 1." This statement appears not to be true.
Anthony Aveni, a professor of astronomy and anthropology at Colgate University, points out that the Maya used year 0s in their calendars and, in fact, began each month with a day 0. Confronted with this fact, the no-year-0 proponent Owen Gingerich, a professor of astronomy and the history of science at the Harvard-Smithsonian Center for Astrophysics, replied dismissively, "Oh, well, the Maya ..."
"We have a parochial view," Aveni says. "We're caught up with the Greco-Roman ways of doing things. It's in our nature not to think of nothingness."
The Babylonian Kings called the first year of a reign the accession year. The following year was 1. Dean Ware, a professor of medieval history at the University of Massachusetts at Amherst, says that even in medieval England reigns didn't always begin with the year 1. Sometimes the second year was 1. Sometimes the accession year was 1 — but only if the King ascended during the first part of the year. Sometimes the year 1 didn't begin until the King had done something significant. Which for some English Kings could be a long time.
Though Ware is firmly in the 2001 camp, he says the fact that there is no year 0 is a "fallibility." "It's awkward," he says, "that those two ones run together." He warns students to be alert: for example, the interval between the capture of Rome by the Gauls, in 390 B.C., and its capture by the Visigoths, in A.D. 410, is 799, not 800, years. To reckon the years between a B.C. date and an A.D. date add the two years together and subtract one to compensate for the lack of 0.
An embarrassing gaffe along these lines was committed by The Washington Post last year in yet another article explaining how "pretty simple" math dictates that the millennium must begin in 2001. The Post went on to declare that since Jesus was born in 4 B.C., the year 1996 marks 2,000 years since his birth. No. The math is, in fact, pretty simple, but the correct answer is 1,999 years.
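The "pretty simple" math really is simple enough to write down. Here is a minimal sketch in Python (the function name is mine, purely illustrative) of Ware's rule for spanning the missing year:

```python
def years_between(bc_year, ad_year):
    """Years elapsed from a B.C. date to an A.D. date.

    Because the Dionysius-Bede calendar has no year 0, the naive
    sum bc_year + ad_year overcounts by one.
    """
    return bc_year + ad_year - 1

# Gauls sack Rome (390 B.C.) to Visigoths sack Rome (A.D. 410):
print(years_between(390, 410))   # 799 years, not 800

# Jesus' birth (4 B.C.) to 1996, the Post's gaffe:
print(years_between(4, 1996))    # 1,999 years, not 2,000
```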
Astronomers have solved this problem by revising Bede's numbers to make room for a year 0. In 1740 the French astronomer Jacques Cassini replaced B.C.-A.D. with a minus-plus system, in which 0 replaces 1 B.C., 2 B.C. becomes -1, and so on. All the A.D. years remain numerically the same. Cassini also said that centuries begin on the 00 year. For this the Library of Congress has noted that astronomers have been blamed for some of the "confusion."
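Cassini's conversion is a one-liner. A sketch (again with an illustrative function name of my own choosing):

```python
def bc_to_astronomical(bc_year):
    """Convert a B.C. year to Cassini's astronomical (minus-plus) numbering.

    1 B.C. becomes 0, 2 B.C. becomes -1, and so on.
    A.D. years keep their familiar numbers.
    """
    return 1 - bc_year

print(bc_to_astronomical(1))   # 0
print(bc_to_astronomical(2))   # -1
```

With this renumbering, ordinary subtraction works across the era boundary: the Gauls-to-Visigoths interval is simply 410 - (-389) = 799.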
They had little choice. Consider Halley's Comet, whose journey brings it past our planet approximately every seventy-five years. The comet cannot accommodate Bede's sloppy math, or the preferences of the Library of Congress, by truncating its orbit and losing a year as it crosses the B.C.-A.D. interface. Eclipses and all other periodic astronomical events are also affected. In addition, when Bede bypassed zero, he lost not just a year but a year and a day. Yes, in the Cassini calendar the year 0 is a leap year.
Dean Ware makes the point that calendars don't have to follow nature. One of our country's finest calendarists, LeRoy Doggett, who was the head of the Nautical Almanac Office at the U.S. Naval Observatory until his death, last year, wrote that some calendars replicate astronomical cycles, but some don't. The calendars that we in the West have followed have purported to mirror the rhythm of the heavens. In that sense, Doggett wrote, "Calendars serve as a link between mankind and the cosmos."
The calendar we use today was created by Julius Caesar, modified shortly thereafter by Augustus, and modified again by Pope Gregory in the sixteenth century. The present Gregorian calendar still closely resembles Julius's original effort. "In the sixteenth century calendars were very much at the front edge of intellectual life," says Tony Grafton, a historian at Princeton University. "You had to know how the calendar worked. There were major differences between Catholics and Protestants. But now it's all stuff printed in your pocket calendar." The Protestants objected to all the saints' days, when people weren't allowed to work. And not everyone during the Renaissance was happy with the A.D. system. The great Lutheran astronomer Johannes Kepler dated his most important book, Astronomia Nova, 1609 "Anno Aerae Dionysianae" rather than "Anno Domini," because he believed the calendar numbers were ordained by Dionysius, not by God.
Meanwhile, owing to Julius Caesar's leap years, the Church's movable feasts were backsliding through the year. Julius had instituted a leap year every four years. Over a long period that's too many, because the solar year isn't quite 365.25 days, and by the sixteenth century the calendar was ten days out of sync with the earth's orbit around the sun. Pope Gregory authorized that ten days be excised from the year 1582 — October 5 through October 14 — and decreed that leap days not be added in centennial years not divisible by 400. (Ergo 1600 and 2000 are leap years; 1700, 1800, and 1900 are not.) Not all countries agreed immediately. England did not drop the extra days until 1752. George Washington, born under the Julian calendar but still an English citizen when the Gregorian calendar was adopted, ended up with two birthdays, both of which have now been supplanted by President's Day. The Harvard-Smithsonian Center for Astrophysics, Owen Gingerich says, avoids the confusion by celebrating Copernicus' birthday, February 19, instead.
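Gregory's centennial rule is compact enough to state as code. A sketch:

```python
def is_gregorian_leap_year(year):
    """Pope Gregory's rule: every fourth year is a leap year,
    except centennial years not divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

for y in (1600, 1700, 1800, 1900, 2000):
    print(y, is_gregorian_leap_year(y))
# 1600 and 2000 are leap years; 1700, 1800, and 1900 are not
```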
The Gregorian reform was the last straw for some astronomers. "What are we supposed to say?" Anthony Aveni asks. "'There was no sky for ten days'?" Many astronomers don't use Cassini's minus-plus system, favoring instead the Julian day calendar developed by Joseph Justus Scaliger in 1583. It begins counting time in 4713 B.C. and takes it day by day. Each day gets a new number; years are irrelevant. With the Julian day calendar, Aveni says, "I have a perfect system for reckoning the time from, say, one perihelion of Halley's Comet to the next. I don't have to worry about leap years." We are now closing in on 2.5 million Julian days. Scaliger chose 4713 B.C. as a starting point for various technical reasons and because it was sufficiently early to accommodate the declaration by Archbishop Ussher of Armagh that the world was created in 4004 B.C. It was amply early for astronomers, whose earliest recorded sightings, in the form of the Venus Tablets of Ammizaduga, showing the positions of Venus during the reign of the Babylonian King, date back to around the seventeenth century B.C.
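Scaliger's day count can be reproduced from a modern date library. A sketch in Python (the offset constant is a standard one for Gregorian dates; dates before the 1582 reform would need Julian-calendar handling, which this toy function does not attempt):

```python
from datetime import date

def julian_day_number(d):
    """Julian day number (Scaliger count) for a Gregorian calendar date.

    Python's date.toordinal() counts days from January 1, A.D. 1 in the
    proleptic Gregorian calendar; adding 1721425 shifts the count back
    to Scaliger's epoch of 4713 B.C.
    """
    return d.toordinal() + 1721425

print(julian_day_number(date(2000, 1, 1)))   # 2451545
```

This confirms the article's figure: by the late 1990s the count stood at roughly 2.45 million days, closing in on 2.5 million.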
Still, the Library of Congress hangs tough, insisting that our calendar is based on sound "simple arithmetic." Ruth Freitag continues to assert, despite the Maya and the Babylonian and English Kings, that no one counts with zero: "This is not how people count. People count by starting with one." Even today this may not be true.
Just up the road from Dean Ware's office at the University of Massachusetts sits the Mark's Meadow Elementary School, which serves as a lab school for the university's School of Education. Like many elementary schools across the United States, it begins not with Grade 1 but with a kind of Grade 0, called kindergarten. But that's not the point. Darryll McCall teaches fourth grade at Mark's Meadow. He assured me that although many of his fourth-graders are familiar with negative numbers, they have not yet been formally schooled in their use. McCall asked each student in his class to write down in order the numbers from +5 to -5. We wanted to see if the students would follow Library of Congress mathematics and count down without the use of zero. No luck. Out of sixteen students, thirteen included 0 between +1 and -1. Owen Gingerich was no more impressed with the fourth-graders than he was with the Maya. "Totally irrelevant!" he said. I went in search of more-acceptable counters — preferably adult, preferably non-Mayan.
Rob Navias, a shuttle-launch commentator for the National Aeronautics and Space Administration, assured me that no rocket or spaceship, in America or abroad, has ever blasted off on the count of one. All spacecraft, he said, manned or unmanned, in all nations, blast off at zero. In America the zero serves as a crossover point between the staff at Cape Canaveral, in Florida, which is in charge of launches, and that at the Johnson Space Center, in Houston, which is in charge of the actual flights. Cape Canaveral counts down to zero for launch; Houston counts back up for mission elapsed time. "Space flight is numbers. Zero is a part of that," Navias says. Michael Jordan-Reilly, of the Otis Elevator Company, told me that his firm will number floors in whatever manner its customers request. In America the ground floor is usually counted as one. But Europeans, Jordan-Reilly says, generally begin with zero, as in ground zero, under a variety of names. There's a certain logic to this. The first floor is one flight up. The eighth floor is eight floors up — not seven floors up, as in America. Jordan-Reilly points out that American floor numbering is often quirky: casino and hotel owners commonly skip the thirteenth floor, out of superstition.
Darryll McCall asks, "Is zero a number?" Is it a real integer, a counting number, or just a vertical line on a time line? Tony Grafton, of Princeton, sees zero for calendrical purposes as a line between increments rather than as an increment per se. I called the math department at the Massachusetts Institute of Technology to find out the proper way to count and whether zero is a real number. Apparently, counting is not MIT's forte. I was told that no one in the math department would comment on that topic. As for zero, a department administrator said, "Our people are interested more in numbers invented after 1972." He told me I needed a number theorist.
Tobias Dantzig wrote the classic cultural history of numbers, Number: The Language of Science (1930), which is still in print. "In the history of culture," Dantzig wrote, "the discovery of zero will always stand out as one of the greatest single achievements of the human race." Zero, he said, marked a "turning point" in math, science, and industry. Dantzig also noted that zero was invented not in Europe but in India, in the early centuries after Jesus. Negative numbers followed soon thereafter. The Maya invented zero in the New World at approximately the same time. Europe did not accept zero as a number until the twelfth or thirteenth century. Tony Grafton, of Princeton, bristles at the idea that not having a zero was significant. "Having a year zero is not a sign of sophistication," he insists.
But European mathematics was hampered for centuries by the lack of zero. Dantzig wrote that it was difficult to determine which numbering system was worse, the Greek or the Roman. "Neither was capable of creating an arithmetic that could be used by a man of average intelligence," he wrote. Without zero and negative numbers René Descartes could not have developed analytic geometry and the familiar analytical diagram with x and y axes. Without analytic geometry we wouldn't have the work of Newton, Leibniz, Euler, or the Bernoullis. Zero can also be an ordinal number. Mathematicians feel as comfortable with "zero'th" as with "first."
Barry Mazur, a mathematician at Harvard University, says that a numbering system that leapfrogs over zero reminds him of Out of Africa, by Isak Dinesen, in which the author was taught Swahili by a young Swedish dairyman. Her mentor taught her all the numbers except nine, because, he explained, there was no nine in Swahili. There was eight; there were ten, eleven, and so on. But no nine. Dinesen believed her teacher in part because her houseboy was missing a finger; she thought that perhaps it was a self-inflicted amputation, to facilitate counting on the fingers. As it turned out, the teacher had skipped nine because its sound in Swahili was vulgar to the Swedish ear. "Mathematicians, of course," Mazur says, "prefer it if the numbers are all there."
"They have not got nine in Swahili," the dairyman told Dinesen. He might have added, "They have not got zero in the Dionysius-Bede calendar" — the difference being that in the latter case he would have been telling the truth.
The Greeks, Dantzig wrote, "could not conceive the void as a number let alone endow the void with a symbol." This shortcoming hampered Western physics, too, for well over a thousand years. The pre-Socratic Greek philosopher Democritus of Abdera put forth the idea of a void ("Nothing exists except atoms and empty space; everything else is opinion"). Plato, however, disagreed, and Democritus' particle physics was abandoned in the West until the Renaissance. Chris Quigg, a physicist at the theoretical division of the Fermi National Accelerator Laboratory, in Batavia, Illinois, says, "Nothingness is essential to physics. The vacuum is essential to the way we understand the world. Zero is a very important concept."
Even the occasional calendarist agrees. The Jesuit Peter Archer broke with the usual system in his book The Christian Calendar and the Gregorian Reform (1941), inserting a year 0 before A.D. 1 and calling it the first year of the Christian Era. Like astronomers, he made it a leap year. Archer wrote that a date in this first year should be stated in terms of months and days only, "just as the age of an infant in its first year is given in terms of months and days." His book carries the imprimatur of Archbishop Francis J. Spellman. This means, William Klingshirn, of Catholic University, says, that there is "no theological problem with the work, no impediment to the faith or morals of a Catholic." It doesn't mean that the book is true, Klingshirn says — but Catholics are free to celebrate the new millennium in 2000. "The Church is accommodating when nothing important is at stake," he says. As for Cassini's year 0: "Scientists need not observe a calendar based on Christianity."
The Royal Greenwich Observatory continues to insist that there was no year 0 and that the millennium begins in 2001. Although it is the ultimate arbiter of timekeeping, the RGO has little authority in the area of calendrics. One calendarist likens a timekeeper's pronouncements on the calendar to a particle physicist's opinions on cosmology — relevant, perhaps, but not authoritative.
Kristen Lippincott, the director of the Millennium Project at the Old Royal Observatory, from whose geographic coordinates Greenwich Mean Time is measured (and the RGO's former home), insists that the RGO is an authority. She compares it to "authorities from previous ages who might cite Aristotle, Ptolemy, or Bede." In settling on 2001 and renouncing zero the RGO has made a significant break with its past and with geography. One of the RGO's most illustrious directors was the fifth astronomer royal, Nevil Maskelyne (1732-1811), who stated that a century begins on the 00 year. The Old Royal Observatory sits atop the prime meridian, also known as zero — not one — longitude.
Lippincott argues that the RGO's determination on when the day, and thus the century, begins is based on international law. According to the transcripts of the International Meridian Conference of 1884, legally the Universal Day does not begin until it is "mean midnight at the crosshairs of the Airy Transit Circle in the Old Royal Observatory."
For the sake of argument, let's accept the legal definition: the millennium begins on January 1, 2001, at Greenwich. Now, imagine yourself vacationing in Tonga, near the International Date Line, where it is twelve hours earlier than it is in Greenwich, on the night of December 31, 2000. At the stroke of midnight, as the schoolchildren light their coconut-sheath torches, your digital watch turns from December 31 to January 1, 2001. Happy New Year! Happy New Millennium! No. Because it's still noon on December 31 in Greenwich, the millennium hasn't started yet. Legally, it's still yesterday. And it will remain yesterday until noon Tonga time, all evidence to the contrary. In fact, in all twelve time zones east of Greenwich midnight will come sooner than at the Royal Greenwich Observatory, but it will stay yesterday until midnight Greenwich Mean Time. This may not be how the world turns, but it's the official position of the observatory.
On the East Coast of the United States, five hours behind Greenwich, the new millennium will start at 7:00 P.M. Those people who celebrate midnight in Times Square will be five hours late. Californians must celebrate the new millennium at 4:00 P.M., while stuck in rush-hour traffic. "I guess we're seeing the last vestiges of British imperialism," Klingshirn says.
Can one celebrate 2000 without fear of censure from the RGO? Certainly, with the necessary cash. The RGO has set up an office in Greenwich, Connecticut, to sign up corporate sponsors for the observatory. The RGO's American representative, Stephanie Record, says she is looking for Fortune 500 companies that do business with the United Kingdom to become financial sponsors of the RGO. For a fee they will be able to display a special "Greenwich Meridian 2000" logo. Greenwich 2000? Not 2001? "We're loosening up the dogma," Record says. "Our purpose is to draw sponsors. Frankly, the year 2000 appeals to more people." The RGO plans to hold a year-long celebration in 2000, culminating with New Year's Eve on December 31, to usher in the real new millennium for its corporate sponsors. Some may get to ride the Concorde, celebrating midnight over Greenwich, England, and then jetting to the United States to celebrate midnight again over Greenwich, Connecticut, five hours later. (But wait: doesn't the millennium start at 7:00 P.M. U.S. East Coast time? "Don't write that in the article," Record pleaded.)
Last winter I was pulling out of the parking lot of the Friendly's restaurant in Hadley, Massachusetts, when the odometer of my thirteen-year-old car turned from 99,999 miles to 100,000. I pointed this out to my ten-year-old son, who sat beside me. He was thrilled. I bought the car in 1984 with 000007 on the odometer, but the dealer assured me that the seven miles were from incidental driving — test runs inside the factory, driving the car onto a transport truck, a boat, and so forth — and that the odometer began life at 000000. I like to think that this is because it's a German car: the Kaiser broke with most of Europe in 1899, proclaiming that the twentieth century would begin in 1900. But I've been assured that American cars, and even Japanese cars, also begin life with an odometer reading of 000000, never 000001. About a mile down the road, as I passed Dean Ware's office, I tried an experiment. Screaming, yelling, and bouncing up and down with as much excitement as I could muster, I shouted, "Look! 100001!" My son hardly looked up from his book. "What's your point?" he said.
Think how history might have changed had the Venerable Bede driven a BMW, with all those lovely odometric zeroes laid out in front of him. Bede achieved one of the great near misses in mathematics. By counting years "before Christ," he in effect became the Western inventor of negative numbers. Had he taken the logical next step and invented zero, he could have forestalled our current calendrical problems and accelerated Western thought by several centuries. He could also have made a significant contribution to Christianity. The word zero is derived indirectly from the Indian word "sunya." Sunyata is a profound Buddhist concept: it means emptiness, the void, the essence of all things, the mother of all existence from which everything is born and into which everything must return. Zero was no less profound for the Maya. In Mayan hieroglyphics archaeologists have found a flowerlike symbol, a central element with four lobes, which they believe stands for zero. Barbara Fash, of Harvard University's Peabody Museum, says the flower's center point is a "bed of creation." For the Maya, who believed in cyclical rather than linear time, zero marked both the beginning and the completion of a cycle. Zero was a flower. All life grew out of it.
Bede could have created a zero for the zero'th year of the Christian era, with a symbol that represented not only zero but also the man he considered to be the son of God. In our everyday calculations and reckonings all of us — Christians, Jews, Muslims, Buddhists, atheists — would have been continually confronted by Jesus.
Illustrations by Bush Hollyhead
Research for this article was conducted by Janet MacFadyen