Oysters vs. Porcupines
Unearthing the Roots of the Century Dispute
by Laurence McGilvery
Copyright 2000 Laurence McGilvery.

Midnight, 31 December 1999

If this long-awaited moment were a bell ringing the highest note on the scale, it would be flat. If it were your last gasp in an attempt to break the four-minute mile, you'd be one step short of the finish line. If it were the final page in a lurid murder mystery, you'd never find out whodunit.

Do centuries and millennia close with years ending in 99 or 00? That question, what Ruth S. Freitag of the Library of Congress calls the "minor imbecility" of the Century Dispute, has diverted some of our most inventive thinkers for at least three hundred years. For example:

We are now in the last year of the century, and whoever denies this has no more brains than an oyster.

THE PORCUPINE, Philadelphia, 1799

In response, the thunderous voice of logic made short work of know-nothings:

The present century will not terminate till January 1, 1801, unless it can be made out that 99 are 100. We shall not pursue this question further. It is a silly, childish business, and only exposes the want of brains of those who maintain a contrary opinion to that we have stated...

THE TIMES (London), December 26, 1799

A century later nearly all major institutions and newspapers stood firm and celebrated the beginning of our own times on January 1, 1901, not 1900. Now, with a millennium in the balance, mass communication, commercial advantage, and millennial delusions have generated a din that swamps the small, steady voice of reason. The only possible excuse for adding to this tumult would be to ask--and answer--a different question. Why does the Century Dispute exist at all?

Tortured Sophistry
The year arrives too early or the century too late. Intelligent, able people have wasted carboys of ink and precious years of their lives trying to patch that seeming flaw, but what if they can't, poor devils? Some assert that the first decade of the Christian era was, absurdly, only nine years long, thus throwing the decimal system out of whack. Others fantasize a full year named zero between 1 B.C. and A.D. 1. This tortured sophistry is more than a little like the search for perpetual motion. The Century Dispute has no solution. Its remote source is older than any calendar. The mismatch between years and centuries arises from the natural, unavoidable collision of two conflicting systems of numbering as familiar to us as the alphabet.

Porcupines 1, Oysters 0
The Porcupines have prevailed; we few Oysters have been outnumbered and ridiculed as pedants. Yet bear with me. For the record, the short answer to the Century Dispute is this: the 1900s conclude tonight on New Year's Eve, December 31st, 1999; the twentieth century and the second millennium will last one more year.

The nearly universal cliché of the calendar as an odometer is exactly wrong. When the zeroes roll over on your nearly new car, they mark the end of its 2000th mile. When the clock hands point straight up at midnight on New Year's Eve, the 2000th year is only beginning. The first is a measurement, the second, a date. Confusing the two produces the grief and folly of the Century Dispute.

[Timeline illustration]

Income Tax Day can be written 4/15, but--quickly!--how many months into the year is that? Not 4-1/2, but 3-1/2. The calendar always reads one unit higher than its equivalent quantity or measurement.
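To see the off-by-one concretely, here is a minimal sketch in Python (my own illustration; the function name months_elapsed and the example dates are not from the essay). It converts the calendar label 4/15 into the quantity of the year actually completed, the same shift that separates the odometer's 2000th mile from the calendar's 2000th year.

```python
from datetime import date
import calendar

def months_elapsed(d: date) -> float:
    """Months of the year completed by the end of day d (rough, illustrative figure)."""
    days_in_month = calendar.monthrange(d.year, d.month)[1]
    return (d.month - 1) + d.day / days_in_month

# The calendar label reads "4/15", but only about 3-1/2 months have elapsed.
print(months_elapsed(date(1999, 4, 15)))      # 3.5

# The same shift, one level up: when the date reads 1 January 2000,
# only 1999 complete years of the era lie behind us.
print(date(2000, 1, 1).year - 1)              # 1999
```

The label runs one unit ahead of the quantity, exactly as the paragraph above describes.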

Invisible But Essential
Every day we use two numbering systems in conflict with each other, one for periods of time, the other for everything else, yet English lacks common terms to differentiate them. Let me repeat that. They are invisible. They have no English names. Even thinking about them is hard. They progress in parallel, forever a step apart. Once every hundred years their mutual asymmetry causes a natural breach in the calendar that cannot be ignored. That time has returned.

What Zero Year?

Of all the confusions produced by the Century Dispute, the so-called zero year is the most irritating. Briefly, it inserts a year named zero between 1 B.C. and A.D. 1 to make centuries and years come out even. The first decade would then end on New Year's Eve of the year 9, and the second millennium would end in 1999. The U.S. Naval Observatory blithely states on its website, "Today it is obvious that a year designated 1 would be preceded by year 0, which would be preceded by year -1, etc." Well, not quite. The calendar already has a perfectly good zero, the dimensionless moment that separates B.C. from A.D.

In "The Nothing That Time Forgot" (New York Times, November 13, 1999), Robert Kaplan and Dick Teresi asserted that leaving out the zero year "is like counting backward from 2001 directly to 1999, skipping 2000." Here we see with awful clarity the flaw in this idea. The writers have confused two of the three common roles of zero. First, its sloping shoulders and empty belly signify nada, nil, nix, the pocket empty or, better, the balance on a fully paid bill. Second, as a placeholder it turns a lone 1 into ten, giving birth to the glorious powers of ten, which every citizen should understand (see the wonderful book Powers of Ten). Third, it is a dimensionless division, not a quantity or a duration. Any scale that measures continuously between plus and minus values contains this zero mark. Think of the visitor straddling the 0° meridian at Greenwich, England, one foot in the eastern hemisphere and one in the western. The year 2000 is a multiple of ten; the zero point is a multiple of nothing.

The other half of the zero year argument states that anniversaries crossing the B.C./A.D. divide occur a year too soon. Here is a simple example: 30 June 5 B.C. to 30 June A.D. 5 is not ten years but nine. The reason is not a missing zero year but the fact that the two outside years, 5 B.C. and A.D. 5, each contain only six months. Together they make one full year. The equivalent span on a thermometer is -4.5 degrees to +4.5 degrees.
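Here is a minimal sketch of that arithmetic, again in Python and again with names of my own choosing (years_between, anniversary_ad_year). The second function anticipates the "add one" adjustment discussed in the paragraph that follows.

```python
def years_between(bc_year: int, ad_year: int) -> float:
    """Elapsed years from 30 June of bc_year B.C. to 30 June of ad_year A.D.
    (illustrative; leap days ignored). No year zero is involved."""
    full_bc_years = bc_year - 1      # 4 B.C. through 1 B.C. when bc_year is 5
    full_ad_years = ad_year - 1      # A.D. 1 through A.D. 4 when ad_year is 5
    # The two outside years each contribute only six months.
    return 0.5 + full_bc_years + full_ad_years + 0.5

def anniversary_ad_year(bc_year: int, n: int) -> int:
    """The A.D. year in which the nth anniversary of a date in bc_year B.C. falls:
    add one when crossing the B.C./A.D. divide."""
    return n - bc_year + 1

print(years_between(5, 5))          # 9.0, not 10
print(anniversary_ad_year(5, 10))   # 6: the tenth anniversary falls in A.D. 6, not A.D. 5
```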

If this arcane issue caused constant difficulties in our daily lives, it still would not be acceptable to fix it by destroying the symmetry of the calendar. It is, in fact, quite harmless. Let the astronomers and the archaeologists and their kin to whom it does matter remember that they must add one when crossing the B.C./A.D. divide, if they want anniversaries to appear to match. They are all up to that challenge.

Cardinals and Ordinals

What I will call calendar time encompasses days, months, years, centuries, and millennia, as well as seasons, historical eras, and geological periods. Entertainments divide into acts and scenes or movements; sporting events play out in quarters, innings, rounds, matches, and laps. All take time to unfold, and all are named as they begin. All--and this is important--are ordinal numbers, even when they look like cardinals. The new year is not 2000, like a milestone, but the 2000th year. It has a beginning, a middle, and an end.

The difference between cardinal numbers and ordinal numbers bears on the Century Dispute without exactly explaining it. Cardinal numbers say how much or how many: 12 eggs, $4.73, 21 years old. Ordinal numbers tell position in a sequence: 2nd base, Fifth Avenue, in her 22nd year. The number of this page is an ordinal. So is a highway mile marker or a street address. 9 East Tenth Street combines two sequences. The names of years--A.D. 1, 1066, 1455, 2000--are ordinal numbers masquerading as cardinals. A date links three together: 1 January of A.D. 1 is the first day of the first month of the first year.

Circa A.D. 525, a Scythian monk named Dionysius Exiguus ("Dennis the Humble") invented the Christian Era, though he calculated it four to seven years too late, according to most Biblical scholars. The Venerable Bede, the first great English historian (672/73-735), popularized it two hundred years later. Both used ordinal forms to name dates. Bede's Ecclesiastical History of the English People (Book I, chapter ii) contains the first written expression of the era we now call B.C.: "ante uero incarnationis Dominicæ tempus anno sexagesimo" ("in the sixtieth year before the time of the incarnation of the Lord," which is 61 B.C.). Those ancients knew the true names of years. We have forgotten.

An Impartial Despot
In the jumble of numbers that define our world, calendar time stands apart. Figures to express quantities and measurements march in lockstep, large complete units followed in descending order by small complete units and fractions: 5 ft. 6-1/2 in., $29.95, 98.6 degrees, 1 tablespoon 1-1/2 teaspoons. The key word there is "complete."
That impartial despot Father Time skates silently by to a more subtle but equally relentless rhythm. Calendar time defines a day by nesting it within an incomplete month, that within an incomplete year, and so on. If this is unclear, try to reverse the process; try to write a date as if it were a dimension. As I type these words it is the twenty-sixth day of December in the 1999th year, but each of those periods is incomplete. The last full year is not 1999, but 1998; the last full month, November; and the last full day, Christmas. Today is not 12/26/1999, but 1998 years + 11 months + 25 days + 22 hours, 39 minutes. Read dates this way, forever looking backwards, and the century will come out even, but can you keep track of next week's appointments?
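As a rough sketch of that reversal (illustrative only; the function name is mine), the same date can be rewritten measurement-style, as a stack of completed units:

```python
from datetime import datetime

def as_completed_units(now: datetime) -> str:
    """Express a moment as the complete units behind it, the way a measurement is written."""
    full_years  = now.year - 1       # last complete year
    full_months = now.month - 1      # complete months so far this year
    full_days   = now.day - 1        # complete days so far this month
    return (f"{full_years} years + {full_months} months + {full_days} days "
            f"+ {now.hour} hours, {now.minute} minutes")

print(as_completed_units(datetime(1999, 12, 26, 22, 39)))
# 1998 years + 11 months + 25 days + 22 hours, 39 minutes
```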

A Moving Finger
A measurement is like a cairn, three stones that a hiker stacks to mark the trail. A day is like a note within a melody within a song, all sounding at once. The present is never complete. Time, the fastest thing there is, constantly presses forward. Each new instant advances the second, the day, the century, the geological era at exactly the same rate, a great plume of past time streaming out behind us balanced on the indivisible moment of the present. Omar Khayyam's moving finger "having writ, moves on."

Geniuses
The question remains: Why? What's the point of two mutually exclusive systems? The paleontologist Alexander Marshack has argued persuasively that Paleolithic stones and bones marked with scratches resembling calibrated scales are actually primitive lunar tallies, thus pushing the calendar back at least 35,000 years. Should we be surprised? Our forebears knew the night sky better than nearly all of us. It spelled danger and elicited fear, no doubt, but the heavenly motions also seemed to announce tides and seasons, the growth of plants, the movements of animals. Those brilliant ancestors invented time, language, number, music, art, myth, and the sense of existence within an unfathomable universe. They balanced on the unspooling tightrope of time just as we do. The most observant among them must have discovered whole almanacs full of natural phenomena.

So how did they know to count by ordinal numbers? They didn't. A calendar is a map in time, and the "here" of that map is the present. Long before they had numbers, our heroes--in many times and places--must have begun to divide the seamless flow: today, yesterday, tomorrow. They could only note each day as they lived through it, whether they measured its beginning from midnight, dawn, noon, or the appearance of three stars in the twilight sky. You would have done the same.

Does It Matter?
Each of our two ways of numbering springs from necessity. Each works flawlessly, except when we confuse them and produce the indestructible misunderstanding called the Century Dispute.

Does it matter? No, for this 2000-year cycle has no relation to nature. Even its reckoning, calculated with the best of intentions, is off by four to seven years. Hillel Schwartz, author of Century's End, delights in pointing out that any moment is a millennium away from something. The turn of this 1999th year is an arbitrary scratch in a continuum which has lasted 4.56 billion years since Earth's formation and 12 to 15 billion years since the Big Bang. The rocks, oceans, and skies are not humming in a terrestrial crescendo as the hour mounts to midnight. The wheeling heavens will not stumble one nanosecond as our puny clocks tick over. Much of humanity already has shrugged its collective shoulders. The rest of the animal kingdom munches on, completely unconcerned.

Counting the Minutes & Seconds

"What a funny watch!" Alice remarked. "It tells the day of the month, and doesn't tell what o'clock it is!"

"Why should it?" muttered the Hatter. "Does your watch tell you what year it is?"

"Of course not," Alice replied very readily: "but that's because it stays the same year for such a long time together."

Clock time is embedded within calendar time, yet we usually count it just like measurements. What's going on? With her characteristic common sense, Alice recognized one likely reason. Hours are short; years are long. Still more important, calendar time is ancient, and clock time is recent. The first public mechanical clocks were erected in Italy around 1300. Accurate clocks that could count minutes and seconds were theoretical ideals until the technological revolution of the mid-seventeenth century. Only in the nineteenth century did the rise of railroads and the need for reliable schedules produce widespread precision in timekeeping.

Calendar time is 35,000 years old or more; accurate clock time, less than 400 years. No wonder there is a mismatch. Of course those seventeenth-century rationalists would simplify clock time by adding it up like other measures. Of course they would follow the example of the modern clock face itself, which graphically displays the passage of time as sums of hours, minutes, and seconds. And of course they would leave calendar time alone. What possible benefit could there be in changing it except at century's end?

Does it matter? Yes, if only because, once the beautiful rigor of number can be so trivially cast aside, all clarity inevitably will suffer.

2000
At this moment of midnight, December 31st, 1999, few who observe our calendar--not even we Oysters--are immune to the magical number 2000. We bid adieu to both the glories and the profound disasters of the 1900s. The twenty-first century and the third millennium will begin exactly one leap year, 52-2/7 weeks, 366 days, 8,784 hours, 527,040 minutes, 31,622,400 seconds later. Celebrate twice, and work, as ever, for a bright future.
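(For anyone who cares to check the arithmetic, a two-line sketch confirms those equivalences.)

```python
days = 366                                      # one leap year
print(days / 7, days * 24, days * 1440, days * 86400)
# 52.285714285714285 8784 527040 31622400 -- i.e. 52-2/7 weeks, 8,784 hours,
# 527,040 minutes, and 31,622,400 seconds
```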

Copyright 2000 Laurence McGilvery