Now that Brian has started the hare running on gender-neutral pronouns, I thought I’d weigh in on the old chestnut “When did the 21st century start?” (I saw this raised in a recent comments thread, but can’t locate it now). The commonsense view is that it began on 1 January 2000, and I think the commonsense view is right. Against this we get a bunch of pedants arguing that, since there was no year zero, the 1st century (of the current era) began in 1 CE and therefore included 100 CE. Granting this, the 21st century began on 1 January 2001.
The problem I have with all this is the claim that “there was no year zero”. It seems to imply that, on the first Christmas[1], the Jerusalem Post came out with a headline something like “Jesus Christ Born: Wise Men Announce New Dating System”. Since zero hadn’t been invented yet, there obviously wouldn’t have been a year 0, but on this assumption, there would have been a year 1. But of course, there wasn’t a year 1 either. In the hegemonic dating system of the time, this was 754 Ab Urbe Condita. No one would refer to dates Anno Domini for hundreds of years to come.
It’s true, of course, that when our current system of dating was first proposed by Dionysius Exiguus in the 6th century, zero still hadn’t been invented (or at least the concept hadn’t reached Christendom), and so, when years were retrospectively dated, there still wasn’t a year zero in the system proposed then. But why should we be bound by the ignorance of a 6th century monk? If it suits us to have a year zero, why shouldn’t we have one? It’s obvious that there would be some overlap, in that 0 CE would be the same year as 1 BCE, but I don’t see any problem with this. We can use 0 CE to get the centuries right and 1 BCE on the rare occasions we need to refer to historical discussion of events in this year that uses the traditional system.
When we want a really sensible system, we can follow the astronomers, who not only use 0 CE but employ negative numbers for earlier dates.
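The astronomers’ convention is easy to mechanise. Here is a minimal sketch (the function names are mine, not any standard library’s) of the conversion they use, in which 1 BCE becomes year 0, 2 BCE becomes −1, and so on:

```python
def to_astronomical(year, era):
    """Convert a historical year label to an astronomical year number.

    Astronomers keep 1 CE = 1 but renumber 1 BCE as 0, 2 BCE as -1, etc.,
    so that arithmetic across the era boundary just works.
    """
    if year < 1:
        raise ValueError("historical years start at 1 in either era")
    if era == "CE":
        return year
    if era == "BCE":
        return 1 - year  # 1 BCE -> 0, 2 BCE -> -1, ...
    raise ValueError("era must be 'CE' or 'BCE'")

def to_historical(astro):
    """Convert back: non-positive astronomical years are BCE."""
    if astro >= 1:
        return (astro, "CE")
    return (1 - astro, "BCE")
```

With this numbering, the interval from 10 BCE to 10 CE is simply 10 − (−9) = 19 years, with no phantom year to stumble over.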
Note: Although I’ve never seen anyone else put this argument, once I’d worked it out I had enough to Google on, and found this piece by Steven Dutch of the University of Wisconsin.
fn1. There’s an obvious problem to do with the day on which years are supposed to start, which I’ll skip over.
{ 55 comments }
Scott Martens 08.20.04 at 10:27 am
I’m with Hobsbawm. The 21st century started January 1st, 1992. The Gregorian calendar is just an arbitrary construct.
Mike S 08.20.04 at 10:56 am
Unless one is willing to CHANGE to a sensible system, the millenium began on January the first, 2001. By all means convince us that the common-sense idea can be rationalised and, perhaps, should be adopted; but a good idea is not the same as a rational interpretation of our current calendar.
Fergal 08.20.04 at 11:24 am
On a slightly different tack (but in the same spirit), isn’t the use of CE and BCE rather than AD and BC a bit pointless? It doesn’t change the fact that the reference point is based on the birth of someone a large group of people think of as the son of God, and, since it’s mentally jarring every time one reads it, it comes across, perhaps unintentionally, as an affectation or squeamishness about religion.
In any case, to be consistent, one should change the names of the days of the week because they implicitly validate belief in Norse (or Babylonian) gods – not even Napoleon managed to get that one to stick (unless you’re Greek or Portuguese).
John Quiggin 08.20.04 at 11:28 am
Mike, I already have “changed” and so have the majority of people, since we use the term “21st century” to refer to years starting with “20”.
I’ve merely responded to pedantic counterarguments by saying that, if this implies I need to refer to the year before 1 CE as 0 CE, I’ll do so, should the occasion ever arise.
chris 08.20.04 at 11:36 am
This was given a goodish going over at the time. There was plenty of evidence shown that previous centuries were believed by contemporaries to have started in 1701, 1801, 1901, but the majority opinion this time around seems to have been to settle for 2000 on the grounds that a. the tabloid press had already decreed it so, and b. in these enlightened times mere mathematical precision is boring and pedantic.
Des von Bladet 08.20.04 at 11:46 am
FWIW it is impossible to get into this mess in Swedish, where one refers only to, say, “1800-talet” (technically ambiguous with a decade, but that never seems to be an issue in practice).
The non-need to keep mapping this numerical range to “19th century” is itself a vast improvement in my quality of life, despite (or because of) my assorted degrees in numerate disciplines.
reuben 08.20.04 at 11:59 am
‘a good idea is not the same as a rational interpretation of our current calendar’
Hmm, which is better, a good idea, or ‘rational interpretation of our current calendar’? Think I’ll take what’s behind door number one, Bob!
(Snark aside, my point is that we don’t have to be constrained by our tools, particularly not when all we’re talking about here is a naming convention.)
‘in these enlightened times mere mathematical precision is boring and pedantic’
I’ll take clear communication over mathematical precision for its own sake any day. I suppose it would matter if our starting point was a truly precise and significant event, eg ‘the day our people invented burritos and thus made humanity smile for the first and best time’, but as the zero point for our calendar is predicated on the story of a guy born of a virgin and able to walk on water, I can’t find myself caring so much about precision.
Lee Bryant 08.20.04 at 12:42 pm
My daughter was born at dawn on 01-01-01, so that is when the C21st began – she wholeheartedly rejects the barbarity of the C20th and waited just long enough to avoid being associated with it.
Of course 2001 was the start of the C21st, just as the year 2000 was clearly the logical end of the C20th. How absurd it would be if the C20th didn’t contain the year 2000!
dsquared 08.20.04 at 1:01 pm
The 21st century started January 1st, 1992. The Gregorian calendar is just an arbitrary construct.
What about the Julian calendar? New Year’s Day round my house is April 10.
chris 08.20.04 at 1:04 pm
Reuben,
When you were about four, and you’d just learned to count and you thought it was brilliant, you would pester your mother, saying, “Mummy, listen to me count all the way to 100”. And you’d start, “1, 2, 3…” and so on, and with a reasonable slice of luck you’d get to “…97, 98, 99”, and then, triumphantly, “A HUNDRED!”
You didn’t stop on ninety-nine and explain that you weren’t going any further because the next number started with a 1, so it was in a different hundred to the one you’d been counting.
Or if you did, I hope your mum told you not to be such a clever clogs.
Mark Byron 08.20.04 at 1:08 pm
I’ll agree with fergal; when someone starts tossing around CEs, you’re flagged with the thought that the writer is making a point of being secular (or at least non-Christian). It’s usually secular academics that use CE.
There are some fights that don’t seem to be winnable. For instance, my English major wife makes sure to say “It’s I” or “It’s he” rather than “It’s me” or “It’s him.” That’s proper grammar, but the point is lost on the vast majority.
The same goes for centuries and millennia starting in XXX0 rather than XXX1; the former is correct, but you look pedantic pushing the point.
Aeon Skoble 08.20.04 at 1:25 pm
It’s not pedantry to notice that a numbered series of 100 things goes from 1 (the first thing) to 100 (the 100th thing). Calendar years are a series of numbered things (the first year-of-our-lord, the second year-of-our-lord, and so on, religiosity notwithstanding). If there are a bunch of apples on the table, you count 1 for the first apple, 5 for the fifth apple. There’s no “zeroth” apple, and the fifth one is five. The “first dozen” would be the apples numbered 1-12. The “second dozen” would begin with apple #13. That’s why the first century contained the years numbered 1-100, and the 20th 1901-2000. There’s nothing pedantic at all about clarifying this, and I think people who cry “pedantry” in this context are simply poisoning the well.
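For what it’s worth, Aeon’s ordinal convention and the popular one can each be written down in a line of code. A sketch in Python (the function names are mine, not anyone’s standard):

```python
def century_ordinal(year):
    """Ordinal convention: years 1-100 are the 1st century,
    1901-2000 the 20th, so 2000 falls in the 20th century."""
    return (year + 99) // 100

def century_popular(year):
    """Popular convention: a century is the run of years sharing the
    same leading digits, so 2000-2099 counts as the 21st century."""
    return year // 100 + 1
```

The two functions disagree only for years divisible by 100, which are precisely the disputed cases; for every other year they return the same number.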
Mark Byron 08.20.04 at 1:31 pm
Oops, that should be the latter is correct.
reuben 08.20.04 at 1:38 pm
Chris,
You’ve inadvertently made my point better than I could. (I never was a very clever boy, you see.) The pedants shouting ‘No, it started in 2001!’ are all too similar to the triumphant little boy who’s thrilled at his ability to count to 100.
Mathematically, you’re right, but that’s not what’s actually important here, because we’re not talking about math, we’re talking about a naming convention. The question actually being asked isn’t, ‘Mathematically, when does the millenium end or begin?’ It’s ‘What should we define as the beginning of this millenium?’ Those are two very different questions.
The pedants are so thrilled at their ability to count correctly that they don’t realise that this isn’t actually about numbers, it’s about a name, and that at least in this instance, an accurate count just isn’t that important, so convenient and intuitive naming is going to trump it. If a correct count really had mattered, then, because we’re actually fairly good language users whether we like it or not, most people would have waited until 2001 to celebrate the millenium.
What I’m saying here is that people aren’t stupid; they’re fairly sensible, and at least in language terms, pretty capable of recognising what is significant and what isn’t. Which isn’t the same as saying they hate math or don’t think the ability to count to 100 is important.
PS – I never did make it up to 100 myself, but due to my redneck upbringing and all the tallboy six-packs in our fridge, I was a wiz at the six times table.
Dick Thompson 08.20.04 at 1:40 pm
My modest proposal. Rename the year 1BCE to 0. Rename all the other BCE years on the formula n BCE -> -(n-1). Then we will have mapped the CE years into the rational integers, which is a much more reasonable system, and these silly questions will go away forever. Of course the ancient historians will be miffed, but how much weight can they swing? Greatest good for greatest number.
jam 08.20.04 at 1:47 pm
The pedantry inheres in insisting that a century always contains precisely 100 years. Almost no other unit of time is so inflexible. A year is normally 365 days, but occasionally it’s 366 and on one occasion was much shorter. A month can be anywhere from 28 to 31 days long. A day is normally 24 hours, but once a year there’s a 23 hour day and once a year a 25 hour day. Even a minute can sometimes be 61 seconds long. So I see nothing wrong with there having been a 99 year century somewhere along the way.
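jam’s inventory checks out against Python’s standard `calendar` module (which, note, uses the proleptic Gregorian calendar, so the short year of 1582 – or 1752 in Britain – doesn’t appear, and leap seconds and daylight-saving days are outside its model):

```python
import calendar

# Years are usually 365 days, occasionally 366.
assert not calendar.isleap(1900)  # divisible by 100 but not by 400
assert calendar.isleap(2000)      # divisible by 400

# Months run from 28 to 31 days; monthrange returns
# (weekday of the 1st, number of days in the month).
lengths = {calendar.monthrange(2003, m)[1] for m in range(1, 13)}
assert lengths == {28, 30, 31}

lengths_leap = {calendar.monthrange(2004, m)[1] for m in range(1, 13)}
assert lengths_leap == {29, 30, 31}
```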
reuben 08.20.04 at 1:50 pm
Aeon
Fair play on asking people not to overuse the term ‘pedant’, of which I’m certainly guilty. The problem, though, is that these language discussions are always full of people who point to one fact and say ‘I’ve got a fact and this makes me right’, without realising or acknowledging that questions of language usage are usually too complex to hinge on one single fact, even if, as in this case, it’s the incontrovertible one that there are 1,000 years in a millenium.
Scott Martens 08.20.04 at 2:21 pm
D^2, fair enough. I should have said anno Domini/Common Era year numbering scheme.
My grandfather’s birth was legally recorded on the Julian calendar, one of the last to ever be so recorded. I’m pretty sure New Year’s on the Julian calendar is no later than mid January on the Gregorian, unless you’re going back to the pre-Julian Roman calendar and starting the year on March 25th. The Orthodox calendar year starts on Julian January 1st.
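The drift Scott describes follows from the leap-year rules alone: the Julian calendar takes every 00 year as a leap year, the Gregorian only those divisible by 400. A sketch (the function name is mine; the formula is the standard one, valid from March of the century year in question):

```python
def julian_gregorian_gap(year):
    """Days by which the Julian calendar lags the Gregorian, for a
    given Gregorian year, derived from the differing leap-year rules."""
    c = year // 100
    return c - c // 4 - 2
```

This gives 10 for 1582 (the days Pope Gregory dropped) and 13 for the twentieth and twenty-first centuries, which is why Julian New Year currently falls on Gregorian 14 January: mid January, as Scott says.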
Motoko Kusanagi 08.20.04 at 2:21 pm
Why don’t we just accept Heribert Illig’s discovery that somewhere in the second half of the first millennium 297 fictitious years were inserted, and that we are really living in 1707 now? That way the solution to this problem can be postponed for another 10 generations.
arthur 08.20.04 at 2:50 pm
A friend of mine split the difference and held an excellent new millennium party on June 30/July 1, 2000. I think that improbably.com publicized this concept. Not only was it a sensible compromise, in the Northern hemisphere it was a much better time for a pool party than either December 31 would have been.
Stentor 08.20.04 at 3:17 pm
If we’re going by common usage rather than mathematical precision, then really the number of a year can’t tell us what millennium it belongs to. I can’t think of any meaningful social/historical phenomena that changed over precisely on January 1 of 2000 or of 2001. Remember that “the sixties” began somewhere in the middle of the set of years beginning with 196-, well after January 1 1961 or January 1 1960.
I vote that the 21st century as a useful historical epoch began (at least in America) either on January 20 2001 (Bush’s inauguration) or September 11 2001.
Richard Bellamy 08.20.04 at 3:48 pm
I’ll agree with fergal; when someone starts tossing around CEs, you’re flagged with the thought that the writer is making a point of being secular (or at least non-Christan). It’s usually secular academics that use CE.
Actually, it’s more likely JEWISH academics who will be found using CE rather than AD. AD meaning “Year of OUR Lord”, and the Jew is not included in the Us in question.
Using the same number system (2004 CE) is still based on someone else’s god, but it is a reasonable concession so people know what date you’re talking about (less translation needed than using 5760-whatever).
Just like putting “Under God” into the pledge unfairly requires the atheist to stand out in the crowd, using “AD” in common speech requires the non-Christian to stand out and identify his religion (or non-religion) in a context in which it is probably not relevant (“These Mayan ruins are of a city that was first settled in 300 CE,” said the obviously-non-Christian archeologist.)
That is why the only proper response is to purge implied-religious-constructions from our speech.
Since a non-Christian will not wish to say “A.D.”, continuing to use it when there is a well-understood alternative appears to be merely an attempt to single out the non-Christian.
aeon skoble 08.20.04 at 3:50 pm
Reuben, you’re right that it’s partly about language, but words like “century” and “millenium” are pretty well defined – indeed, even the “21st century starts in 2000” crowd uses the words _exactly_ as I do. Ask one of them what “century” means, and they’ll say “period of 100 years” just as “dozen” means “collection of 12 things.” It’s not as though there is some disagreement about what “century” means. We and they agree it means “period of 100 years,” and that’s why this works as we say, not as they say – the first was 1-100, so the second was 101-200, etc., hence the 20th must be 1901-2000. Your point would have greater validity if we were talking about vaguely-defined or fungible terms, but we’re not. “Century” and “millenium” are like “dozen,” not like “thin.”
fyreflye 08.20.04 at 4:05 pm
If the argument is framed as “pedantry” against “common sense” there can be no surprise at the majority view. Suppose we were to reframe the conflict as “those who can count to 100” against “innumerates.”
C Schuyler 08.20.04 at 4:20 pm
Great God almighty! (or, to be secular about it, What the f___?) Does anybody REALLY care about any of this?
Maybe I do care to some extent, about one bit of all this blather: “under God” strikes me as much more clearly an endorsement of religion than “A.D.”, which for the vast majority of people who use it is a meaningless abbreviation. Plus, “under God” is a state-sponsored endorsement, inflicted on children, in a somewhat coercive environment . . . . I mean Christ, this is a philosophers’ blog; a little more evidence of rational distinction-making would be appreciated.
Matt Weiner 08.20.04 at 4:23 pm
John Q wrote:
if this implies I need to refer to the year before 1 CE as 0 CE, I’ll do so, should the occasion ever arise.
It’s not that simple, unfortunately; I think your proposal requires that you subtract 1 from every BC[E] date. This would cause a lot o’confusion whenever you were talking about ancient history.
Aeon, sure, a century is a period of a hundred years, but that doesn’t settle when it starts. I can say that it was a century ago that, um, whatever the hell happened in 1904. Admittedly the way I number the centuries runs into a tiny rough spot if you want to talk about the years 1-100, but that rough spot is a small price to pay for the way referring to 2000 as the beginning of the 21st century facilitates communication with our fellow-creatures.
I was the one who brought this issue up here, BTW, and I don’t think any of the arguments on the thread have equalled my “fresh can of whoop-ass” argument. (What’s that I hear? Sounded kinda like a can being opened.)
Matt Weiner 08.20.04 at 4:25 pm
Also, wasn’t “Wise Men Announce New Dating System” the headline when they launched the Salon personals?
reuben 08.20.04 at 4:34 pm
Aeon
You’ve missed my point. The argument isn’t over how long a millenium is, it’s over when we want to declare that this millenium starts. That’s a different proposition altogether.
And if pretty much everyone agrees that this millenium started in 2000, and to some people this means that some century over the last 1000 or 2000 years has had a year stolen from it, well, frankly, who cares? It’s not as if that year actually disappeared (though if one could, I’d like to nominate 1977, when I was really fat and someone stole my Huffy).
All that’s happened is that part of the definition of ‘millenium’ has changed. That is, the definition of how long one is hasn’t altered, but our collective definition of when ‘the millenium’ starts has shifted by one.
Or, to look at it a slightly different way, you say that the definition of ‘a millenium’ is pretty well-defined. And it is. But we’re not talking about ‘a millenium’, we’re talking about ‘the millenium’. Changing the article brings notions of beginnings and endings into the definition.
So if an earlier poster is correct in saying that the last millenium was considered to begin in 1001, then that means that 1,000 or so years ago, their definition of ‘the millenium’ included the concept ‘beginning in a year that ends in 001’. But now, the common definition of ‘the millenium’ appears to include the concept ‘year that ends in 000’, even though the definition of ‘a millenium’ hasn’t changed.
Cheers
reuben 08.20.04 at 4:49 pm
Perhaps we should date events not with our current calendar, but beginning from the day when whup-ass was invented.
Or was it discovered?
aeon skoble 08.20.04 at 5:05 pm
Reuben, when you say “But now, the common definition of ‘the millenium’ appears to include the concept ‘year that ends in 000’, even though the definition of ‘a millenium’ hasn’t changed.” — doesn’t that just mean that lots of people, encouraged by a sensationalist media, are making a mistake about how to count? It’s certainly possible that large numbers of people can make the same mistake.
Matt Weiner 08.20.04 at 5:24 pm
I date it from when whoop-ass was first bottled. In a strange quirk of events, putting a carbonated liquid (whoop-ass is carbonated) in a can is known as “bottling” rather than “canning.” You can’t prove me wrong.
aeon skoble 08.20.04 at 5:52 pm
Matt, you write “sure, a century is a period of a hundred years, but that doesn’t settle when it starts,” which is true when we use indefinite articles as in “it happened a century ago.” But when we’re using the definite article with a modifier (the 1st century, the 20th century), then the descriptors must have some referent. “1st” of what? “20th” since what? These cases refer to our dating system (regardless of whether the events presupposed to have happened actually did or not): 1st century AD, which must = 1-100. The 20th of these things would thus include 2000. Obviously you know that, and you know that there’s no zeroth apple on the table, so I take your main point to be that language means whatever people want it to mean and so on. But if language means _anything_, then it’s possible for someone to make a mistake, and it’s therefore possible for _lots_ of people to make a mistake. The fact that languages evolve over time doesn’t mean that _any_ instance of lots-of-people-making-a-mistake-about-something is a-ok.
reuben 08.20.04 at 5:55 pm
Matt
No one will try to prove you wrong, because we all know what we’ll get.
Aeon
With you on the other hand, I’ll give it one more try:
‘A millenium’ is one concept.
‘The millenium’ is another concept.
The two are very closely related, but aren’t the same thing. As I said before, and as ‘Whoop ass’ Weiner implied, ‘a millenium’ is a 1,000 year period, and can start anywhere, and will always end 1,000 years later. (This section of today’s programme was entitled ‘Fun with Math!’)
When we say ‘the millenium’, however, we are referring not to just any random one-thousand-year period, but to one that covers a specific, and symbolically meaningful, period of time. What is currently happening in our culture, and presumably others, is that we are not debating what ‘a millenium’ is – all of us, even surly teenagers, agree that it is a 1,000 year period. We are, however, debating what ‘the millenium’ is.
No one has forgotten how to count, though if Matt springs that bottle of whoop ass on us, we probably will. (I’ve seen boxing on tv; I know what can happen.) The only thing that has happened is that the majority of people have harnessed the power of language to change the interpretation of ‘the millenium’ – not ‘a millenium’ – to ‘1,000-year period starting in “000”’.
We didn’t forget how to count; we just changed the way in which we define a particular concept.
digamma 08.20.04 at 6:01 pm
My daughter was born at dawn on 01-01-01, so that is when the C21st began – she wholeheartedly rejects the barbarity of the C20th and waited just long enough to avoid being associated with it.
To Europeans, the 21st century is much better than the barbaric 20th. A lot of Americans, however, are looking back at the 20th with nostalgia.
aeon skoble 08.20.04 at 6:09 pm
“We didn’t forget how to count; we just changed the way in which we define a particular concept”
Who’s “we”? I didn’t agree to change the meaning of the concept. I know people get excited about the odometer-rolling-over effect, and that’s fine. But it can’t be a new millenium til the old one is over.
Dr. Weevil 08.20.04 at 6:11 pm
Unfortunately, you’re all wrong about the meaning of “millenium”: the correct definition is “misspelling of ‘millennium'”. By the way, if a millennium is a period of 1000 years, from Latin ‘mille’ (1000) and ‘anni’ (years), then a millenium should be a group of 1000 ‘ani’, that is anuses.
reuben 08.20.04 at 6:20 pm
‘it can’t be a new millenium til the old one is over’
You just keep chanting that, Aeon.
Cheers now – I’m off to the pub to celebrate the fourth and almost three-quarters anniversary of the new millenium.
reuben 08.20.04 at 6:26 pm
So the key question here is what is the connection between the first bottled can of whoop ass and 1,000 Latin anuses?
Max Edison 08.20.04 at 6:32 pm
No European before about 1200, long after the dating scheme arose, had any concept of ZERO. That is an Arab/Indian idea, so it is meaningless in this context.
aphrael 08.20.04 at 6:43 pm
Stentor – that’s precisely the point that Hobsbawm is making when he claims January 1, 1992, as the start of the 21st century. In that view, the 20th century started on July 28, 1914, and encompassed the two world wars and the struggle between capitalism and communism; once the Soviet Union was legitimately dead, we were in a distinctly different historical era.
To the extent that the “war on terror” is the defining issue of our time, though, it is unclear whether the decade between the end of the cold war and the start of the war on terror is better treated as a coda to the former or a prelude to the latter.
aeon skoble 08.20.04 at 7:15 pm
If Europeans had no concept of zero in 1200, how did they write “1200”? In any case, this isn’t about when the Arabic “zero” entered European mathematics, it’s about the difference between numbers used arithmetically and numbers used to count. Count the apples – 1,2,3 -the first one, the second one, etc. There’s no zeroth apple. Same thing with kids – you have a “first kid” and a “second kid,” but not a zeroth kid. Counting starts with the first thing. That’s not something generally unknown, which is why it’s baffling that people get this issue wrong. The names of our years are counting-names.
Omada 08.20.04 at 8:19 pm
If Europeans had no concept of zero in 1200, how did they write “1200”?
MCC, obviously. Not much of a New Roman, are you?
aeon skoble 08.20.04 at 8:28 pm
Were they still using Roman numerals in 1200? I’m really asking: I have no idea when they stopped using Roman numerals.
In any case, though, the real point is that they had the idea of tens, hundreds, and thousands of things. The first thousand years was the first millennium, and so the second one would be the second thousand years. Obviously, we can designate _any_ year as the beginning of _some_ thousand-year period – how about right now? – but when people use the definite article, they’re referring not just to a span of a thousand years, but something more specific. That’s why the second millenium included 2000.
Dick Thompson 08.20.04 at 8:41 pm
They still use Roman numerals. If you mean doing arithmetic with them, there were still some people using them for that in the 16th century. Just like people who insist on a century starting in an 01 year, they refused to give up their old ways for better ones.
Jonathan Edelstein 08.20.04 at 9:05 pm
Actually, it’s more likely JEWISH academics who will be found using CE rather than AD. AD meaning “Year of OUR Lord”, and the Jew is not included in the Us in question.
I dunno. I always use AD – I just think of it as “year of their Lord.”
blue 08.20.04 at 10:33 pm
Calendar years are a bit like miles and birthdays. 2000 is the natural turning point, like reaching 20,000 miles on the odometer or turning 20 years old would be (and said rollover points indicate the completion, not the beginning, of mile 20,000/one’s 20th year). In 2000, if we were not in the new millennium, then we must have been in the last decade of the last century of the previous millennium. Yet as everyone knows, the 1990s ended January 1st, 2000.
That shouldn’t have stopped anyone, however, from celebrating the Eve before 2001 like it was 1999.
MaryGarth 08.20.04 at 11:05 pm
“My daughter was born at dawn on 01-01-01”–
A fine birthday!
My grandmother’s birthday was the previous 01-01-01, and she always said she was born on the first day of the first month of the first year…
agm 08.21.04 at 12:54 am
“… turning 20 years old would be…”
Really, who celebrates 20? It’s 21 that matters (unless you’re in a land where 18 is the age that matters). Thus I say the issue is settled: it must be that the new era is the year ending in a one! Pedantry is thus ended: each person has their own year 21, with the zeroth year being only nine months long, and dating becomes a personal issue =). Now that’s an idea, ~seven billion dating systems and more coming every year.
Antoni Jaume 08.21.04 at 1:03 am
//
[…]In 2000, if we were not in the new millennium, then we must have been in the last decade of the last century of the previous millennium. Yet as everyone knows, the 1990s ended January 1st, 2000
//
The 90s ended on 31 December 1999; the tenth decade of the XX century ended on 31 December 2000. Your “everyone” is what, a Bush voter?
Following that kind of reasoning, we are in the 20th century.
DSW
Dave F 08.21.04 at 1:25 am
This is a ridiculous argument. Common sense once told everyone the world was flat. But it isn’t. People (mainly media marketing people) decided to mark the year that started with a nice round number, 2000, as the start of a millennium or “the millennium” – and distinguishing between the two terms on the basis of whether a definite or indefinite article is used betrays a pedantry far more ludicrous than that of which those who can count are accused. As a sub-editor I am often accused of pedantry, but where would academics be if the newspapers sacrificed numerical accuracy about, ooh, say, body counts or voting figures (think Bush/Gore) in favour of feel-good expediency and lots of ad money? Real newspapers, anyway.
That 2+2=4 is not a pedantic assertion. It is just true.
The dismal performance of British school pupils in the “boring” subject of mathematics has been amply demonstrated by numerous learned fools on this thread.
Chris 08.21.04 at 3:28 am
John, your argument seems to miss the point that “anno domini” and “common era” actually mean something. “A.D. 0” or “0 C.E.” would refer to a time before the year Christ was born (at least for purposes of the calendar system) — a time that wasn’t actually the “year of our Lord” or the “common era” at all.
cafl 08.21.04 at 8:32 pm
“In any case, this isn’t about when the Arabic “zero” entered European mathematics, it’s about the difference between numbers used arithmetically and numbers used to count. Count the apples – 1, 2, 3 – the first one, the second one, etc. There’s no zeroth apple.”
Try that when referencing array elements in a C program and you’ll reference past the end of the array! The crackers will love you for it, however.
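cafl’s point, transposed into Python, whose lists are zero-indexed just as C arrays are (the apples here are hypothetical): counting from 1 walks off the end of the sequence, though Python raises an error where C would quietly read whatever memory comes next.

```python
apples = ["gala", "fuji", "braeburn"]

# Zero-based indexing: valid positions are 0 .. len(apples) - 1.
first = apples[0]

# One-based counting overshoots: index len(apples) is out of range.
try:
    apples[len(apples)]
except IndexError:
    overran = True
```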
Joseph Briggs 08.23.04 at 1:38 pm
Way late (in internet time; I should visit this site more often) but I just have to go on record here to agree with aeon and especially dave f. (and the surprisingly few others here who managed to state the premise clearly): This is the stupidest false dilemma ever.
Counting is not a difficult concept to grasp. Let’s start: I have 1 year, now I have 2 years… and so on. I could go on about the “common sense” purpose of calendar systems as justification as to why there should be no year zero but like I said, it’s a dumb argument. This is a great exercise in complicating a simple idea and calling it esoteric.
The mention of array elements starting with 0 is inappropriate. Arrays in programming address data in memory.
Dan 08.23.04 at 2:45 pm
Aeon has hit it right on the head, and I think Reuben and several others are missing the most fundamental mathematical point.
Calendars are a system of ordinal numbers. That is: 1st, 2nd, 3rd, etc. Zero doesn’t exist in ordinal counting. The year 2000 is called 2000 because it’s the 2000th year since a specified reference point. Every calendar ever created works this way. Whether it’s counting the twelfth year of the reign of Ragnar the Terrible or the Year of Our Lord 2000, it’s still ordinal counting, so you can’t have a year zero even if you really, really, really want one. It’s simply the nature of how calendars work.
No amount of convention, preference, attitude or even agreement can alter the purely mathematical fact that there is no such thing as zero in an ordinal number system.
You can no more refer to a “zeroeth” year than you can sensibly answer the questions of who was the “zeroeth” President or which was the “zeroeth” State admitted to the Union.
Comments on this entry are closed.