Author makes a good point. "1700s" is both more intuitive and more concise than "18th century".
The very first episode of Alex Trebeck's Jeopardy in 1984 illustrates how confusing this can be:
In Icelandic, 1-based "counting towards" is used almost everywhere. People do indeed say "the first decade of the 19th century" to refer to the 18-aughts, and the 90s is commonly referred to as "the tenth decade". The same is done with age ranges: people in their 20s (or 21-30, more precisely) are said to be þrítugsaldur (in the thirty age). Even the hour is sometimes counted towards (though this is rarer among young folks): "að ganga fimm" (or "going 5") means 16:01-17:00.
Speaking for myself, this doesn't become any more intuitive the more you use it: people constantly confuse decades, get insulted by age ranges (and freaked out when suddenly the clock is "going five"). People are actually starting to refer to the 90s as nían (the nine) and the 20-aughts as tían (the ten), though I don't think it will stick. When I want to be unambiguous and non-confusing I usually add -og-eitthvað (and something) as a suffix to a year ending with zero, so the 20th century becomes nítjánhundruð-og-eitthvað, the 1990s nítíu-og-eitthvað, and a person in their 20s (including 20) becomes tuttugu-og-eitthvað.
Logic is in short supply and off-by-one errors are everywhere. Most people don't care. I think it's more doable to learn to just live with that than to reprogram mankind.
The publishing industry already has style guides for large swaths of the industry.
Imagine, for a moment, that AP adopted the OP's "don't count centuries" guidance. An enormous share of English-language publishing outfits would conform to the new rule in all future publications. Within a couple months, a large share of written media consumption would adopt this improved way of talking about historical time frames.
The best part? Absolutely no effort on the part of the general public. It's not like the OP is inventing new words or sentence structures. There's zero cognitive overhead for anyone, except for a handful of journalists and copywriters who are already used to that kind of thing. It's part of their job.
I think a lot of people take these sorts of ideas to mean "thou shalt consciously change the way you speak." In reality, we have the systems in place to make these changes gradually, without causing trouble for anyone.
If you don't like it, nobody's trying to police your ability to say it some other way - even if that way is objectively stupid, as is the case with counting centuries.
Nobody's asking to reprogram anyone. Just stop using one of two conventions. The reason to do it is simple and obvious. I'm really baffled at the responses here advocating strongly for the current way. But I guess that's just a "people thing"
> I think it's more doable to learn to just live with that than to reprogram mankind.
Why not just fix the calendar to match what people expect?
There was no time when people said "this is year 1 AD". That numbering was created retroactively hundreds of years later. So we can also add year 0 retroactively.
I think that's good, because it helps you realize that categorizing art by century is kind of arbitrary and meaningless, and if possible it would be more useful to say something like "neoclassical art from the 1700s". "18th century" isn't an artistic category, but it kind of sounds like it is if you just glance at it. "Art from the 1700s" is clearly just referring to a time period.
No, the first century began Jan 1, 0000. Whether that year actually existed or not is irrelevant - we shouldn't change our counting system in the years 100, 200 etc.
Doesn't matter, we can just agree the first century had 99 years, and be done with it.
We have special rules for leap years, that would just be a single leap-back century.
At the scale of centuries, starting the 2nd century at 100 as opposed to 101 is just a 1% error, so we can live with it. For the kinds of things we use centuries for (not doing math, but talking roughly about historical eras), it's inconsequential anyway.
There are numerous common concise ways to write the 18th century, at the risk of needing the right context to be understood, including “C18th”, “18c.”, or even “XVIII” by itself.
These are even more impractical, so I wonder what your point is? I can come up with an even shorter way to say 18th century, by using base26 for example, so let's denote it as "cR". What has been gained?
What about languages that don’t have an equivalent to “the Xs” for decades or centuries?
Also, 1799 is obviously more than 1700, just as 1701 > 1700 – so why should the naming convention tie itself to the lesser point? After one's third birthday, a person is starting their fourth year and is not living in their third year.
It's easy, we should have simply started counting centuries from zero. Centuries should be zero-indexed, then everything works.
We do the same with people's ages. For the entire initial year of your life you were zero years old. Likewise, from years 0-99, zero centuries had passed so we should call it the zeroth century!
At least this is how I justify to my students that zero-indexing makes sense. Everyone's fought the x-century vs x-hundreds before so they welcome relief.
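To make the off-by-one concrete, here's a quick sketch of both conventions (Python, with made-up function names):

```python
def century_traditional(year: int) -> int:
    """1-indexed: years 1-100 are the 1st century, 1901-2000 the 20th."""
    return (year - 1) // 100 + 1

def century_zero_indexed(year: int) -> int:
    """0-indexed: years 0-99 are the 0th century, 1900-1999 the 19th."""
    return year // 100

for y in (99, 100, 101, 1999, 2000):
    print(y, century_traditional(y), century_zero_indexed(y))
# 99 -> 1 vs 0; 100 -> 1 vs 1; 101 -> 2 vs 1; 1999 -> 20 vs 19; 2000 -> 20 vs 20
```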
When we refer to 'the first year of life', we mean the time from birth until you turn 1.
Similarly, you'd say something like 'you're a child in the first decade of your life and slowly start to mature into a young adult by the end of the second decade', referring to 0-9 and 10-19, respectively.
But practically speaking we usually do. I always hear people refer to events in their life happening “when I was 26” and never “in the 27th year of my life”. Sure you could say the latter, but practically speaking people don’t (at least in English).
On your sixth birthday we put a big 5 on your cake and call you a 5 year old all year.
Can't say I've ever had to refer to someone's first year or first decade of their life, but sure I'd do that if it came up. Meanwhile, 0-indexed age comes up all the time.
My preference is semi-compatible with both conventions:
First = 0
Second = 1
Toward = 2
Third = 3
…
This way, the semantic meaning of the words “first” (prior to all others) and “second” (prior to all but one) are preserved, but we get sensical indexing as well.
As a figure of speech, or in conventional use, people start drinking in their 21st year, not their 22nd. In common parlance, they can vote in their 18th year, not their 19th.
We talk of a child in their 10th year as being age 10. Might even be younger. Try asking people whether advice about a child in their "5th year of development" means you're dealing with a 5 year old. Most will say yes.
So perhaps it's logical to count from zero when there's no digit in the magnitude place, because you haven't achieved a full unit till you reach the need for the unit. Arguably a baby at 9 months isn't in their first year as they've experienced zero years yet!
Similarly "centuries" don't have a century digit until the 100s, which would make that the 1st century and just call time spans less than that "in the first hundred years" (same syllables anyway).
It's unsatisfying, but solves the off by one errors, one of the two hardest problems in computer science along with caching and naming things.
Yeah we do, because their 'first year' isn't their age. We do their age in (also zero-indexed) months/weeks/days.
In Indian English terms, we do 'complete' age - aiui more common in India is to one-index, i.e. you're the age of the year you're in, and to disambiguate you might hear someone say they're '35 complete', meaning they have had 35 anniversaries of their birth (36 incomplete).
Your metaphor is comparing apples and oranges. When we count life, it's "one year old" or "aged one year," both of which mark the completion of a milestone. Using the term "18th century" is all-encompassing of that span, which is a different use case. When one recollects over the course of someone's life, as in a memoir, it would be normal to say "in my 21st year", referring to the time between turning 20 and turning 21 years old.
Another example of confusing numeric systems emerges from 12-hour clocks. For many people, asking them to specify which one is 12AM and which one is 12PM is likely to cause confusion. This confusion is immediately cleared up if you just adopt a 24-hour clock. This is a hill I'm willing to die on.
A few months ago, my girlfriend and I missed a comedy show because we showed up on the wrong day. The ticket said Saturday 12:15am, which apparently meant Sunday 12:15am, as part of the Saturday lineup. Still feel stupid about that one.
You usually know it from context, and if not, 12 noon or 12 midnight is quite common.
But I do wish people would stop writing schedules in the 12 hour system. You get weird stuff like bold meaning PM, etc., to compensate for the space inefficiency of the 12 hour system.
> just adopt a 24-hour clock. This is a hill I'm willing to die on.
I don't know if I feel that strongly about it but I tend to agree. I see more value in adopting a 24 hour clock than making SI mandatory. AM/PM is silly.
That's not inherent to AM/PM; it's because, for some reason, the AM/PM switch happens on the boundary from 11 to 12, not from 12 to 1. To fix this while still having AM line up with 24hr time, it should be 0 instead of 12.
Yeah, it's just weird that a new day starts at 12am, and that the jump from am to pm happens from 11 to 12. If we made the day start at 0am and noon start at 0pm, things would just be logical. If you take x am to be x hours since the new-day marker (or x pm to be x hours since noon), then you'd naively expect 12am to be noon. You could argue that we work modulo 12, but the extra am/pm bit means that it's not fully the same as working in Z/12Z.
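Writing the conversion out makes the special-casing obvious; a minimal sketch, assuming "AM"/"PM" strings as input:

```python
def to_24h(hour_12: int, meridiem: str) -> int:
    """Convert a 12-hour clock reading to a 24-hour hour (0-23)."""
    if not 1 <= hour_12 <= 12:
        raise ValueError("12-hour clock runs 1-12")
    h = hour_12 % 12          # maps 12 -> 0, the special case
    return h + (12 if meridiem.upper() == "PM" else 0)

assert to_24h(12, "AM") == 0    # midnight, start of the day
assert to_24h(12, "PM") == 12   # noon
assert to_24h(11, "PM") == 23
```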
I thought this article was railing against the lumping together of entire spans of hundreds of years as being alike (ie, we lump together 1901 and 1999 under the name ”the 1900s” despite their sharing only numerical similarity), and was interested until I learned the author’s real, much less interesting intention
A lot of this runaround is happening because people get hung up on the fact that the "AD" era began as AD 1. But that year is not magic--it didn't even correlate with the year of Jesus's birth or death. So let's just start the AD era a year before, and call that year "AD 0". It can even overlap with BC 1. BC 1 is the same as AD 0. Fine, we can handle that, right? Then the 00s are [0, 100), 100s are [100, 200), etc. Zero problem, and we can start calling them the 1700s etc., guilt free.
I would also accept that the 1st century has one less year than future centuries. Everyone said Jan 1, 2000 was "the new millennium" and "the 21st century". It didn't bother anyone except Lua programmers, I'm pretty sure.
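Under that scheme the label really is just integer division, which is the whole appeal. A rough sketch (hypothetical helper, not anyone's standard):

```python
def hundreds_label(year: int) -> str:
    """Label a year by its hundred: 1776 -> '1700s', 99 -> '0s'."""
    return f"{(year // 100) * 100}s"

assert hundreds_label(1700) == "1700s"
assert hundreds_label(1776) == "1700s"
assert hundreds_label(1799) == "1700s"
assert hundreds_label(99) == "0s"
```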
Things like "17th century", "1600s", or "1990s" are rarely exact dates, and almost always fuzzy. It really doesn't matter what the exact start and end day is. If you need exact dates then use exact dates.
A calendar change like this is a non-starter. A lot of disruption for no real purpose other than pleasing some pedants.
Exactly. Historians often talk about things like "the long 18th century" running from 1688 (Britain's "Glorious Revolution") to 1815 (the defeat of Napoleon) because it makes sense culturally to have periods that don't exactly fit 100-year chunks.
This reminds me that centuries such as "the third century BC" are even harder to translate into date ranges. That one's 201 BC to 300 BC, inclusive, backward. Or you might see "the last quarter of the second millennium BC", which means minus 2000 to about minus 1750. [Edit: no it doesn't.]
In fact archeologists have adapted to writing "CE" and "BCE" these days, but despite that flexibility I've never seen somebody write a date range like "the 1200s BCE". But they should.
Some people have proposed resetting year 1 to 10,000 years earlier. The current year would be 12024. This way you can have pretty much all of recorded human history in positive dates, while still remaining mostly compatible with the current system. It would certainly be convenient, but I don't expect significant uptake any time soon.
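The conversion is deliberately trivial, which is the selling point; a one-liner sketch:

```python
def holocene_year(gregorian_year: int) -> int:
    """Proposed 'Human Era' numbering: just add 10,000 to the CE year."""
    return gregorian_year + 10000

assert holocene_year(2024) == 12024
```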
For earlier dates "n years ago" is usually easier, e.g. "The first humans migrated to Australia approximately 50,000 years ago".
> you might see "the last quarter of the second millennium BC", which means minus 2000 to about minus 1750.
From comparing some online answers (see links), I'd conclude that even though the numbers are ordered backward, "first"/"last"/"early"/"late" would more commonly be understood to reference the years' relative position in a timeline. That is, "2000 to about minus 1750" would be the first quarter of the second millennium BC.

https://en.wikipedia.org/wiki/1st_century_BC (the "last century BC")
https://www.reddit.com/r/AskHistorians/comments/1akt4zm/this...
https://www.quora.com/What-is-the-first-half-of-the-1st-cent...
https://www.quora.com/What-is-meant-by-the-2nd-half-of-the-5... etc
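For what it's worth, the arithmetic for BC spans under the usual convention (1-indexed, no year 0) is easy to write down; a small sketch:

```python
def century_bc(n: int) -> tuple[int, int]:
    """Inclusive year range of the nth century BC, later year first.
    3rd century BC -> (300, 201), i.e. 300 BC down to 201 BC."""
    return (100 * n, 100 * (n - 1) + 1)

def millennium_bc(n: int) -> tuple[int, int]:
    """2nd millennium BC -> (2000, 1001)."""
    return (1000 * n, 1000 * (n - 1) + 1)

assert century_bc(3) == (300, 201)
assert millennium_bc(2) == (2000, 1001)
# Reading the timeline forward, the "first quarter" of the 2nd millennium BC
# is then 2000 BC to 1751 BC.
```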
Imagine how much code is being rewritten all the time for a variety of reasons. Code is live and must be maintained. Adding one more reason isn't much of a stretch.
Think of how individual years are named. Back in, for example, 2004, "two thousand and four" was probably the most prevalent style. But "two thousand and ..." is kind of a mouthful, even if you omit the 'and' part.
Over time, people will find a shorter way. When 2050 arrives, how many people are going to call it "two thousand and fifty"? I'd almost bet money you'll hear it said "twenty fifty". Things already seem to be headed this way.
The "twenty ___" style leads to the first ten years being 20-oh-this and 20-oh-that, so there you have it, the 20-ohs.
(Yes, pretty much the same thing as 20-aughts, gotta admit)
You can always just say "the 2000s" for 2000-2010. If the context is such that you might possibly be talking about the far future, then I guess "the 2000s" is no longer suitable, but how often does that happen in everyday conversation?
> This leaves ambiguous how to refer to decades like 1800-1809.
There is the apostrophe convention for decades.
You can refer to the decade of 1800–1809 as "the '00s" when the century is clear from the context.
(The Chicago Manual of Style allows it: https://english.stackexchange.com/a/299512.)
If you wanted to upset people, you could try adding the century back: "the 18'00s". :-)
There is also the convention of replacing parts of a date with "X" characters or an em dash ("—") or an ellipsis ("...") in fiction, like "in the year 180X".
It is less neat but unambiguous about the range when it's one "X" for a digit.
(https://tvtropes.org/pmwiki/pmwiki.php/Main/YearX has an interesting collection of examples.
A few give you the century, decade, and year and omit the millennium.)
Edit: It turns out the Library of Congress has adopted a date format based on ISO 8601 with "X" characters for unspecified digits: https://www.loc.gov/standards/datetime/.
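A year written in that style can be expanded to a range just by substituting the extreme digits. A toy sketch (not the Library of Congress's own tooling):

```python
def year_range(yyyy: str) -> tuple[int, int]:
    """Expand a year like '180X' or '18XX' to an inclusive year range."""
    return (int(yyyy.replace("X", "0")), int(yyyy.replace("X", "9")))

assert year_range("180X") == (1800, 1809)
assert year_range("18XX") == (1800, 1899)
```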
Here we say something like the "nineteen-hundred-era" for the 1900s, "nineteen-hundred-ten-era" for the 1910s, "nineteen-hundred-twenty-era", etc. In writing: 1900-era, 1910-era, 1920-era. The most recent decades are referred to with only the "70-era" for the 70s. The word for age/epoch/era in our language is a lot more casual in this setting.
The 20xx vs 200x does indeed leave some room for ambiguity in writing, verbally most people say 20-hundred-era vs 20-null-null-era.
I find it weird when people take a long time to get used to these little things. My wife still struggles with the German numbers (85 = fünfundachtzig) and the "half" thing with time (8:30 = halb neun), even though I managed to switch over to those very quickly. I think it depends on the person how hard it is.
https://www.youtube.com/watch?v=KDTxS9_CwZA
The "Final Jeopardy" question simply asked on what date the 20th century began, and all three contestants got it wrong, leading to a 3-way tie.
It's the rare people that don't who actually change the world.
In Polish: [lata] tysiącsiedemsetne ("the seventeen-hundreds", 6 [+2] syllables) vs osiemnasty wiek ("the eighteenth century", 5 syllables).
(0, because only after the first question do we actually have 1 episode performed. Consequently, episode 1 is then the second one.)
1700s means 1700–1709
1700ss means 1700–1799
To go one step further:
2000s means 2000–2009
2000ss means 2000–2099
2000sss means 2000–2999
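If you actually implemented that suggestion, each extra "s" would widen the span by a factor of ten. A toy sketch of the proposal (it's not an existing convention):

```python
import re

def span(label: str) -> tuple[int, int]:
    """'1700s' -> (1700, 1709); '1700ss' -> (1700, 1799); '2000sss' -> (2000, 2999)."""
    m = re.fullmatch(r"(\d+)(s+)", label)
    if not m:
        raise ValueError(f"not a year-span label: {label!r}")
    start, width = int(m.group(1)), 10 ** len(m.group(2))
    return (start, start + width - 1)

assert span("1700s") == (1700, 1709)
assert span("1700ss") == (1700, 1799)
assert span("2000sss") == (2000, 2999)
```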
It’s actually the second.
> Trebeck's
Trebek's*
I feel this is relevant https://xkcd.com/927/
Yea, but a rhetorical failure. This sounds terrible and far worse than alternatives.
If we want a better system we'll need to either abandon the day or the Gregorian (Julian + drift) calendar.
Izzard had the right idea: https://youtu.be/uVMGPMu596Y?si=1aKZ2xRavJgOmgE8&t=643
No, we don't.
Sure, but no one ever uses that phrasing after you turn one. Then it's just "when they were one", "when they were five", whatever.
So sure, maybe we can continue to say "the 1st century", but for dates 100 and later, no more.
This wasn't the case in South Korea until recently:
https://www.aljazeera.com/news/2023/6/28/why-are-south-korea...
Nevertheless the traditional "how old are you" system uses a number 1 less.
Latin, like Lua, is 1-indexed.
https://douglasadams.com/dna/pedants.html
What's this reference to? Afaik, Lua uses `os.time` and `os.date` to manage time queries, which is then reliant on the OS and not Lua itself
https://en.wikipedia.org/wiki/Long_eighteenth_century
> There’s no good way to refer to 2000-2009, sorry.
This isn't really an argument against the new convention, since even in the old convention there was no convenient way of doing so.
People mostly just say "the early 2000s" or explicitly reference a range of years. Very occasionally you'll hear "the aughts".
I will resent it till I die.
If it was something I saw/heard every day, I would have adapted much faster.