Readit News
gcp123 · a year ago
Author makes a good point. "1700s" is both more intuitive and more concise than "18th century". The very first episode of Alex Trebeck's Jeopardy in 1984 illustrates how confusing this can be:

https://www.youtube.com/watch?v=KDTxS9_CwZA

The "Final Jeopardy" question simply asked on what date did the 20th century begin, and all three contestants got it wrong, leading to a 3-way tie.

runarberg · a year ago
In Icelandic, 1-based counting towards a number is used almost everywhere. People do indeed say “the first decade of the 19th century” to refer to the 18-aughts, and the 90s is commonly referred to as “the tenth decade”. The same is done with age ranges: people in their 20s (or 21-30, more precisely) are said to be þrítugsaldur (in the thirty age). Even the hour is sometimes counted towards (though this is rarer among young folks): “að ganga fimm” (going 5) means 16:01-17:00.

Speaking for myself, this doesn’t become any more intuitive the more you use it; people constantly confuse decades, get insulted by age ranges, and get freaked out when suddenly the clock is “going five”. People are actually starting to refer to the 90s as nían (the nine) and the 20-aughts as tían (the ten), though I don’t think it will stick. When I want to be unambiguous and non-confusing I usually add -og-eitthvað (and something) as a suffix to a year ending with zero, so the 20th century becomes nítjánhundruð-og-eitthvað, the 1990s nítíu-og-eitthvað, and a person in their 20s (including 20) tuttugu-og-eitthvað.

tempodox · a year ago
Logic is in short supply and off-by-one errors are everywhere. Most people don't care. I think it's more doable to learn to just live with that than to reprogram mankind.
ryukoposting · a year ago
The publishing industry already has style guides for large swaths of the industry.

Imagine, for a moment, that AP adopted the OP's "don't count centuries" guidance. An enormous share of English-language publishing outfits would conform to the new rule in all future publications. Within a couple months, a large share of written media consumption would adopt this improved way of talking about historical time frames.

The best part? Absolutely no effort on the part of the general public. It's not like the OP is inventing new words or sentence structures. There's zero cognitive overhead for anyone, except for a handful of journalists and copywriters who are already used to that kind of thing. It's part of their job.

I think a lot of people take these sorts of ideas to mean "thou shalt consciously change the way you speak." In reality, we have the systems in place to make these changes gradually, without causing trouble for anyone.

If you don't like it, nobody's trying to police your ability to say it some other way - even if that way is objectively stupid, as is the case with counting centuries.

phkahler · a year ago
Nobody's asking to reprogram anyone. Just stop using one of the two conventions. The reason to do it is simple and obvious. I'm really baffled at the responses here advocating strongly for the current way. But I guess that's just a "people thing".
swyx · a year ago
oooor we can slowly migrate towards sensibility as we did celsius and centimeters
zepolen · a year ago
That's what most people think and the world keeps trucking along.

It's the rare people that don't who actually change the world.

dgb23 · a year ago
Specifically I agree, but generally I disagree. I’m very glad we got the metric system, standards for commonly used protocols and so on.
jodrellblank · a year ago
What's "more logical" about "the seventeenth century" compared to "the sixteen hundreds"?
adamomada · a year ago
You just made me realize that the common saying “the eleventh hour” isn’t what anyone thinks it is
leereeves · a year ago
> I think it's more doable to learn to just live with that than to reprogram mankind.

Why not just fix the calendar to match what people expect?

There was no time when people said "this is year 1 AD". That numbering was created retroactively hundreds of years later. So we can also add year 0 retroactively.

jrockway · a year ago
On the other hand "1700s art" sounds like trash compared to "18th century art".
burkaman · a year ago
I think that's good, because it helps you realize that categorizing art by century is kind of arbitrary and meaningless, and if possible it would be more useful to say something like "neoclassical art from the 1700s". "18th century" isn't an artistic category, but it kind of sounds like it is if you just glance at it. "Art from the 1700s" is clearly just referring to a time period.
bandyaboot · a year ago
If using “1700s”, I’d write it as “art of the 1700s”.
rz2k · a year ago
How about if you say "settecento"? Maybe it is a new confusion that they drop a thousand years, and maybe it would imply Italian art specifically.

cm2187 · a year ago
And 1700s already has a different meaning, i.e. early 18th century.

semireg · a year ago
The right answer was, and still is: Jan 1, 1901
glitcher · a year ago
Incorrect, this answer wasn't given in the form of a question ;)
readthenotes1 · a year ago
How can that be if 15 of those centuries are on the Julian calendar?
hgomersall · a year ago
No, the first century began Jan 1, 0000. Whether that year actually existed or not is irrelevant - we shouldn't change our counting system in the years 100, 200 etc.
d0mine · a year ago
There is no "0" year, 1 is the 1st year, so 100th year is still the 1st century, therefore 2nd century starts in 101 and 20th in 1901.
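The two conventions being argued over in this subthread differ only at the boundary years. A sketch (Python, with function names of my own invention):

```python
def century_traditional(year: int) -> int:
    """1-based convention: years 1-100 are the 1st century,
    101-200 the 2nd, ... so 1901-2000 is the 20th."""
    if year < 1:
        raise ValueError("this convention has no year 0")
    return (year - 1) // 100 + 1

def century_round_number(year: int) -> int:
    """Round-number convention: each century starts at a year
    ending in 00, so 1900-1999 is "the 20th century" and
    2000 opens the 21st."""
    return year // 100 + 1

# The conventions agree everywhere except years divisible by 100:
assert century_traditional(1901) == 20 and century_round_number(1901) == 20
assert century_traditional(2000) == 20   # traditional: 2000 closes the 20th
assert century_round_number(2000) == 21  # round-number: 2000 opens the 21st
```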
notfed · a year ago
I find this decree frustrating. Someone could have just as easily said "the 'first' century starts at 1 BC" to account for this.
coldtea · a year ago
Doesn't matter, we can just agree the first century had 99 years, and be done with it.

We have special rules for leap years, that would just be a single leap-back century.

At the scale of centuries, starting the 2nd century at 100 rather than 101 is just a 1% error, so we can live with it. For what we use centuries for (not math, but talking roughly about historical eras) it's inconsequential anyway.

hypertele-Xii · a year ago
What? 0 is the year Jesus Christ was born.
ajuc · a year ago
Depends on the language. Century being 3 syllables really makes it long in English, but it's still 5 syllables vs 5 syllables.

In Polish: [lata] tysiącsiedemsetne (6 [+2] syllables) vs osiemnasty wiek (5 syllables).

masswerk · a year ago
So shouldn't this be the "0-episode"? ;-)

(0, because only after the first question, we have actually 1 episode performed. Consequently, the 1-episode is then the second one.)

zozbot234 · a year ago
1700s means 1700–1709, i.e. roughly the first decade in the 18th century. Just like '2000s'. The OP acknowledges this issue and then just ignores it.
Viliam1234 · a year ago
I have a solution that would work in writing, but not sure how to pronounce it:

1700s means 1700–1709

1700ss means 1700–1799

To go one step further:

2000s means 2000-2009

2000ss means 2000-2099

2000sss means 2000-2999
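Purely for fun, the proposal is mechanical enough to sketch (Python, function name mine): the number of trailing s's says how many trailing digits of the year are wildcards.

```python
def span(label: str) -> range:
    """Interpret the suggested convention:
    '1700s' -> 1700-1709, '1700ss' -> 1700-1799, '2000sss' -> 2000-2999."""
    digits = label.rstrip("s")
    n_s = len(label) - len(digits)   # how many trailing s's
    base = int(digits)
    width = 10 ** n_s                # each extra 's' widens the span tenfold
    start = base - base % width      # snap down to the span boundary
    return range(start, start + width)

assert span("1700s") == range(1700, 1710)
assert span("1700ss") == range(1700, 1800)
assert span("2000sss") == range(2000, 3000)
```

Pronunciation, as the parent notes, remains an open problem.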

swyx · a year ago
that is fascinating trivia. you could do a whole Jeopardy on Jeopardy facts alone
drivers99 · a year ago
There are numerous common concise ways to write the 18th century, at the risk of needing the right context to be understood, including “C18th”, “18c.”, or even “XVIII” by itself.
anyfoo · a year ago
These are even more impractical, so I wonder what your point is? I can come up with an even shorter way to say 18th century, by using base26 for example, so let's denote it as "cR". What has been gained?
Kwpolska · a year ago
> very first

It’s actually the second.

> Trebeck's

Trebek's*

card_zero · a year ago
Let's reform Alex Trebek's name, it's difficult.
dantyti · a year ago
What about languages that don’t have an equivalent to “the Xs” for decades or centuries?

Also, 1799 is obviously more than 1700, as is 1701 > 1700 – why should the naming convention tie itself to the lesser point? After one’s third birthday, a person is starting their fourth year, not living in their third year.

I feel this is relevant https://xkcd.com/927/

darby_nine · a year ago
> Author makes a good point. "1700s" is both more intuitive and more concise than "18th century".

Yea, but a rhetorical failure. This sounds terrible and far worse than alternatives.

If we want a better system we'll need to abandon either the day or the Gregorian (Julian + drift) calendar.

milliams · a year ago
It's easy, we should have simply started counting centuries from zero. Centuries should be zero-indexed, then everything works.

We do the same with people's ages. For the entire initial year of your life you were zero years old. Likewise, from years 0-99, zero centuries had passed so we should call it the zeroth century!

At least this is how I justify zero-indexing to my students. Everyone's fought the x-century vs. x-hundreds confusion before, so they welcome the relief.

Izzard had the right idea: https://youtu.be/uVMGPMu596Y?si=1aKZ2xRavJgOmgE8&t=643

mmmmmbop · a year ago
> We do the same with people's ages.

No, we don't.

When we refer to 'the first year of life', we mean the time from birth until you turn 1.

Similarly, you'd say something like 'you're a child in the first decade of your life and slowly start to mature into a young adult by the end of the second decade', referring to 0-9 and 10-19, respectively.

Uehreka · a year ago
> No, we don't.

But practically speaking we usually do. I always hear people refer to events in their life happening “when I was 26” and never “in the 27th year of my life”. Sure you could say the latter, but practically speaking people don’t (at least in English).

jcelerier · a year ago
The first year of life is the year indexed with zero, just like the first centimeter/inch in a ruler is the centimeter/inch indexed with zero
kelnos · a year ago
> When we refer to 'the first year of life', we mean the time from birth until you turn 1.

Sure, but no one ever uses that phrasing after you turn one. Then it's just "when they were one", "when they were five", whatever.

So sure, maybe we can continue to say "the 1st century", but for dates 100 and later, no more.

furyofantares · a year ago
On your sixth birthday we put a big 5 on your cake and call you a 5 year old all year.

Can't say I've ever had to refer to someone's first year or first decade of their life, but sure I'd do that if it came up. Meanwhile, 0-indexed age comes up all the time.

daynthelife · a year ago
My preference is semi-compatible with both conventions:

First = 0
Second = 1
Toward = 2
Third = 3
…

This way, the semantic meaning of the words “first” (prior to all others) and “second” (prior to all but one) are preserved, but we get sensical indexing as well.

Tade0 · a year ago
> We do the same with people's ages. For the entire initial year of your life you were zero years old.

This wasn't the case in South Korea until recently:

https://www.aljazeera.com/news/2023/6/28/why-are-south-korea...

kstrauser · a year ago
We don't 0-index people's ages. There are a million books about "baby's first year", while they're still 0 years old.
Terretta · a year ago
Except we do, as soon as we need the next digit.

In "figure of speech", or conventional use, people start drinking in their 21st year, not their 22nd. In common parlance, they can vote in their 18th year, not their 19th.

We talk of a child in their 10th year as being age 10. Might even be younger. Try asking people if advice about a child in their "5th year of development" means you're dealing with a 5-year-old. Most will say yes.

So perhaps it's logical to count from zero when there's no digit in the magnitude place, because you haven't achieved a full unit till you reach the need for the unit. Arguably a baby at 9 months isn't in their first year as they've experienced zero years yet!

Similarly "centuries" don't have a century digit until the 100s, which would make that the 1st century and just call time spans less than that "in the first hundred years" (same syllables anyway).

It's unsatisfying, but solves the off by one errors, one of the two hardest problems in computer science along with caching and naming things.

bmacho · a year ago
If you point at year-long intervals, then those will indeed be year-long intervals.

Nevertheless the traditional "how old are you" system uses a number 1 less.

User23 · a year ago
We also talk about someone’s first day at work during that day.
bitwize · a year ago
You are, in general, n-1 years old in your nth year. Only when you complete your nth year do you turn n years old.
mixmastamyk · a year ago
How old is the baby now? Six months…
copperx · a year ago
Ages are zero indexed, but people avoid saying zero by counting age in months in year 0, then switch to years in year one.
OJFord · a year ago
Yeah we do, because their 'first year' isn't their age. We do their age in (also zero-indexed) months/weeks/days.

In Indian English terms, we do 'complete' age - aiui more common in India is to one-index, i.e. you're the age of the year you're in, and to disambiguate you might hear someone say they're '35 complete', meaning they have had 35 anniversaries of their birth (36 incomplete).

drewcoo · a year ago
> we should have simply started counting centuries from zero

Latin, like Lua, is 1-indexed.

IshKebab · a year ago
I feel like the Romans had an excuse for that mistake. Not sure about Lua.
usrusr · a year ago
And look at the bloodshed that caused!

https://douglasadams.com/dna/pedants.html

munchler · a year ago
If people start saying “zeroth century”, it’s only going to create confusion, because “first century” will then become ambiguous.
eddieroger · a year ago
Your metaphor is comparing apples and oranges. When we count life, it's "one year old" or "aged one year," both of which mark the completion of a milestone. Using the term "18th century" encompasses the whole span, which is a different use case. When one recollects over the course of someone's life, as in a memoir, it would be normal to say "in my 21st year," referring to the time between turning 20 and turning 21.
TheAceOfHearts · a year ago
Another example of confusing numeric systems emerges from 12-hour clocks. For many people, asking them to specify which one is 12AM and which one is 12PM is likely to cause confusion. This confusion is immediately cleared up if you just adopt a 24-hour clock. This is a hill I'm willing to die on.
flakes · a year ago
A few months ago, my girlfriend and I missed a comedy show because we showed up on the wrong day. The ticket said Saturday 12:15am, which apparently meant Sunday 12:15am, as part of the Saturday lineup. Still feel stupid about that one.
wasteduniverse · a year ago
That sounds like their error. If the ticket said Saturday 2am, you wouldn't show up at Sunday 2am, right?
runarberg · a year ago
You usually know it from context, and if not, 12 noon or 12 midnight is quite common.

But I do wish people would stop writing schedules in the 12-hour system. You get weird stuff like "bold means PM" to compensate for the space inefficiency of the 12-hour system.

rootusrootus · a year ago
> just adopt a 24-hour clock. This is a hill I'm willing to die on.

I don't know if I feel that strongly about it but I tend to agree. I see more value in adopting a 24 hour clock than making SI mandatory. AM/PM is silly.

rbits · a year ago
That's not inherent to AM/PM; it's because, for some reason, AM/PM switches on the boundary from 11 to 12, not from 12 to 1. To fix this while still having AM work with 24-hour time, it should be 0 instead of 12.
krackers · a year ago
Yeah, it's just weird that a new day starts at 12am, and that the jump from am to pm happens from 11 to 12. If we made the day start at 0am and noon start at 0pm, things would just be logical. If you take x am to be x hours since the new-day marker (or x pm to be x hours since noon), then you'd naively expect 12am to be noon. You could argue that we work modulo 12, but the extra am/pm bit means that it's not fully the same as working in Z/12Z.
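The quirk is concrete enough to sketch (Python, names mine): converting a 12-hour reading to a 24-hour hour is pure modular arithmetic except for the special case for 12.

```python
def to_24h(hour12: int, meridiem: str) -> int:
    """Convert a 12-hour clock reading to a 24-hour hour (0-23).

    The oddity: "12 AM" means hour 0 (midnight) and "12 PM" means
    hour 12 (noon), so 12 must be mapped to 0 before adding the
    PM offset. With a 0am/0pm convention this branch would vanish.
    """
    if not 1 <= hour12 <= 12:
        raise ValueError("12-hour clock hours run from 1 to 12")
    base = 0 if hour12 == 12 else hour12  # the off-by-twelve special case
    return base + (12 if meridiem.upper() == "PM" else 0)

# The confusing endpoints:
assert to_24h(12, "AM") == 0   # midnight
assert to_24h(12, "PM") == 12  # noon
assert to_24h(11, "PM") == 23
```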
cvdub · a year ago
That’s why I always say “12 noon” when writing out scheduling instructions for something at 12PM.
wryoak · a year ago
I thought this article was railing against the lumping together of entire spans of hundreds of years as being alike (i.e., we lump together 1901 and 1999 under the name "the 1900s" despite their sharing only numerical similarity), and I was interested until I learned the author's real, much less interesting intention.
endofreach · a year ago
Many people find their own thoughts more interesting than those of others. Some write. Many don't.

MarkLowenstein · a year ago
A lot of this runaround is happening because people get hung up on the fact that the "AD" era began as AD 1. But that year is not magic--it didn't even correlate with the year of Jesus's birth or death. So let's just start the AD era a year before, and call that year "AD 0". It can even overlap with BC 1. BC 1 is the same as AD 0. Fine, we can handle that, right? Then the 00s are [0, 100), 100s are [100, 200), etc. Zero problem, and we can start calling them the 1700s etc., guilt free.
jrockway · a year ago
I would also accept that the 1st century has one less year than later centuries. Everyone said Jan 1, 2000 was "the new millennium" and "the 21st century". It didn't bother anyone except Lua programmers, I'm pretty sure.
tetris11 · a year ago
> It didn't bother anyone except Lua programmers, I'm pretty sure.

What's this reference to? Afaik, Lua uses `os.time` and `os.date` to manage time queries, which is then reliant on the OS and not Lua itself

arp242 · a year ago
Things like "17th century", "1600s", or "1990s" are rarely exact dates, and almost always fuzzy. It really doesn't matter what the exact start and end day is. If you need exact dates then use exact dates.

A calendar change like this is a non-starter. A lot of disruption for no real purpose other than pleasing some pedants.

jhbadger · a year ago
Exactly. Historians often talk about things like "the long 18th century" running from 1688 (Britain's "Glorious Revolution") to 1815 (the defeat of Napoleon), because it makes sense culturally to have periods that don't fit exact 100-year chunks.

https://en.wikipedia.org/wiki/Long_eighteenth_century

card_zero · a year ago
This reminds me that centuries such as "the third century BC" are even harder to translate into date ranges. That one's 201 BC to 300 BC, inclusive, backward. Or you might see "the last quarter of the second millennium BC", which means minus 2000 to about minus 1750. [Edit: no it doesn't.]

In fact archeologists have adapted to writing "CE" and "BCE" these days, but despite that flexibility I've never seen somebody write a date range like "the 1200s BCE". But they should.

arp242 · a year ago
Some people have proposed resetting year 1 to 10,000 years earlier, so the current year would be 12024. This way you can have pretty much all of recorded human history in positive dates while remaining mostly compatible with the current system. It would certainly be convenient, but I don't expect significant uptake any time soon.

For earlier dates "n years ago" is usually easier, e.g. "The first humans migrated to Australia approximately 50,000 years ago".
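A sketch of that proposal (often called the Holocene or "human era" calendar; Python, names mine). The conversion is pure addition, which is exactly why it stays mostly compatible with the current system:

```python
HOLOCENE_OFFSET = 10_000  # proposed shift: 1 CE becomes 10001 HE

def to_holocene(year: int) -> int:
    """Convert an astronomical year number (1 BCE = 0, 2 BCE = -1, ...)
    to the proposed Holocene-era year.  No branch for the BCE/CE
    boundary is needed, unlike with the current era split."""
    return year + HOLOCENE_OFFSET

assert to_holocene(2024) == 12024   # the comment's example
assert to_holocene(-7000) == 3000   # 7001 BCE: early agriculture, still positive
```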

localhost8000 · a year ago
> you might see "the last quarter of the second millennium BC", which means minus 2000 to about minus 1750.

From comparing some online answers (see links), I'd conclude that even though the numbers are ordered backward, "first"/"last"/"early"/"late" would more commonly be understood to reference the years' relative position in a timeline. That is, "minus 2000 to about minus 1750" would be the first quarter of the second millennium BC.

https://en.wikipedia.org/wiki/1st_century_BC (the "last century BC")
https://www.reddit.com/r/AskHistorians/comments/1akt4zm/this...
https://www.quora.com/What-is-the-first-half-of-the-1st-cent...
https://www.quora.com/What-is-meant-by-the-2nd-half-of-the-5...
etc.

pictureofabear · a year ago
We're too deep into this now. Imagine how much code would have to be rewritten.
hypertele-Xii · a year ago
Imagine how much code is being rewritten all the time for a variety of reasons. Code is live and must be maintained. Adding one more reason isn't much of a stretch.

hgomersall · a year ago
This is the right answer.
wavemode · a year ago
I do tend to say "the XX00s", since it's almost always significantly clearer than "the (XX+1)th century".

> There’s no good way to refer to 2000-2009, sorry.

This isn't really an argument against the new convention, since even in the old convention there was no convenient way of doing so.

People mostly just say "the early 2000s" or explicitly reference a range of years. Very occasionally you'll hear "the aughts".

greenbit · a year ago
How about the 20-ohs?

Think of how individual years are named. Back in, say, 2004, "two thousand and four" was probably the most prevalent style. But "two thousand and ..." is kind of a mouthful, even if you omit the 'and' part.

Over time, people will find a shorter way. When 2050 arrives, how many people are going to call it "two thousand and fifty"? I'd almost bet money you'll hear it said "twenty fifty". Things already seem to be headed this way.

The "twenty ___" style leads to the first ten years being 20-oh-this and 20-oh-that, so there you have it, the 20-ohs.

(Yes, pretty much the same thing as 20-aughts, gotta admit)

conception · a year ago
The 2000-2009’s are the aughts!
kmoser · a year ago
I think you mean "twenty-aughts" (to differentiate them from the nineteen-aughts, 1900-1909).
buzzy_hacker · a year ago
The noughties!

savanaly · a year ago
You can always just say "the 2000s" for 2000-2010. If the context is such that you might possibly be talking about the far future then I guess "the 2000s" is no longer suitable, but how often does that happen in everyday conversation?
networked · a year ago
> This leaves ambiguous how to refer to decades like 1800-1809.

There is the apostrophe convention for decades. You can refer to the decade of 1800–1809 as "the '00s" when the century is clear from the context. (The Chicago Manual of Style allows it: https://english.stackexchange.com/a/299512.) If you wanted to upset people, you could try adding the century back: "the 18'00s". :-)

There is also the convention of replacing parts of a date with "X" characters, an em dash ("—"), or an ellipsis ("…") in fiction, like "in the year 180X". It is less neat, but unambiguous about the range when it's one "X" per digit. (https://tvtropes.org/pmwiki/pmwiki.php/Main/YearX has an interesting collection of examples. A few give the century, decade, and year and omit the millennium.)

Edit: It turns out the Library of Congress has adopted a date format based on ISO 8601 with "X" characters for unspecified digits: https://www.loc.gov/standards/datetime/.

dkdbejwi383 · a year ago
How do you speak it though? The “oh ohs”? “Noughts”? “Zeroes”?
PopAlongKid · a year ago
Aughts.
BurningFrog · a year ago
Immigrating from a country that uses "1700s", it probably took a decade before I had internalized subtracting 1 to get the real number.

I will resent it till I die.

Too · a year ago
Here we say something like the "nineteen-hundred-era" for the 1900s, "nineteen-hundred-ten-era" for the 1910s, "nineteen-hundred-twenty-era", etc. In writing: 1900-era, 1910-era, 1920-era. The most recent decades are referred to with only the decade, like "70-era" for the 70s. The word for age/epoch/era in our language is a lot more casual in this setting.

The 20xx vs 200x distinction does indeed leave some room for ambiguity in writing; verbally most people say 20-hundred-era vs 20-null-null-era.

bowsamic · a year ago
I find it weird when people take a long time for these little things. My wife still struggles with the German numbers (85 = fünfundachtzig) and the half thing with time (8:30 = halb neun) even though I managed to switch over to those very quickly. I think it depends on the person how hard it is
BurningFrog · a year ago
I think this one was hard for me because centuries don't come up that often.

If it was something I saw/heard every day, I would have adapted much faster.

dinkumthinkum · a year ago
Resent is a little strong isn’t it? I feel like today we have such a victim culture that we are oppressed by the most trivial of matters.