There’s also confusion between human reaction time and whether you can perceive something at all. Even if 240Hz looks slightly different, if a human can’t react to that difference (other than to say it looks nicer, which is a personal preference rather than an empirical assessment of “better”), then it doesn’t really matter anyway. Kind of like how Avatar looked different at 48Hz instead of 24Hz: at the time it was hailed as some revolution in movies, and then it came and went. Personal preference.
As a direct answer to your question, I was a gamedev from 2005 to 2012, and back then people were arguing that 120Hz couldn’t make a difference and that 60Hz was fine. It stuck with me, since it seemed mistaken. So I shouldn’t have said “generally accepted,” just “I vaguely remember the world arguing a decade or so ago that 60Hz was good enough in all situations, e.g. competitive gaming.”
It's so many things, starting with the North American electric grid (60 Hz AC), NTSC, etc.
Yes, my graphics card + monitor could theoretically run at more than 60 Hz. When I got to choose between resolution and refresh rate, I picked resolution. Hence 60 Hz.
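Rough sketch of why that's a trade-off at all, assuming the display link bandwidth is the limiting factor (my assumption, not something stated above): the uncompressed pixel data rate for a few resolution/refresh combinations.

```python
# Uncompressed pixel data rate for a few resolution/refresh combinations.
# Assumption: the display link bandwidth is what forces the trade-off.
MODES = [
    ("3840x2160", 3840 * 2160, 60),
    ("3840x2160", 3840 * 2160, 144),
    ("2560x1440", 2560 * 1440, 144),
]

for name, pixels, hz in MODES:
    gbit_per_s = pixels * hz * 24 / 1e9  # 24 bits/pixel, ignoring blanking and encoding overhead
    print(f"{name} @ {hz} Hz: ~{gbit_per_s:.1f} Gbit/s")
```

4K at 144 Hz needs well over twice the raw bandwidth of 4K at 60 Hz.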
Presumably the 3200 Hz is needed for a combination of reasons:
- Under ideal conditions, if you want less than 10% variation in the number of samples per frame at 240Hz, you may need ~2400Hz (see the sketch after this list). The effect is visible to the naked eye: you can see multiple cursor images across your field of view, and uneven spacing between them is noticeable.
- The mouse itself may work less well at a lower sampling rate.
- The OS and input stack may be poorly designed and work better at higher rates.
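To put a number on the first point, here is a minimal sketch (mine, not an established tool): count how many polls of a given rate land inside each 240Hz frame; the frame-to-frame spread of that count is the variation in question.

```python
import math

def polls_per_frame(poll_hz, frame_hz=240, frames=2000):
    """Number of polls (at times k/poll_hz) landing inside each display frame."""
    counts = []
    for f in range(frames):
        start, end = f / frame_hz, (f + 1) / frame_hz
        counts.append(math.floor(end * poll_hz) - math.floor(start * poll_hz))
    return counts

for hz in (1000, 2000, 3200, 8000):
    c = polls_per_frame(hz)
    avg = sum(c) / len(c)
    spread = (max(c) - min(c)) / avg
    print(f"{hz:>4}Hz polling: ~{avg:.1f} polls per 240Hz frame, spread ~{spread:.0%}")
```

That works out to roughly a 24% spread at 1000Hz, 12% at 2000Hz, 8% at 3200Hz and 3% at 8000Hz; the spread is about the frame rate divided by the polling rate, which is where the ~2400Hz figure for staying under 10% comes from.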
In any case, the application and cursor implementation are unlikely to ask for a mouse location more than once per frame, so the user is not really using 3200 updates per second, but that’s irrelevant.
Second, the 3200 was DPI, not Hz. I can trivially tell the difference in how far I have to move the mouse at 3200 DPI (my sweet spot with 2 4K monitors), 4800 DPI, and 6400 DPI.
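Back-of-the-envelope for that, under my assumptions (two 4K panels side by side for 7680 px of width, raw counts mapped 1:1 to pixels, no OS scaling): how far the hand has to travel to sweep the cursor across both screens.

```python
# Hand travel needed to cross two side-by-side 4K monitors (7680 px total),
# assuming raw counts map 1:1 to pixels with no OS scaling.
TOTAL_WIDTH_PX = 2 * 3840

for dpi in (3200, 4800, 6400):
    inches = TOTAL_WIDTH_PX / dpi
    print(f"{dpi} DPI: {inches:.1f} in ({inches * 2.54:.1f} cm) to cross both screens")
```

That's about 6 cm of travel at 3200 DPI versus 3 cm at 6400 DPI.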
For Hz, it was the polling rate. Even with a configured 8000 Hz polling rate (which in practice is a peak figure rather than a sustained one), I still see stalls in the 4 ms range with my hardware.
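Here's a rough way to eyeball those stalls (not the tool referenced here; pynput reports mouse-move events as delivered by the OS, which sits downstream of the USB polling itself):

```python
import time
from pynput import mouse

gaps = []
last = None

def on_move(x, y):
    # Record the time between consecutive mouse-move events.
    global last
    now = time.perf_counter()
    if last is not None:
        gaps.append(now - last)
    last = now

with mouse.Listener(on_move=on_move) as listener:
    print("Keep the mouse moving for 5 seconds...")
    time.sleep(5)
    listener.stop()

if gaps:
    gaps.sort()
    print(f"events:     {len(gaps) + 1}")
    print(f"median gap: {gaps[len(gaps) // 2] * 1000:.3f} ms")
    print(f"max gap:    {gaps[-1] * 1000:.3f} ms")
```

A sustained 8000 Hz stream would put the typical gap around 0.125 ms; multi-millisecond gaps while the mouse is continuously in motion are the stalls I mean (any pause of the hand will of course also show up as a big gap).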
As to acceleration, I disable it. To truly get rid of it at high DPI I've had to install RawAccel on Microsoft Windows.
There are other differences between the tools; mine was designed for what I wanted to understand, so I'm biased toward it.
I'm game for a randomized, blinded test of 120 Hz vs 240 Hz refresh rates. I would indeed be very curious to confirm I can tell the difference under a proper protocol.
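One obvious way to score such a test (a sketch, not an agreed protocol): run n blinded trials with the refresh rate set to 120 Hz or 240 Hz at random, record the calls, and check how likely that score would be under pure 50/50 guessing.

```python
from math import comb

def p_at_least(k, n):
    """P(X >= k) for X ~ Binomial(n, 0.5): chance of k or more correct calls by guessing."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

for n, k in [(10, 10), (10, 9), (20, 16)]:
    print(f"{k}/{n} correct: p = {p_at_least(k, n):.4f} under pure guessing")
```

Ten correct calls out of ten would already be roughly a 1-in-1000 event for someone who genuinely can't tell the rates apart.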
Many years back (we were on CRTs), I was in similar shoes, convinced my friend couldn't tell the difference between 60 Hz and 90 Hz when playing video games.
Turns out he only needed to look at the pointer through one push of the mouse to tell right away; he was right 100% of the time in a blinded experiment.
It's like lightning strokes lasting tens of microseconds that still leave a lasting impression of the scene. You don't "count" the strokes over time; you see them spread out in space.
When you make circles fast and large enough on screen, you can judge how many cursor images appear before your eyes. At 4 circles per second, is each circle made of ~60 pointers or ~30? That's belief rather than measured fact, but it's not a hard call.
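The arithmetic behind those numbers (the radius is just an assumed example, to show the spacing as well as the count):

```python
import math

radius_px = 400        # assumed circle radius on screen
revs_per_second = 4    # "4 circles per second"

for refresh_hz in (120, 240):
    images_per_rev = refresh_hz / revs_per_second
    spacing_px = 2 * math.pi * radius_px / images_per_rev
    print(f"{refresh_hz} Hz: ~{images_per_rev:.0f} cursor images per circle, ~{spacing_px:.0f} px apart")
```

Roughly 30 images ~84 px apart at 120 Hz versus 60 images ~42 px apart at 240 Hz, on a 400 px-radius circle.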