auto sum = [](auto a, auto b) { return a + b; };
but this is something else. i didn't think i'd like it at first, but actually i think i might be coming around to it. the.. dollar syntax is regrettable, although it's not a show stopper.

yes, mozilla's TOS update is a bad thing, but switching to chrome (or a chromium-based browser) over it is really cutting off your nose to spite your face.
A game running at 60 fps averages around 16 ms per frame, and good human reaction times don’t go much below 200 ms.
Users who “notice” individual frames are usually noticing when a single frame lags for the length of several frames at the average rate. They aren’t noticing anything within the span of an average frame lifetime.
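A rough sketch of that arithmetic, using the figures above (the four-frame hitch is just an illustrative number, not a measurement):

```cpp
#include <cstdio>

int main() {
    const double frame_ms    = 1000.0 / 60.0;  // ~16.7 ms per frame at 60 fps
    const double reaction_ms = 200.0;          // a good human reaction time

    // A "noticeable" hitch is one frame that takes several average frame
    // lengths -- four of them here, still well under reaction time.
    const double hitch_ms = 4.0 * frame_ms;    // ~66.7 ms

    std::printf("frame: %.1f ms, hitch: %.1f ms, reaction: %.0f ms\n",
                frame_ms, hitch_ms, reaction_ms);
    return 0;
}
```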
if you added 200ms of latency to your mouse inputs, you’d throw your computer out of the window pretty quickly.
i’d consider myself a day-to-day c++ engineer. well, because i am. i like lots of things from rust; there are a few things i don’t. c++ has a lot to learn from rust if it is to continue to exist.
but really.. isn’t this the point of the language? you need to understand the borrow checker because.. that’s why it’s here?
maybe i’m missing something.
An attached cap is also less convenient when screwing it back onto the bottle.
fully unscrew the cap, then either continue twisting it over the edge - honestly effortless - or just.. pull it off? the cap still functions as a cap afterward.
apologies, but i don’t understand the furore over this change.
I don't see how living (potentially) forever is anything but a horrible, horrible ego-driven idea with zero rational thought put behind it. You may enlighten me here:
- Unlimited human life expectancy vs. limited resources? How would that work?
- Do we really want the next dictator of XYZ to rule forever?
- The lack of control young people experience over their own lives (voting, etc.) will worsen if the median age is 80 or older.
- Saying things like "There should be no death" is a clear example to me of why humans in general are problematic. As long as we consume resources and need space, we are still part of this ecosystem and cannot simply change the rules of how it all works just because we would like to.
- I suck at Bingo.
Edit:
I just want to clarify the following:
Don't feel attacked; I am curious to hear your take on this, and I never said that I am right on this, since I know too little to ever make that claim. I wasn't aware how emotional this topic is for many people; this happens to me IRL a lot too (and I am aware of why). I am just looking for an exchange of ideas, and I don't need to be right on this.
really, i’m not trying to be mean here. you assert life must be finite, and all i’m asking is how finite it should be.
if a modern 60Hz LCD/OLED display couldn't get beneath 16.6ms of latency, then what exactly is tearing?
Yes, if you control the whole software stack, it is possible to do beam racing to get lower than one frame of latency (assuming low-latency hardware for input and display panel scanout). But I'm talking about desktop/mobile applications. In general, operating systems do not do this, and many actually make it impossible. Only very recently has it become possible to do beam racing in a windowed application (not using fullscreen exclusive mode) on Windows, with recent graphics hardware supporting multiplane overlay, and very, very few people have attempted it. I believe it is strictly impossible to do beam racing for windowed applications on macOS and Linux/Wayland. Not sure about iOS and Android.
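For what it's worth, here is roughly what a beam-racing loop looks like in the abstract. The two platform hooks (get_scanout_line, present_rows) are hypothetical stand-ins for whatever scanline-position query and partial, non-vsync'd present the OS and driver actually expose; that missing plumbing is exactly the problem described above. Frame-start synchronization and safety margins are omitted.

```cpp
#include <cstdint>

struct Framebuffer { /* pixel storage elided */ };

// Hypothetical platform hooks: a scanline-position query and a present that
// updates part of the visible surface without waiting for vblank. Very few
// systems expose both to windowed applications.
int  get_scanout_line();
void render_rows(Framebuffer& fb, int first_row, int last_row);
void present_rows(const Framebuffer& fb, int first_row, int last_row);

// Render and present the frame in horizontal slices, staying just ahead of
// scanout, so input-to-photon latency is on the order of one slice rather
// than one (or more) full frames.
void beam_race_frame(Framebuffer& fb, int height, int slice_height) {
    for (int top = 0; top < height; top += slice_height) {
        const int bottom = top + slice_height;
        render_rows(fb, top, bottom);

        // Wait until the beam has reached the slice above this one, then push
        // the new rows out; by the time scanout reaches 'top' they are already
        // in the front buffer.
        while (get_scanout_line() < top - slice_height) { /* spin or yield */ }
        present_rows(fb, top, bottom);
    }
}
```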
otherwise, yes, modern APIs go out of their way to avoid the possibility of this (the dreaded "tearing" artifacts you see when the frame buffer is changed during transmission of the video signal to the monitor). i don't believe the older techniques you've mentioned are possible at all today; they only really made sense to talk about when analogue displays were the norm.
Also, an underrated benefit of a high frame rate is that it's a universal fix for poorly written software with multiple frames of latency. An application with 10 frames of latency will be faster on a 1 kHz display than a perfectly coded application on a 60 Hz display.
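A quick back-of-the-envelope check of that claim, treating input-to-photon latency as simply frames-of-latency times the refresh period (and ignoring input polling, scanout, and panel response):

```cpp
#include <cstdio>

// Latency approximated as (frames of latency) x (refresh period).
double latency_ms(int frames_of_latency, double refresh_hz) {
    return frames_of_latency * 1000.0 / refresh_hz;
}

int main() {
    std::printf("10 frames @ 1000 Hz: %5.1f ms\n", latency_ms(10, 1000.0)); // 10.0 ms
    std::printf(" 1 frame  @   60 Hz: %5.1f ms\n", latency_ms(1, 60.0));    // ~16.7 ms
    return 0;
}
```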
that’s actually not true. you seem to be implying that the best a 60hz display can manage is 16.6ms of latency. indeed, that is the worst-case value, but you should consider that early graphics technologies involved changing display modes mid-scan.
it’s actually not ridiculous to suggest that old platforms had sub-millisecond latency; they did. if the scanline was on, or just before, the line where you would interact (i.e., the prompt line), the text you enter would appear immediately.
of course, “vsync”, tear-free presentation, and suchlike approaches “fixed” this - necessarily by adding at least a frame’s worth of latency, and with it perceptible latency.
it’s an oft-overlooked aspect of refresh rates. a 60hz CRT, without vsync, still has a lower latency floor than a 120hz display - perhaps even a 240hz one.
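rough numbers behind that comparison - the 525-line figure is just an illustrative assumption for a 60hz raster, and everything else in the pipeline is ignored:

```cpp
#include <cstdio>

int main() {
    // With vsync'd double buffering, the earliest a new pixel can appear is
    // the next refresh, so the latency floor is roughly one refresh period.
    std::printf("vsync floor @ 120 Hz: %.2f ms\n", 1000.0 / 120.0);  // ~8.33 ms
    std::printf("vsync floor @ 240 Hz: %.2f ms\n", 1000.0 / 240.0);  // ~4.17 ms

    // Racing the beam on a 60 Hz raster: write to the framebuffer just ahead
    // of the line about to be scanned out and the floor is a few scanlines.
    // Assuming ~525 lines per 60 Hz frame, one line takes roughly:
    std::printf("one scanline @ 60 Hz: %.1f us\n", 1e6 / (60.0 * 525.0));  // ~31.7 us
    return 0;
}
```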
i’ve used two 240hz displays for years now. i’ll never go slower than that.