You create the synthetic data, move it through tens of entities, and then buy it back cheaply from one of them.
We grew up at a time when SMS was the thing; that's what we had when I turned 16. I know that keeping up is cool, but social media is a disease. The number of dumb and uneducated people who couldn't even listen to expert advice during a fucking pandemic is driving me up the wall.
I'm annoyed mainly because people around me make bad decisions that have an influence on my own life.
The number of dumb, educated people who blindly accepted everything that was fed to them during the fucking pandemic is driving me up the wall.
"Just two weeks to flatten the curve!"
You can also provide custom operator new/operator delete for your co-routine (looked up on the promise type) to control how the frame is allocated. I'm not sure whether you're allowed to mark them as deleted to guarantee that the allocation is either elided or the program fails to compile.
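For anyone curious what the hooks look like, here is a minimal sketch; the Task/promise_type names are just illustrative, and whether the printfs ever fire depends on whether the compiler elides the frame allocation.

    #include <coroutine>
    #include <cstdio>
    #include <cstdlib>

    // Minimal eagerly-running coroutine whose promise type supplies its own
    // operator new/delete for the coroutine frame.
    struct Task {
        struct promise_type {
            Task get_return_object() { return {}; }
            std::suspend_never initial_suspend() noexcept { return {}; }
            std::suspend_never final_suspend() noexcept { return {}; }
            void return_void() {}
            void unhandled_exception() {}

            // The compiler calls these to allocate/free the frame, unless it
            // can prove the allocation is elidable.
            void* operator new(std::size_t size) {
                std::printf("frame: allocating %zu bytes\n", size);
                return std::malloc(size);
            }
            void operator delete(void* ptr, std::size_t size) {
                std::printf("frame: freeing %zu bytes\n", size);
                std::free(ptr);
            }
        };
    };

    Task example() { co_return; }

    int main() {
        example();  // prints twice, unless the frame allocation was elided
    }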
Much like lambdas and range-based for loops, co-routines are pretty much defined as a code transformation into lower-level C++ rather than being black magic.
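As a point of comparison, the range-based for loop is specified as roughly the rewrite below (a sketch; use() is a made-up helper). Co-routines get the same treatment, just with the transformation targeting calls on the promise type.

    #include <cstdio>
    #include <iterator>
    #include <vector>

    void use(int x) { std::printf("%d\n", x); }

    int main() {
        std::vector<int> v{1, 2, 3};

        // What you write:
        for (auto x : v) use(x);

        // Roughly what the compiler rewrites it into:
        {
            auto&& range = v;
            auto first = std::begin(range);
            auto last = std::end(range);
            for (; first != last; ++first) {
                auto x = *first;
                use(x);
            }
        }
    }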
Regarding the "inline" keyword being used to move a function's body into the caller: that is a compiler hint and not mandatory. The actual purpose of the "inline" function is to allow the implementation of a function to appear in multiple translation units without that causing linking issues (provided the implementation is identical).
In terms of "I’m curious if anyone actually finds something useful to do with these.": the modern C++ Windows Runtime API is built upon async operations and co-routines.
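For example, a typical C++/WinRT coroutine looks roughly like this (a sketch; it requires Windows and the C++/WinRT headers, and ComputeAsync is just a made-up name):

    #include <winrt/Windows.Foundation.h>

    // Returning IAsyncOperation<T> makes this function a coroutine; the
    // co_await/co_return machinery produces the async object WinRT expects.
    winrt::Windows::Foundation::IAsyncOperation<uint32_t> ComputeAsync()
    {
        co_await winrt::resume_background();  // hop to a thread-pool thread
        co_return 42;
    }

    int main()
    {
        winrt::init_apartment();
        uint32_t result = ComputeAsync().get();  // block until it completes
        return result == 42 ? 0 : 1;
    }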
So really, is it faster? Does it reduce the load on network devices? Does it allow a server to serve more connections more quickly than the equivalent HTTP/2 stack? Does it make my web app faster?
I mean, for 99% of WordPress sites the problem is WordPress, not the transport protocol. For a lot of the rest of the web, the problem is client-side rendering.
QUIC may solve a problem, but is it a problem in real life or a thought experiment that got folded into a standard?
Latency improved dramatically. Web pages felt like they loaded faster, and at the very least users could not tell that they were using an encrypted connection.
The protocol essentially worked as a fast-ACK protocol that would preemptively request retransmits (and was occasionally wrong). This enabled it to use the connectionless UDP protocol as the underlying transport mechanism. There is, of course, a cost for reduced latency: slightly higher bandwidth utilization on the network. This was suboptimal for long-lived streams (media and other downloads), so we tried to fail over to ordinary TCP in those cases.
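To make the "preemptively request retransmits" part concrete, here is a rough receiver-side sketch; the names and logic are illustrative and not taken from any real implementation.

    #include <cstdint>
    #include <set>
    #include <vector>

    // Tracks sequence numbers and, as soon as it sees a gap below the
    // highest sequence received, asks for a retransmit instead of waiting
    // for a sender-side timeout.
    class FastAckReceiver {
    public:
        // Returns the sequence numbers to NACK after receiving packet `seq`.
        // The guess can be wrong: a "missing" packet may just be reordered
        // and arrive a moment later, which is where the extra bandwidth goes.
        std::vector<uint64_t> on_packet(uint64_t seq) {
            received_.insert(seq);
            std::vector<uint64_t> retransmit_requests;
            for (uint64_t s = next_expected_; s < seq; ++s) {
                if (received_.count(s) == 0 && nacked_.insert(s).second) {
                    retransmit_requests.push_back(s);
                }
            }
            // Advance past the contiguous prefix we now hold.
            while (received_.count(next_expected_) != 0) {
                ++next_expected_;
            }
            return retransmit_requests;
        }

    private:
        uint64_t next_expected_ = 0;
        std::set<uint64_t> received_;
        std::set<uint64_t> nacked_;
    };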
Hard experience taught us that churn is just crazy high, no matter how compatible or easy to use you make it. Getting tens of thousands of stars is not the hard part, because it's such an easy concept to like. But I would be surprised if there are more than, let's say, ten thousand Pi-holes in active use.