flohofwoe · 3 years ago
This doesn't go deep enough IMHO, I only really 'grokked' async/await (and how it differs from stack-switching coroutines) once I understood that it's just syntax sugar for a "switch-case state machine" (meaning you can emulate async/await-style in any language, even plain old C running on a heavily restricted VM like WASM).

For instance, look at the JS output of async/await TypeScript code when targeting a really old JS version that didn't support async/await yet.

It's switch-case all the way down:

https://www.typescriptlang.org/play?target=1#code/FAMwrgdgxg...
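A hand-rolled sketch of that desugaring (not the actual TypeScript emit, which goes through `__awaiter`/`__generator` helpers, but the same idea): the async function body becomes a `step` function driven by a switch over a saved state label. `getValue` here is a hypothetical callback-style async operation, standing in for whatever would normally be awaited.

```javascript
// Roughly what  async function fetchTwice() { const a = await get(); const b = await get(); return a + b; }
// looks like as a switch-case state machine.
function fetchTwice(getValue) {
  let state = 0; // which "await" we are paused at
  let first;     // locals survive across suspensions because they live in the closure
  return new Promise((resolve) => {
    function step(incoming) {
      switch (state) {
        case 0:             // code before the first await
          state = 1;
          getValue(step);   // "await" #1: suspend, resume when the callback fires
          return;
        case 1:             // code between the two awaits
          first = incoming;
          state = 2;
          getValue(step);   // "await" #2
          return;
        case 2:             // code after the last await
          resolve(first + incoming);
          return;
      }
    }
    step();
  });
}
```

The real emitted code resumes `step` from the promise's `.then` rather than a raw callback, but the control flow is the same: every `await` is a labeled re-entry point into one big switch.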

capableweb · 3 years ago
There is more happening than switch-cases though, evident even from your link (it mentions Promises but doesn't provide the implementation for it).

I think what another comment mentioned is more accurate:

> Async/await is just syntactic sugar for promises.

> Promises are just wrappers around callbacks.

> Callbacks are used to hook functions into an event loop. The dispatching/scheduling is handled there.

> So really when you use async/await, you’re just writing pseudo-sequential code that gets turned into callback code.

https://news.ycombinator.com/item?id=37360151

bheadmaster · 3 years ago
>> So really when you use async/await, you’re just writing pseudo-sequential code that gets turned into callback code.

Small nitpick:

I wouldn't call it "pseudosequential", personally. By that logic, C is pseudosequential, because when you make a system call, your process state gets saved in the scheduler, control gets passed to the kernel, and then when the kernel finishes, your process state is resumed. Not even getting started on branch prediction and similar CPU optimizations.

What I mean is, as long as you have a kernel and scheduling, there's no such thing as a real (userspace) sequential program. We just call them sequential because their side effects correspond to a sequential model of execution. So if the side effects of an async/await program correspond to those of a sequential model, it's a sequential program.

But I get the value of using that word for an explanation of the implementation.

l5870uoo9y · 3 years ago
The whole article is probably the best explanation of generators I have come across. This quote stuck out:

> Generators are a special type of function that can return multiple pieces of data during its execution. Traditional functions can return multiple data by using structures like Arrays and Objects, but Generators return data whenever the caller asks for it, and they pause execution until they are asked to continue to generate and return more data.
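The quoted pause-and-resume behavior in a minimal sketch:

```javascript
// A generator only computes its next value when the caller asks for it.
function* naturals() {
  let n = 0;
  while (true) yield n++; // pauses here until the next call to next()
}

const it = naturals();
it.next(); // { value: 0, done: false } — execution is now frozen at the yield
```

An infinite loop like this is fine precisely because nothing runs until the caller pulls.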

Applications of generators? I have only used Redux-Saga[1]. Can't even think of other libraries that use them, but would be interested in learning.

[1]: https://redux-saga.js.org/

littlestymaar · 3 years ago
I think seeing generators as special functions makes them feel more magical than they really are: a generator is nothing more than syntactic sugar for an object with a "call" method that updates the object's private fields on each call. That's why calling it repeatedly produces different results.

Some patterns become easier to write with the generator syntax, but it's not really adding expressive power. (Unlike futures for instance)
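A sketch of that equivalence (simplified: it ignores `return`/`throw` and injected values, which a real desugaring has to handle):

```javascript
// Generator version:
function* counter() {
  let n = 0;
  while (true) yield n++;
}

// Roughly the desugared version: an object whose next() method
// mutates state that the generator syntax would keep as locals.
function counterObject() {
  let n = 0; // the "private field" hidden by the generator syntax
  return {
    next() {
      return { value: n++, done: false };
    },
    [Symbol.iterator]() { return this; },
  };
}
```

Both produce the same stream of values; the generator just lets you write the state machine as straight-line code.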

heavenlyblue · 3 years ago
You could also have an object with on_success and on_failure attributes, so Futures are the same kind of syntactic sugar.
btown · 3 years ago
IMO this is like saying a high-level language is syntactic sugar for assembly.

Generators allow you to write a state machine as a procedural function whose local variables are automatically suspended when you yield and restored when the caller returns control to the generator. They compile this down to, as you said, an object with a call method. But that transformation is as nontrivial as any compilation/transpilation project.

masklinn · 3 years ago
Generators are not used much in the JS ecosystem for some reason, but they're used a lot in Python.

The most common application is iteration (lazy producers or transformers), but they're also used for their ability to suspend and resume function execution (interruptible functions) e.g.

- pytest fixtures use generators to setup, allow the test to run (yielding a value or not), and teardown

- contextlib.contextmanager uses generators for similar purposes of entering a context, allowing the wrapped code to run, and then exiting the context

They can also act as cheap state managers, especially with the ability to inject data back into the generator, so they can be used to implement a multi-step function where the user needs multiple points of interaction, without needing a bunch of callbacks, or complicated state management (especially without the benefit of sum types or static state machines)
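That last use case, injecting data back into the generator via `next(value)`, looks like this (a hypothetical multi-step signup flow, just for illustration):

```javascript
// Each yield hands the caller a prompt; the caller's answer
// comes back as the value of that same yield expression.
function* signupWizard() {
  const name = yield "What is your name?";
  const email = yield `Hi ${name}, what is your email?`;
  return { name, email };
}

const wiz = signupWizard();
wiz.next();                    // starts the generator; its argument is ignored
wiz.next("Ada");               // "Ada" becomes the value of the first yield
wiz.next("ada@example.com");   // done: true, with the assembled result
```

The whole multi-step interaction is one readable function, with no callback chain and no explicit state object.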

athanagor2 · 3 years ago
I used generators in the way you described in your last paragraph. And I added the possibility to "replay" the generators with the previous inputs except the last one. It provides a cheap way to have a kind of multi-step input process with cancellation.
Zecc · 3 years ago
You can use them in any place where you want to return a bunch of values, but not necessarily all in one go. This goes for bytes in a Uint8Array, points you want to draw in canvas, DOM elements you are selecting or generating on the fly, data objects you've downloaded from somewhere, etc.

The advantage of generators over arrays is that you don't have to allocate a large array, so you can handle large numbers of elements while keeping a lower memory footprint.

They can also hide batching/chunking logic. You could have code that repeatedly downloads separate JSON files with 200, 500, or 5000 elements at a time, then returns a generator that walks through one element at a time. The consuming code will be none the wiser.
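A sketch of that chunk-hiding trick, with `fetchPage` as a stand-in (synchronous here for simplicity; a real one would be async) for whatever downloads a batch:

```javascript
// Yields individual elements; the caller never sees pages or batches.
function* allItems(fetchPage) {
  let page = 0;
  let batch;
  while ((batch = fetchPage(page++)).length > 0) {
    yield* batch; // hand out the current batch one element at a time
  }
}
```

Swap the page size or the transport and no consuming code changes.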

qudat · 3 years ago
redux-saga maintainer here!

Before async/await, generators were the only way to express async flow control in a synchronous way. Libraries like “co” were really popular at the time.

There are some real downsides to async/await, which is why redux-saga continued to use generators. Even in 2023, a couple of others and I are experimenting with delimited continuations using generators as the foundation.

A couple of libraries we have built:

- https://github.com/thefrontside/continuation

- https://github.com/thefrontside/effection/tree/v3

- https://github.com/neurosnap/starfx

The last one intends to replace redux-saga using DCs.

Here’s a presentation I gave recently talking about DCs in typescript: https://youtu.be/uRbqLGj_6mI?si=XI0JNMKMoO2VHMvM

roggenilsson · 3 years ago
You can also think of generators as a native implementation of Observables from rx (except you can't replay a generator), especially async generators.

You can implement basic operators like map, filter, take, etc. over generators to create pipelines of operations. Very neat abstraction to work with, but like rx, can quickly get hard to reason about.
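Those operators are a few lines each over plain generators:

```javascript
// Lazy pipeline operators: nothing runs until the result is consumed.
function* map(iter, fn)    { for (const x of iter) yield fn(x); }
function* filter(iter, p)  { for (const x of iter) if (p(x)) yield x; }
function* take(iter, n)    { for (const x of iter) { if (n-- <= 0) return; yield x; } }

function* naturals() { let n = 0; while (true) yield n++; }

// First two even squares, pulled lazily from an infinite source:
const result = [...take(filter(map(naturals(), x => x * x), x => x % 2 === 0), 2)];
// [0, 4]
```

Because everything is pull-based, `take` stops the whole pipeline after two matches even though the source is infinite.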

Recently I wrote some tooling to read, do some operations on, and write hundreds of thousands of files locally. Using generators solved having to think about not loading too much stuff into memory, since they only yield files when consumed. They also let you simply implement things like batching, e.g. running X requests to a server at a time, and only starting the next batch once the first one is done.

Tomuus · 3 years ago
Observables and Generators (iterators) are fundamentally different. Observables are push-based (like a promise) whereas iterators are pull-based (like a function).

Glossing over this fact leads to a flawed understanding, not a deeper one.

emadda · 3 years ago
A nice use of async generators is when you have an API with paginated results.

The generator function keeps track of the current page.

This allows you to use a `for await` loop which is quite concise.

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...

You can do it without generator functions, but generator functions allow the caller to not have to know about the underlying pagination API.
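A sketch, where `fetchPage(n)` is a hypothetical API call resolving to `{ items, nextPage }`:

```javascript
// The async generator owns the cursor; callers just iterate items.
async function* paginated(fetchPage) {
  let page = 0;
  while (page != null) {
    const res = await fetchPage(page);
    yield* res.items;            // emit this page's items one by one
    page = res.nextPage;         // null/undefined when there is no next page
  }
}

// Consumption is a plain loop, no pagination in sight:
// for await (const item of paginated(fetchPage)) { handle(item); }
```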

davidmurdoch · 3 years ago
I use generators to stringify a huge amount (several gigabytes) of generated JSON for transmission over http/ws. Node can only handle about 1-2 gigs in a single string or Buffer -- it's a hard-coded limit depending on CPU architecture, AFAIK. So these huge JSON blobs need to be chunked.

A generator is nice here because it makes it easy for the code that converts the JSON for transmission to wait until the OS has transmitted the previous chunks -- which helps avoid catastrophic back pressure and runaway memory use.
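A simplified sketch of chunked stringification for a huge array (the real thing would also have to chunk nested objects, but the shape is the same):

```javascript
// Emit the JSON text for a large array in bounded-size pieces,
// so no single string ever holds the whole payload.
function* stringifyChunks(items, chunkSize = 1000) {
  yield "[";
  for (let i = 0; i < items.length; i += chunkSize) {
    const slice = items
      .slice(i, i + chunkSize)
      .map((x) => JSON.stringify(x))
      .join(",");
    yield (i > 0 ? "," : "") + slice;
  }
  yield "]";
}
```

The transmit loop can await backpressure between `next()` calls, which is exactly the pacing described above.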

mistercow · 3 years ago
Generators are a super clean way to do stream processing (in fact, in modern node, streams are async iterators, and can be used with “for await”). The really nice thing is that you can chain together a whole pipeline of generators to do transformations on a stream which are much more complex than map.

For example, do you have a stream of chunks, and want to turn it into a stream of lines? The implementation of this with generators is super clean and readable, because it’s like you’re just iterating over a list.
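That chunks-to-lines transform is indeed short; a sketch:

```javascript
// Turn a stream of arbitrary text chunks into a stream of complete lines.
async function* lines(chunks) {
  let buffer = "";
  for await (const chunk of chunks) {
    buffer += chunk;
    const parts = buffer.split("\n");
    buffer = parts.pop(); // the last piece may be an incomplete line
    yield* parts;
  }
  if (buffer.length > 0) yield buffer; // trailing text without a newline
}
```

It composes directly with other async generators, so `lines(decode(stream))` style pipelines fall out for free.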

heavyset_go · 3 years ago
Use them wherever you need an iterable or lazy-loading. You can build pipelines of generators to interleave work, too.
rzimmerman · 3 years ago
It's obsolete now, but before async/await was part of JS proper I worked on a compile-to-JS language that handled this by compiling to callbacks: https://github.com/rzimmerman/kal

It included try/catch support and fancy stuff, like loops. The source may be interesting for anyone interested in compilers: https://github.com/rzimmerman/kal/blob/master/source/generat...

The Kal compiler is written in Kal, but it's supposed to be easy to read. Surprisingly the browser demo still works: http://rzimmerman.github.io/kal/demo.html

sebazzz · 3 years ago
Before it was part of JS proper, TypeScript could also transpile to callbacks. Good stuff.
14u2c · 3 years ago
Babel will still polyfill async/await too if configured.
lf-non · 3 years ago
This looks pretty nice. Would you be interested in reviving this to target type-safe ts ?

I have been looking forward to a type-safe coffee-script. Civet.dev comes close but goes into some pretty esoteric directions.

rzimmerman · 3 years ago
That's a cool idea and it could be a lot more useful with modern approaches, like true type systems and pattern matching. I don't really have time to work on that but I support the effort of bringing some of CoffeeScript back to the world.
solarkraft · 3 years ago
Kal looks super cool!
winrid · 3 years ago
Thought this was going to explore V8 source code...
endorphine · 3 years ago
That would be cool
trashburger · 3 years ago
From my (admittedly shallow) understanding, each "await" in an async function creates continuation points, is that correct? And then when the internally used Promise resolves or rejects, the function resumes from the continuation point.
dgb23 · 3 years ago
Conceptually yes.

Practically speaking it’s easier to grasp. The article in question makes a good point of playing around with the concepts to dispel the magic.

Async/await is just syntactic sugar for promises.

Promises are just wrappers around callbacks.

Callbacks are used to hook functions into an event loop. The dispatching/scheduling is handled there.

So really when you use async/await, you’re just writing pseudo-sequential code that gets turned into callback code.

This is also why a function can only await if it’s marked as async. It’s callbacks, function coloring leaking out so to speak.
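The sugar-for-promises step looks roughly like this (approximate: the real semantics also cover thenables, rejection, and microtask ordering):

```javascript
// The async/await version...
async function getUserName(fetchUser) {
  const user = await fetchUser();
  return user.name;
}

// ...is roughly this promise/callback version:
function getUserNameDesugared(fetchUser) {
  return Promise.resolve(fetchUser()).then((user) => user.name);
}
```

Both return a promise either way, which is the "coloring" part: once a function awaits, its callers have to deal in promises too.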

solarkraft · 3 years ago
> Callbacks are used to hook functions into an event loop. The dispatching/scheduling is handled there.

This part never really made it into my understanding of the concept. For practical purposes I've been completely fine with not thinking about it at all. My function is called at some point; that's all that really matters to me. Thinking about the event loop opens up a can of worms I find more confusing than useful.

quectophoton · 3 years ago
The only reason I know about JS generators, and about using them to simulate async/await, is that I was already using JS when they were added to the language. If that hadn't been the case, I probably wouldn't even know they existed.

I haven't used that much JS recently, but my guess is that "for await of" will make generators more widely known and used.

endorphine · 3 years ago
Related Ask HN discussion with lots of insights on how async/await works: https://news.ycombinator.com/item?id=30502067
h1fra · 3 years ago
Good article, love the CodeSandbox! I often disregard yield because it's basically the same as having an `await for/of` or `while await`, and it's not that clear for beginners.