I use a modern Lisp every day...Clojure.
My dev environment is VS Code with the most excellent Calva extension, which gives me REPL-everywhere in my editing experience.
Yes, that 70's experience was better...and it's still here, refined and built on modern tooling and runtimes.
I love Clojure with CIDER, but I've heard that as a REPL experience it doesn't compare to CL with SLIME. I like Emacs as a Lisp machine, but I know that a big ball of mutable state and side-effecting functions on a single thread doesn't really live up to a proper Lisp machine.
The Clojure REPL experience is pretty primitive, though: you don't have a step debugger or anything like the condition system, and being a hosted language you only get stack traces from the VM.
What he's getting at is single-level storage. RAM isn't used for loading data and working on it; RAM is cache. The size of your disk defines the "size" of your system.
This existed in Lisp and Smalltalk systems. Since there's no disk/running program split you don't have to serialize your data. You just pass around Lisp sexprs or Smalltalk code/ASTs. No more sucking your data from Postgres over a straw, or between microservices, or ...
These systems are orders of magnitude smaller and simpler than what we've built today. I'd love to see them exist again.
I make a ton of progress with ipython and vim/st3/another text editor in a vertical split. I sometimes split the terminal in two with screen if I want to run external things.
We seem to keep iterating back to that modality, and honestly, it's just nice. VS Code starts there, but then Java language servers, broken Python language servers, Rust language servers, exploding npm dependencies, and supply chain attacks just muck everything up.
Try running ipython inside a vim terminal (:below term). Being able to yank and paste between buffers and the terminal (which is itself backed by a vim buffer), and vice versa, is a big multiplier.
I think there are benefits to interpreters as well as compilers, possibly for the same programming language in some cases (although some languages are better suited to interpreters and some to compilers).
I don't think JSON is the best format for persistence, though (despite their claim that it is better than binary formats). It has many problems: no proper integer type, no non-Unicode text types or binary types, etc. A different format such as DER would be better; with DER you get better types, better character sets, binary data, no limits on how big numbers can be, etc. (JSON might work for JavaScript, but even then only for some types; JavaScript has more types than JSON does, including an actual integer type, which JSON lacks.) (Using JSON just means you end up wrapping other data in JSON anyway (e.g. encoding DER as base64) and converting the meaning.) (DER not being a text format can be worked around with a program to display it, just as programs display other kinds of data in other ways (e.g. syntax highlighting, pictures, auto-formatting, etc).)
No love for Forth? The opening paragraph just nails the whole concept of a machine operator instead of machine programmer - the driving principle behind Forth, right?
A macro backed by fprintf(stderr, ...), since stdout is buffered.
Generally avoiding C unless something specific and small absolutely requires performance (rather than premature optimization), because maintainability, safety, and velocity tend to be better in higher-level languages.
I run Jupyter in a Docker container and can just checkpoint it; this would work for any language that runs a REPL. The gotchas are GPU state and database/socket connections.
I don't know what the author is using, but all these features are present in Visual Studio (2026 now) proper, and they pale in comparison with what its refactoring tools, code navigation, and autocomplete give you.
VS also has significantly improved versions of most of these, e.g. live testing and the performance profiler. It really is a heaven-and-earth difference in favor of VS.
https://www.flow-storm.org/
(haven't used it myself yet)
But things like CIDER give you step debugging if you want. For whatever reason it always felt clunkier than Elisp's debugger.
Simple simple simple is so refreshing.
In terms of ops usability, it's difficult to beat reliability, availability, and serviceability (RAS) features of mainframes. Erlang/OTP comes close.