Couple of things I like
- tarantool https://www.tarantool.io/en/
- rebol/red-lang https://www.red-lang.org/
- U++ : https://www.ultimatepp.org/
- lazarus: https://www.lazarus-ide.org/
- fasm: https://flatassembler.net/
I don't mean just vacuum tubes or even electronics at all. Mechanical analog computing is insane when you get down to it. You have special shapes that move against each other and do calculus.
We make these mechanical models as analogs of more complex physical systems. We can turn huge calculations into relatively simple machines. That we can roll two weirdly shaped gears together and get an integral out says to me something very profound about the universe. I find it to be one of the most beautiful concepts in all of the sciences.
What's even more wild is that we can take those mechanical analogs of physical systems and build an electronic analog out of vacuum tubes. That vacuum tubes work at all is just completely insane, but it's some absolutely beautiful physics.
And yes, there are equally beautiful problems that can only be solved in the digital domain, but it just doesn't speak to me in the same way. The closest thing is the bitwise black magic like fast inverse square root from a special constant and some arithmetic. Besides, that's more a property of number systems than it is of digital computation.
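For reference, that bit trick can be sketched in a few lines of Python (reinterpret the float's bits as an integer, subtract from the magic constant 0x5f3759df, then one Newton step):

```python
import struct

def fast_inv_sqrt(x: float) -> float:
    """Approximate 1/sqrt(x) with the famous bit-level trick (single precision)."""
    # Reinterpret the float's bits as a signed 32-bit integer.
    i = struct.unpack(">l", struct.pack(">f", x))[0]
    # The "magic" constant turns the shifted bits into a good first guess.
    i = 0x5F3759DF - (i >> 1)
    y = struct.unpack(">f", struct.pack(">l", i))[0]
    # One Newton-Raphson iteration sharpens the guess.
    return y * (1.5 - 0.5 * x * y * y)
```

After the single Newton step the result is typically within a fraction of a percent of the true value.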
I understand how and why digital took over, but I can't help but feel like we lost something profound in abandoning analog.
The tide height is a function of the earth/sun/moon system. Earth and Moon aren't at a fixed distance from each other, and neither is the Sun, so every day's tide is unique, but you can predict the range.
The analog way to do it is to make a gear for each point of data in the system and synchronize all their gears. Then you use them all to rotate one final gear, which will show you the prediction for the time you've chosen.
[1] https://zapatopi.net/kelvin/papers/on_the_age_of_the_suns_he...
[2] https://www.youtube.com/watch?v=IgF3OX8nT0w from about 3 minutes.
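In software terms, each gear is one harmonic constituent: a circle whose radius is the tidal amplitude and whose rotation rate is that constituent's frequency, all summed into the final prediction. A minimal sketch (the amplitudes and phases below are made-up illustration values for a hypothetical port, not real tide-table data):

```python
import math

# Hypothetical constituents: (name, amplitude in m, speed in deg/hour, phase in deg).
CONSTITUENTS = [
    ("M2", 1.20, 28.984, 110.0),  # principal lunar semidiurnal
    ("S2", 0.45, 30.000,  95.0),  # principal solar semidiurnal
    ("K1", 0.30, 15.041, 200.0),  # lunisolar diurnal
    ("O1", 0.25, 13.943, 180.0),  # lunar diurnal
]

def tide_height(hours: float, mean_level: float = 2.0) -> float:
    """Sum the 'gears': each term is one rotating circle traced over time."""
    h = mean_level
    for _name, amp, speed, phase in CONSTITUENTS:
        h += amp * math.cos(math.radians(speed * hours - phase))
    return h
```

Kelvin's tide predictor did exactly this sum mechanically, with one pulley per constituent and a wire threading them all to drive the pen.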
Once you really understand how these systems are an analog of a physical problem, everything makes so much more sense.
Honestly it seems like a perfect application. Neural networks are analog systems. An analog computer can represent neurons very accurately and the entire network is inherently parallel, for free!
I can't wait to see what comes out of this research
Also, most people don't know that the word "analog", as in "analog circuits", comes from "analogy".
In today's world, we still build analogs, we just coerce them into strictly numerical, digital models. I don't know if you can call it better or worse, but digital is definitely less magical and wondrous than mechanical analog systems.
Imagine writing a program if every time you wanted to change something you had to cut a new gear, or design a new mechanism, or build a new circuit. Imagine the sheer complexity of debugging a system if instead of inspecting memory, you have to disassemble the machine and inspect the exact rotation of hundreds of gears.
Analog computing truthfully doesn't have enough advantages to outweigh the advantage of digital: you have one truly universal machine that can perform any conceivable computation with nothing but pure information as input. Your application is a bunch of binary information instead of a delicate machine weighing tens to hundreds of pounds.
Analog computing is just too impractical for too little benefit. The extra precision and speed is almost never enough to be worth the exorbitant cost and complexity.
As for something you can easily get your hands on, micrometers are incredible. A simple screw and graduated markings on the shaft and nut give you incredibly precise measurements. You can also find mechanical calculators (adding machines) on eBay. But those really aren't very sexy examples of the concepts.
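The arithmetic a micrometer mechanizes is simple to sketch (assuming typical metric values: a 0.5 mm screw pitch and 50 thimble divisions, giving 0.01 mm per division):

```python
def micrometer_reading(sleeve_mm: float, thimble_division: int,
                       pitch_mm: float = 0.5, divisions: int = 50) -> float:
    """Each full thimble turn advances the spindle one screw pitch;
    each division is pitch/divisions of travel (0.01 mm with the defaults)."""
    return sleeve_mm + thimble_division * (pitch_mm / divisions)
```

The screw is the computer here: it converts a large, easy-to-read rotation into a tiny, hard-to-measure linear displacement.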
Analog computers aren't very common anymore. Your best bet is visiting one of the computer museums that house antique machines, or watching YouTube videos of people showing them off. There are plenty of mechanical flight computers in particular on YouTube.
If you have access to a 3D printer, there are plenty of mechanisms one can print. The Antikythera mechanism is a very interesting celestial computer from ancient times, and 3D models of it exist online.
These machines can calculate ballistic trajectories with incredible accuracy, accounting for the relative motion of the ships, wind speed, and even the curvature of the earth. Those calculations are not at all trivial!
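As a toy illustration of why the fire-control problem needs iteration, here is a flat-earth, no-drag sketch of leading a target that moves while the shell is in flight (a drastic simplification of what the real computers solved, with made-up numbers):

```python
import math

G = 9.81  # m/s^2

def flight_time(rng: float, v0: float, g: float = G) -> float:
    """Time of flight for a no-drag shot that lands at horizontal range rng."""
    # range = v0^2 * sin(2*theta) / g  ->  solve for the low-angle solution.
    s = g * rng / v0**2
    if s > 1.0:
        raise ValueError("target out of range")
    theta = 0.5 * math.asin(s)
    return 2 * v0 * math.sin(theta) / g

def lead_range(target_range: float, target_speed: float, v0: float,
               iterations: int = 10) -> float:
    """Converge on the aim point for a target moving straight away:
    aim where the target WILL be when the shell arrives."""
    aim = target_range
    for _ in range(iterations):
        t = flight_time(aim, v0)
        aim = target_range + target_speed * t
    return aim
```

Even this stripped-down version has no closed-form answer, because flight time depends on the aim point and the aim point depends on flight time; the mechanical computers solved that loop continuously, in real time, while also correcting for wind, roll, and the earth's curvature.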
https://youtu.be/s1i-dnAH9Y4?si=oHHJGRqnFx-ydQu1
- Tarantool is some sort of in-memory DB with optional persistence
- Red is a programming language that has made the odd syntax decision to use {} for strings and [] to define scopes
- U++ is one of those all-encompassing C++ frameworks like Qt
- Lazarus is a Pascal(?) IDE
- And FASM is a toolkit for building assemblers
I'm struggling to find the common thread across these links, apart from the OP probably being an enthusiast of obscure programming languages
http://www.rebol.com/
https://en.wikipedia.org/wiki/REBOL
It's a protocol/tool for async file transfer, built for disconnected/intermittent connectivity amongst known parties (trusted friends, as P2P), even allowing for sneakernet-based file transfer.
It started as a modern take on Usenet, but it boggles my mind how cool it is:
Want to send a TV series to your friend? Send it via nncp, and it will make it through either via a line-based file transfer (when the connection allows: pull or push, cronjob, etc.) or via sneakernet if there is "someone going that way".
The comms priority system lets you do high-priority message checking over an expensive network link, while deferring bulk file transfer to cheaper trunk lines later.
It can even be configured to run arbitrary commands on message receipt, to allow indexing/processing of files (like a ZFS-receive hook, mail/Matrix ingestion...)
See all the usecases: http://www.nncpgo.org/Use-cases.html
As with many of these cool techs, I just wish I had a good reason to use it =D
You can create highly specialized templates in Lua, and there's an RDBMS extension called Cargo that gives you some limited SQL ability too. With these tools you can build basically an entirely custom CMS on top of the base MW software, while retaining everything that's great about MW (easy page history, anyone can start editing including with a WYSIWYG editor, really fine-grained permissions control across user groups, a fantastic API for automated edits).
It doesn't have the range of plugins to external services the way something like Confluence has, but you can host it yourself and have a great platform for documentation.
Personally I would prefer a wiki with a git backend. I wrote one [1] but I don't recommend using it.
https://github.com/entropie/oy
[1] Docusaurus:
https://docusaurus.io/
[2] Tinasaurus:
https://github.com/tinacms/tinasaurus
As an administrator, I wish MediaWiki had a built-in updater (bonus points if it could be automated).
I get that by using the container distributions. I just mount my LocalSettings.php and storage volumes in the appropriate places and I get a new version.
And since I run on ZFS, I take a snapshot before updating; if something goes wrong I can roll back the snapshot and go back to when stuff just worked (and retry later).
Deleted Comment
https://www.reddit.com/r/Notion/comments/16zon95/are_there_a...
What I wish more people knew is that you don't need to do those things to get value from Nix. Creating project-specific dev shells that install the packages (at the correct versions) needed to work on a project can replace almost 90% of the docs for getting set up to work on it.
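As a hedged sketch of what that looks like, a per-project shell.nix might be as small as this (the package names here are illustrative examples, and you'd want to pin nixpkgs for true reproducibility):

```nix
# shell.nix -- a minimal project dev shell; package names are illustrative
{ pkgs ? import <nixpkgs> {} }:

pkgs.mkShell {
  packages = [
    pkgs.python311      # the interpreter version the project expects
    pkgs.nodejs_20      # frontend tooling
    pkgs.postgresql_15  # local database matching production
  ];
}
```

Running `nix-shell` in the project directory then drops every contributor into the same environment, regardless of what's installed system-wide.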
Conceptually a game changer for me. In practice it's far from a silver bullet (every language prefers its own package management, so you still have to manage those), but when it works it's quite magical.
Or was the issue that you expected them to be portable? Or use commonly known dynamic library locations?
For example I tried to run pip install yesterday on MemGPT on Nix.
It failed with a C++ error because they use miniconda.
I just created a nix shell with python, pip, etc and ran the pip install command.
Things work fine.
[0] https://www.jetpack.io/devbox
a) Unless you literally write everything in one language, you will have to deal with learning, supporting and fixing bugs in N different package/environment managers instead of just one.
b) If you have a project that uses several languages (say, a Python webapp with C++ extensions and frontend templates in Typescript), then Nix is the only solution that will integrate this mess under one umbrella.
Additionally, as machine-generated content proliferates, I think having services use something like the web of trust concept for membership would be super powerful. The problem is, of course, the terrible UX of cryptographic signatures. But I think there's a lot of opportunity for the group that makes it easy to use.
[0]: https://en.wikipedia.org/wiki/Web_of_trust
Programmability, though:
- https://arcan-fe.com/2022/10/15/whipping-up-a-new-shell-lash...
- https://arcan-fe.com/2021/04/12/introducing-pipeworld/
- https://arcan-fe.com/2020/12/03/arcan-versus-xorg-feature-pa...
- https://arcan-fe.com/2021/09/20/arcan-as-operating-system-de...
The latest EU-funded 'a12' things are also soooo high concept, but not a fever dream.
Your book looks great, will check it out.
A Nim talk would be a great fit for the event.
Thanks for mentioning this! I work remote in SC and it's nice to hear about a nearby convention.
At the time, installing the Nim package manager, Nimble, also required me to have NPM. This was not ideal, but looking at [the nimble project install docs](https://github.com/nim-lang/nimble#installation) it seems like it is now packaged with the language.
Might try dusting it off for some AoC puzzles this year :)
http://nim-lang.github.io/Nim/atlas.html
Deleted Comment