It's amazing that a platform that's been dead since the early 90's is still getting so much love.
There's a UNIX-shaped void in the embedded world. Today's microcontrollers routinely come with several megabytes of RAM.
1: https://github.com/RetroBSD/retrobsd
2: https://github.com/RetroBSD/retrobsd/tree/master/src/cmd/sma...
Supports Arm M0 (Raspberry Pi Pico), ESP32 and a variety of others, including Z80.
Fun fact: Hal Finney (yes, that Hal) wrote a BASIC interpreter for the Intellivision back in 1978 or so in a weekend. It was 2K of code. Mattel shipped it on a cartridge.
ROM space was so tight, the only error message it produced was:
EH?
Which Hal was very proud of. He showed it to me to make me laugh. At the time I was programming the Mattel Intellivision Roulette cartridge.

Amazingly, Aminet is still up and running with frequent uploads.
I contributed a package to that once.
I also made an animation on the Amiga, "Sadistic Circus". A circus dog jumps through a hoop a few times, then gets set on fire. What a sick, sick little puppy I am. I submitted it to a PD disk collection one time.
The dog was an image I got off of a magazine disk, which I mucked around with to create my animation.
Pretty rubbish really, but whatever. Happy days.
Ah, nostalgia ain't what it used to be -- source contested
The fact that another breaking change has been introduced confirms my suspicion that Zig is not ready for primetime.
My conclusion is to just use C. For low-level programming it's very hard to improve on C. No contender is likely to offer a killer feature that lets you write the same code in a fifth of the lines, or makes the code any more understandable.
Yes, C may have its quirky behaviour that people gnash their teeth over. But ultimately, it's not that bad.
If you want to use a better C, use C++. C++ is perfectly fine for using with microcontrollers, for example. Now get back to work!
I used words like IMMEDIATE and POSTPONE to create words that create words. I quickly get lost doing that, though.
Forth is very very cool to play with. Pragmatically, virtually any other language is better.
I love "This isn't craftsmanship cosplay, it's software engineering." I will definitely steal that; let me put it in my notebook.
https://gitlab.com/mcturra2000/cerbo/-/blob/master/x64-asm/0...
Some people seem to revel in assembly, but I now know why C exists.
yes | pv > /dev/null
and was getting about 5.4GiB/s
On the fasm code, I was getting a meagre 7.3MiB/s. Ouch! The non-assembly version is considerably faster. I wonder if it is because I make a syscall for every write I want to perform, whereas C uses buffering, or something.
It's possible to be semi-famous and still able to go to the grocery store and pump your own gas without getting recognized. The local sports radio guys don't need an entourage, even if they do get recognized. But as a rising artist, you hit a point where you can no longer go out in public at all. It's really shocking when it happens because it's so abrupt. My dad's famous friend was a regular at a local restaurant and wasn't bothered for a long time, even when his name/face started showing up in the media. Then one day another customer shouted his name and he got mobbed by fans, and he realized he couldn't go out to eat like a normal person anymore. I think Charli crossed that line with the success of her album Brat last year. It's the point where you start to ask yourself if it's really worth it, and maybe consider going full recluse like Thomas Pynchon. (That's not even getting into the online stan culture stuff that Charli talks about in the article.)
I also heard about Matt Lucas, of Little Britain fame. He was slowly plugging away at comedy, and was about to give up. At around 30 years old, he teamed up with David Walliams, describing it as the last roll of the dice. Their popularity exploded.
Morgan Freeman didn't become famous until he was in his 50's. Someone asked him if he was upset that it took so long. His response was: "No, because it didn't have to happen at all."