Good job, OP. Stay away from the haters.
And right now, transformers can literally translate any source language to any target, so individual languages are not as valuable as they once were, and idioms in any one language will eventually become outmoded thanks to the rapidly evolving way we work today.
Python will still be a favorite language for me, just not as much as it used to be.
If someone makes a new, modern, tiny, self-contained runtime (Bun-like) for Python deployments, hit me up. I'd be happy to try it; that's something I've been wanting for a while.
https://github.com/guilt/DotFiles
There's no knowledge shared here, just my own setup, which can be barebones git-cloned. I'm planning to add a curl-bash installer to help me set it up with a one-liner.
A better long-term approach would be to onboard people and give them time to lean in, understand the practice, and do their best work.
When deadlines are short, there needs to be a well-defined practice to execute on quickly, with everything well documented.
The problem is that the internet is, in practice, a centralized system even though it was designed to be decentralized, and some are fighting to keep it free.
Fight for decentralization instead; it would remove the need for unnecessary security layers and reduce compute costs significantly.
The focus was on being able to demonstrate training, inference, and attention, all in one file. It can run on a GPU thanks to CuPy, and no custom kernel needs to be written for the whole thing to run. I definitely think more people could mess around with different attention mechanisms and models, and try training models on their own computers. That is the post.
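To illustrate the CuPy point (this is just a minimal sketch of my own, not code from the actual repo): because CuPy mirrors the NumPy array API, the same scaled dot-product attention runs on CPU or GPU with only a change of import, no custom kernels involved. All names here (softmax, attention, the toy shapes) are made up for the example.

```python
# Minimal sketch (not the repo's actual code): scaled dot-product attention
# written against the NumPy array API, so swapping in CuPy moves the same
# code onto the GPU without writing any custom kernels.
try:
    import cupy as xp  # GPU path, if CuPy is installed
except ImportError:
    import numpy as xp  # CPU fallback


def softmax(x, axis=-1):
    # Subtract the max before exponentiating for numerical stability.
    e = xp.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def attention(q, k, v):
    # q, k, v: (seq_len, d) arrays; returns (seq_len, d).
    scores = q @ k.T / (q.shape[-1] ** 0.5)  # scaled dot products
    return softmax(scores) @ v               # weighted sum of values


# Toy usage with random queries/keys/values.
q = xp.random.randn(8, 16).astype(xp.float32)
k = xp.random.randn(8, 16).astype(xp.float32)
v = xp.random.randn(8, 16).astype(xp.float32)
print(attention(q, k, v).shape)  # (8, 16)
```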
Give it one more release then drop it?