So after a decade-plus of doing it the right way inside companies, I felt the need for a break from all the processes and pipelines involved in shipping code. So now, on personal projects, I dump a GitHub repo on a server, run a Go server, and front it with nginx. Then I commit directly on the server. I will say that part of me wants to create a whole framework around doing it the right way, but the hacker in me is really happy to just write and ship code without all the stuff in between.
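For anyone curious, the nginx side of this setup can be a single reverse-proxy block. A minimal sketch, assuming the Go server listens on 127.0.0.1:8080; the server_name is a placeholder:

```nginx
# Hypothetical minimal config; adjust server_name and the upstream port.
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```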
Anyone else doing this?
Advantages: Controlled dev environment. No more spending days getting the app to work on a new developer's laptop. Can change the dev environment and tools in one place. All developers can see each other's work and collaborate at any time. Dev environment identical to production and easy to refresh as needed.
Disadvantages: Network latency, though less of a problem than I expected. Must have an internet connection to work (at least to see the changes and test). A more subtle disadvantage is that a lot of developers can't use the Linux command line at all, or not effectively, and some sulk about having to learn "the terminal."
If I know I'll be traveling or without internet, I can git pull the source and work on my laptop with no internet connection, then rsync to my dev instance later.
I'd say this can work out to be an advantage by filtering those developers out.
Then again, I've had to work with "front-end developers" and even so-called "full-stack" developers who don't know HTML, CSS, and JavaScript.
So some people just need a nudge in the right direction.
They use a hosted DB (like AWS RDS or MongoDB Atlas), an external service for some user data (sorta like Firebase, to handle user accounts, paid/free entitlements, etc.), and I believe also a server image on AWS to be able to quickly recreate a clean server if something goes wrong.
Deployment is done with a webhook on git push and a local script that does some folder swapping.
For the rest, the application (500k+ active users) is a single machine with Apache+PHP for prod and clones of it for testing.
Thanks to PHP's request-response model there is little to no server-side state outside the db, so if load were to become excessive they could slap a load balancer in front and manually spin up a second server.
If your business model is different from "must get a ton of views" or "let's monopolize this market" this should be quite enough.
So I've heard :-)
For personal stuff, I go straight to the most efficient way.
- Spin up a Docker image from a base on my home server that has all the stuff set up (SSH server, Python/Node, open ports, etc.)
- Use VSCode with Remote SSH to develop directly in the docker container. Don't even bother with git. I just keep all code around in different files, never delete it.
- I have a cron job running on the docker host that snapshots containers every day, keeps snapshots for a week. Sometimes useful for backups.
- Once the server is up and running and I want public access, I set up a Cloudflare Tunnel to it with a subdomain.
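The tunnel step boils down to a small cloudflared config once the tunnel exists. A sketch in which the tunnel ID, hostname, and port are all placeholders:

```yaml
# ~/.cloudflared/config.yml (hypothetical values throughout)
tunnel: <tunnel-id>
credentials-file: /root/.cloudflared/<tunnel-id>.json

ingress:
  - hostname: myapp.example.com      # the public subdomain
    service: http://localhost:8080   # the port the container exposes
  - service: http_status:404         # final catch-all rule (required)
```

Then `cloudflared tunnel run <name>` serves it, after pointing DNS at the tunnel with `cloudflared tunnel route dns`.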
I've been meaning to configure systemd but haven't gotten around to it.
The entire app is a single Go binary that I just replace and restart manually in tmux.
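For what it's worth, the systemd version of "replace the binary and restart" is a short unit file. A sketch, assuming a binary at /srv/app/app and a dedicated user; names and paths are made up:

```ini
# /etc/systemd/system/myapp.service (hypothetical name and paths)
[Unit]
Description=My Go app
After=network.target

[Service]
ExecStart=/srv/app/app
WorkingDirectory=/srv/app
User=app
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

After `systemctl daemon-reload && systemctl enable --now myapp`, a deploy becomes copying the new binary and running `systemctl restart myapp` instead of hunting for the tmux pane, and the app survives reboots.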