https://yt-better-subs.web.app/
I went through quite the hassle to get the app's oauth scopes approved with Google so that it can keep your subscriptions up-to-date as you add or remove YouTube channel subscriptions.
1. Set up logical replication to a new database server. We used https://github.com/2ndQuadrant/pglogical, but newer versions of Postgres (10+) have logical replication built in, so maybe you don't need it any more?
2. Flip a feature flag that pauses all new database queries.
3. Wait for in-flight queries to drain and for replication to catch up.
4. Flip a feature flag that switches the connection from the old db to the new db.
5. Flip the flag to resume queries.
It helped that our backend was written in OCaml. We had to write our own connection pooling, which meant we had full control over the query queue. I'm not sure how you would do it with e.g. Java's HikariCP, where the query queue and the connection settings are complected.
We also had no long-running queries, with a default timeout of 30 seconds.
It also helped to over-provision servers during the cutover, because any requests that arrived while queries were paused would have to wait for it to complete.
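Since we owned the pool, the pause/drain/switch/resume steps amounted to a gate in front of query execution. A minimal single-process sketch in Python (the real thing was OCaml; `Pool`, the DSN strings, and the fake query runner are all illustrative):

```python
import threading

class Pool:
    """Toy connection pool with a pause gate, sketching the
    pause -> drain -> switch -> resume cutover described above."""

    def __init__(self, dsn):
        self.dsn = dsn                      # which database we point at
        self._gate = threading.Event()      # set = queries allowed
        self._gate.set()
        self._in_flight = 0
        self._lock = threading.Condition()

    def run_query(self, sql):
        self._gate.wait()                   # block while paused (step 2)
        with self._lock:
            self._in_flight += 1
        try:
            # stand-in for real database I/O
            return f"ran {sql!r} against {self.dsn}"
        finally:
            with self._lock:
                self._in_flight -= 1
                self._lock.notify_all()

    def pause(self):
        self._gate.clear()                  # step 2: stop new queries
        with self._lock:                    # step 3: drain in-flight ones
            self._lock.wait_for(lambda: self._in_flight == 0)

    def switch(self, new_dsn):
        self.dsn = new_dsn                  # step 4: flip to the new db

    def resume(self):
        self._gate.set()                    # step 5: let queries flow again

pool = Pool("old-db")
pool.pause()            # (replication catch-up would also be checked here)
pool.switch("new-db")
pool.resume()
print(pool.run_query("select 1"))  # → ran 'select 1' against new-db
```

Requests that hit `run_query` mid-cutover simply block on the gate, which is why over-provisioning during the switch mattered.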
https://g.co/bard/share/7966410c42af
ChatGPT also figured it out, but Bard is much better at displaying information: https://chat.openai.com/share/ba5d5acc-7b40-46e1-ada5-74b4a6...
1. Make `globalId` part of a "Node" interface that all of the types implement. This will work better with tooling like Relay (used for refetching and caching). It will also let you add a `node` field that can be used to fetch any node in the graph.
2. Make the sort input an enum so that you have `sort: TITLE_DESC` instead of `sort: {by: TITLE, order: DESC}`.
3. Implement the connection spec instead of returning a list of items: https://relay.dev/graphql/connections.htm. This will let you add pagination data to the field and other useful info like `totalCount`.
4. Spin up a postgraphile instance with the `@graphile-contrib/pg-simplify-inflector` and `postgraphile-plugin-connection-filter` plugins and copy everything they do.
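Points 1-3 sketched in schema form (the `Video` type, `videos` field, and sort values are made-up names for illustration; the Relay spec itself uses `id` where this uses `globalId`):

```graphql
interface Node {
  globalId: ID!
}

enum VideoSort {
  TITLE_ASC
  TITLE_DESC
}

type Video implements Node {
  globalId: ID!
  title: String!
}

type VideoEdge {
  cursor: String!
  node: Video!
}

type PageInfo {
  hasNextPage: Boolean!
  hasPreviousPage: Boolean!
  startCursor: String
  endCursor: String
}

type VideoConnection {
  edges: [VideoEdge!]!
  pageInfo: PageInfo!
  totalCount: Int!
}

type Query {
  node(globalId: ID!): Node
  videos(first: Int, after: String, sort: VideoSort): VideoConnection!
}
```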
I do wish curl would split its protocol backends (HTTP, FTP, etc.) into separate loadable modules so that we (downstream distro packagers) could reduce the total attack surface through packaging changes.
For example, we could have:
/usr/lib/libcurl/curl-http.so
/usr/lib/libcurl/curl-ftp.so
packaged separately as "libcurl-http" and "libcurl-ftp". Curl clients that only want HTTP would depend on "libcurl-http" alone, so FTP support wouldn't even be on the system unless some package needs it. There is already a way to whitelist protocols in curl (CURLOPT_PROTOCOLS(3)), but that requires modifying existing programs; I think module splitting could be used in addition.
There are lots of niche protocol modules in curl (telnet, gopher, pop3 -- which, don't get me wrong, I think is great!), but they should not be part of the default install of most Linux distros.
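The runtime whitelist can at least be exercised from the command line today: `--proto` is the CLI counterpart of CURLOPT_PROTOCOLS(3). A quick demo (the URLs are placeholders; the disabled-protocol check fires before any network traffic):

```shell
# "-all,https" removes every protocol from the allow-list, then
# re-enables https only. Fetching an ftp:// URL then fails up front
# with "Protocol ftp not supported or disabled in libcurl" (exit 1).
curl --proto '-all,https' -s ftp://localhost/ || echo "ftp blocked (exit $?)"
```

But as the comment notes, this only protects programs that opt in (or users who remember the flag); packaging the backends separately would remove the code entirely.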
I think it may be one of those things you have to see in order to understand.
For example:
He mentions that C and C++ allow const variables, but Clojure doesn't support that.

clj-kondo has a :shadowed-var rule, but it will only find cases where you shadow a top-level var (not the case in my example).