As others have mentioned here, the majority of new content recently is mid-level sequels, spin-offs, prequels, etc. It's becoming much harder for services to differentiate themselves, so people stick with what they know: Netflix, Disney, HBO/Max. I just don't see enough of a market for the other providers to persist long-term (given the high cost of running a streaming infrastructure).
I feel like Disney, in particular, should consider spinning off a few of their larger brands, like Marvel and Star Wars, opening them up for licensing revenue and new creative direction.
As a commercial project it was a failure. It only became successful due to two decades of open-source development, and the willingness of its users to invest in it - even if only to stimulate the development of a competitor to expensive proprietary software.
Blender only became an even remotely viable option in 2011, after nine years as an open-source project. Its popularity only really took off in 2019, after a massive UI rework made it actually nice to use. That and related changes earned Blender a $1.2M grant in 2019, which prompted other companies to re-evaluate it and award even more grants.
If anything, compared to today's successes its initial proprietary development should be seen as nothing more than a historical curiosity.
Is this standard business communication lingo for layoffs?
At this point the decision has been made in our org to firewall their products off the internet and internal networks, and migrate to something else by 2024.
[1] https://hn.algolia.com/?q=atlassian
[2] https://confluence.atlassian.com/security/cve-2023-22515-pri...
Looking at our Confluence usage over the years, I noticed that we use it primarily as a knowledgebase/documentation tool and less for collaboration. With our on-prem license expiring, we are migrating to a dedicated knowledgebase for our FAQ and frequently changing content and switching to a Markdown tool + Git for our more formal documentation.
I have no problem with Apple bundling these apps and making them work seamlessly together, and I don't even mind that they're all updated simultaneously (except for Safari, which I wish I could update independently without relying on the "Technology Preview" beta channel). But I do have a problem with being forced to upgrade my entire OS and then disable the new bloatware features just because I want to keep auto-updates enabled. I used to delay updating and would end up way behind, which is why I enrolled in auto-update. Now it feels like I'm being held hostage to their update schedule.
And for what benefit? There are hardly any useful OS-level changes in this release, but there are a bunch of new features I'll need to disable (while hoping the next auto-update doesn't break my external monitor), all powered by freshly written code that expands the attack surface. If I had my way, I'd take the OS updates and skip all the apps: keep the attack surface small while still meaningfully improving the core. I don't care about the rest.
I do end up removing many of the Apple apps, but recently started using Safari, Notes, and Reminders more. Apple does an excellent job making their apps work seamlessly within the Apple ecosystem.
I love this quote at the end: "Caring about quality is the heart of craftsmanship. Until you're hooked into those outcomes, micro-optimizing the individual parts is pointless."
Perhaps, as we move towards a web dominated by AI agents, quality will supersede metrics.
As an aside, these articles are the gems that keep me coming back to HN.
This comment is obligatory on every post mentioning Flash, and even though the Flash player/plugin was a resource hog and an absolute security nightmare, it's still true.