What's the state of nginx nowadays? Last I heard the original core team had fractured and formed two different forks while F5 continued to develop OG nginx, so there's three nginxes being developed in parallel now. Have the forks gained any traction?
The state of nginx is fine, similar to pfsense. Both made a "Plus" enterprise support offering, open source clones were forked, the originals remain dominant for enterprise and free users anyways. Not to detract from the great projects that are being worked on, like freenginx and opnsense.
> the originals remain dominant for enterprise and free users anyways.
I'm a former pfSense user who reluctantly moved to OPNsense a handful of years ago, after a lot of bad press around Netgate started circulating widely, causing me to believe that support for the community offering might wane over time. I was under the impression that many people had moved off of pfSense for home use. I'm surprised by your assertion that it "remains dominant" for free users, and I wonder how you might know this?
OPNsense has been rock solid for me, btw. I was reluctant to switch only because of the time sink and perceived risk. Nobody wants to spend a weekend debugging VLAN tagging on their WAN port or some such. Luckily for me, there were no such issues when switching over.
That one's new to me. I was aware of Angie and freenginx, which are both led by former nginx developers who left F5 after the acquisition. Tengine looks to be a much older fork, but I can't find much recent discussion about it, though that may be because it's an Alibaba/Taobao project with a primarily Chinese userbase, judging by the GitHub issues.
You know, for most projects, I'd think that'd be pretty...bad. Given that nginx only relatively recently got dynamic module support, I'm curious how many people out there have grown used to building it from source, which would let them switch upstreams a bit more easily. Perhaps. Maybe.
Pretty much every major nginx deployment I’m familiar with has been from source. Dynamic modules aren’t really that new but certainly post-date a lot of deployments. But also bigger deployments tend to want full control of which in-tree modules are compiled into nginx, which dependencies they pull in (for security and deployment reasons), and how quickly patches and security releases can be updated.
It also has a fairly simple from-source deployment with a fairly solid build script.
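As a rough sketch of that from-source flow against a git checkout (the configure flags here are purely illustrative, not a recommended set; release tarballs ship `./configure` instead of `auto/configure`):

```shell
# Illustrative nginx from-source build; pick your own module flags.
git clone --depth 1 https://github.com/nginx/nginx.git
cd nginx
auto/configure \
    --prefix=/opt/nginx \
    --with-http_ssl_module \
    --without-http_autoindex_module   # drop in-tree modules you don't want
make -j"$(nproc)"
make install
```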
nginx is one of the building blocks inside Kubernetes. I'd say it's doing okay and probably will for the foreseeable future. I've had the chance to look into the code base and it's relatively easy to read and work with, I doubt there will ever not be funding to have people contribute.
There are some alternatives like https://grep.app or https://sourcegraph.com/search if you want fast live search, but at the end of the day these are services offered by companies, and rather expensive ones to run, especially for free anonymous users, so you should probably at least accept that service providers can and do change things like this.
You can also run something like your own copy of Zoekt and then ingest repositories on demand though it isn't quite as instant. But if it's code you're already using extensively, it seems like it might be worth it. Maybe you can write some boondoggle to automatically ingest repos based on dependency metadata, even.
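For a concrete (hedged, untested-here) sketch of that Zoekt setup: the project ships separate indexer and webserver binaries under github.com/sourcegraph/zoekt, so assuming a Go toolchain, an on-demand flow might look like:

```shell
# Sketch of a self-hosted Zoekt instance; paths and ports are examples.
go install github.com/sourcegraph/zoekt/cmd/zoekt-git-index@latest
go install github.com/sourcegraph/zoekt/cmd/zoekt-webserver@latest

git clone --depth 1 https://github.com/nginx/nginx.git
zoekt-git-index -index ~/.zoekt nginx          # build the trigram index
zoekt-webserver -index ~/.zoekt -listen :6070  # then browse localhost:6070
```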
GitHub managed to provide search to free anonymous users from its inception in 2007 until mid-2023, when they introduced this new code search.
I would submit that this change is entirely business-related: it's a power-play to make people create accounts and stay logged in so they can track you better. It is not that they cannot afford it, it is that they are enshittifying the service to further their interests.
If they were really worried about money, they could lock it down completely so only paying customers could use the service at all... and then they'd lose a huge chunk of customers and lose all the prestige they built by convincing a huge pile of the world's free/open source software to use them as their hosting. So they don't do that - they keep all the prestige and the network effects by seeming _quite_ open, but they'll lock down _parts_ of the experience to try and force specific behaviour.
> you should probably at least accept that service providers can and do change things like this.
Indeed, you should. It should serve as a wake-up call that other people's services/platforms aren't under your control, and you can't rely on them to meet your needs.
It's worth mentioning here I think that github's code search is really quite good. I'm not trying to say that github can do no harm or that github "owning" OSS code hosting is a good thing, but the github search bar is a utility that IMO is worth the price of admission.
I think that sourcegraph maintains a similar quality OSS code search that can be searched for free but I have not personally used it.
The problem is that GH makes the login process as painful as possible. Login tokens expire frequently, necessitating new logins. Logins require 2FA every time, which makes them extremely flow-breaking. Post-login you're not returned to the file you were on, so now you need to navigate back to your search.
Logins are per domain and per device, so I end up dealing with this 4x per day if I'm using GitHub heavily. It's unnecessary.
If you find searching locally cumbersome, you should know that you can do a "shallow" git clone which only downloads the most recent commit and is much faster than cloning the whole repo.
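For the real thing that would be e.g. `git clone --depth 1 https://github.com/nginx/nginx.git`; the self-contained demo below builds a throwaway local repo instead, so the effect of `--depth 1` is visible without touching the network:

```shell
set -e
tmp=$(mktemp -d)
# Build a throwaway "remote" with three commits
git init -q "$tmp/full"
for i in 1 2 3; do
    echo "$i" > "$tmp/full/f.txt"
    git -C "$tmp/full" add f.txt
    git -C "$tmp/full" -c user.email=a@example.com -c user.name=A \
        commit -q -m "commit $i"
done
# --depth 1 fetches only the most recent commit (file:// uses the smart
# protocol, so shallow fetch works; a bare local path would ignore --depth)
git clone -q --depth 1 "file://$tmp/full" "$tmp/shallow"
git -C "$tmp/shallow" rev-list --count HEAD   # prints 1
```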
You can change github.com to github1s.com to get VSCode in browser with fairly capable search without logging in. E.g. https://github1s.com/nginx/nginx
Do you have to be logged in to open a repo with the web version of vscode on GitHub? If not, that could make for a Good Enough search interface. Try pressing `.` on a repo page to see if it works
And yet, nothing stops them from continuing to offer their existing search to anonymous users. The search they have offered since inception.
They chose to take the existing search away from anonymous users to drive signups and logins. "Sign up and log in to get improved search" is not as compelling as "sign up and log in to get any search at all"
You must surely know that people use Microsoft GitHub for far more than source control, right? There's issue tracking, email notifications, CI, and GitHub Actions.
I recently tried to get a small FOSS project to switch to Codeberg. The answer was "no", because the free CI let them catch some macOS-on-Apple-Silicon bugs (the devs don't have that hardware locally), and because they are already used to GitHub, which makes it easier to onboard people and review PRs.
Quite the opposite: depending on how it's used, it is THE authoritative system for the code, which gets exponentially more valuable with more contributors.
I think it would've been a better choice to move to something they host themselves, like Gitea or GitLab. Nonetheless, it's a step in the right direction; nobody should use mail+git in this day and age.
Yup, here I was wondering why I already had it starred if this move only happened today, but then reached the same conclusion that it was probably a mirror repository before.
With a mailing list you download a patch and apply it with "git am", then push it to the repository -- as you are presumably the maintainer who has permission to do that, and assuming the patch is good. You basically just do code review through email and reading the patches with some git functions. It completely sucks in my opinion, but some people like it, or it's how they do things in those parts, etc. When in Rome...
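A self-contained sketch of that flow (throwaway repos, made-up names): the contributor exports a commit with `git format-patch` — which is what would get mailed — and the maintainer applies it with `git am`, preserving the original authorship and commit message:

```shell
set -e
tmp=$(mktemp -d)
# "Upstream" repo with one commit
git init -q "$tmp/upstream"
git -C "$tmp/upstream" -c user.email=m@example.com -c user.name=Maintainer \
    commit -q --allow-empty -m "initial"
# Contributor clones, commits, and exports the commit as an mbox-style patch
git clone -q "$tmp/upstream" "$tmp/contrib"
echo "fix" > "$tmp/contrib/fix.txt"
git -C "$tmp/contrib" add fix.txt
git -C "$tmp/contrib" -c user.email=c@example.com -c user.name=Contributor \
    commit -q -m "Fix the thing"
git -C "$tmp/contrib" format-patch -1 -o "$tmp/patches" > /dev/null
# Maintainer reviews the patch file, then applies it as a real commit
git -C "$tmp/upstream" -c user.email=m@example.com -c user.name=Maintainer \
    am "$tmp"/patches/0001-*.patch
git -C "$tmp/upstream" log -1 --format='%an %s'   # prints: Contributor Fix the thing
```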
Having done a similar rodeo in the past -- migrating a project to an actual code review tool that enforces some more rigid structure, over plain patch files -- the interim process will probably be something like:
- Previously, some key people were allowed to commit to trunk directly.
- They would read emails/patches, do code review, apply, and push them to trunk.
- For now, you can keep emailing people your patches like you did before. Nothing will change.
- But at a certain point, you'll have to use this new Other Method.
- So, you should probably get familiar with Other Method early, by using it in the meantime, so you can be ready.
- At some point, no more patch files will be accepted and you will have to use Other Method.
- In the meantime, the maintainers will do double-duty and handle both venues.
Most projects are small enough that the double duty isn't so bad. Most people will switch quickly enough, and you probably aren't dealing with 1,000 patches. It sucks, but the payoff is considered worth it.
Eventually, once this is completed, you can do things like stop pushing directly to trunk and handle all patches to main through the Other Method. But you don't have to do that. It does sound like they'll stop accepting email patches, though.
Around 15 or so years ago, when a lot of projects were moving from cvs/mailing lists to git, there were a plethora of perl scripts and other tools which automatically took the code and sent the commit to git, usually taking a "rules" file as input which stipulated how to match various email headers with git tags etc. No idea how many of them are still around or used, but there should be some
No, git is an email driven program, not a terrible-webapp driven program. You pipe the email into git am (if it's from mercurial insert hg-patch-to-git-patch into the pipeline, which iirc just rewrites date formats mainly?)
I'm sorry people are being dismissively snide and downvotive about this reasonable characterization of things (see: https://marc.info/?l=linux-kernel&m=111288700902396) because they think knowing the factoid that linux used bitkeeper is some kind of "epic own".
Mercurial has many neat features, and I much prefer working with it. I don't think Git is all bad, but I do feel sad that it has basically become an expectation that you use it, to the exclusion of all other options.
Sure, I can clone it and run grep/ripgrep - but sometimes I like the ability to search the code on the browser.
Is it only GitHub where this is a restriction or GitLab is similar?
GitLab has had this restriction for a long time.
There is too much stuff on GitHub. From a resiliency point of view and from a monopoly point of view, this is bad.
In my project, a considerable number of stars come from blank accounts, which also star non-paying projects to avoid detection.
I've moved to Codeberg now for my non-work projects.
Does someone take the mailing list updates and manually PR them into Github? I've never actually used a mailing list so I'm curious how it works.
The real economic reason to open source part of your product.