I am sure we are all grateful for Proxmox's generous donation; but if €10k is newsworthy for a Foundation with Perl's historic profile, I would be very worried.
Fundraising is hard. There's a longer history around it that I don't have the space to fully explore here, but the quick version is that I'm currently looking for more sponsors in this 10k range rather than having to rely on 100k donations from very large orgs.
Some companies immediately understand the value of this kind of support. Getting that news out will hopefully allow me to find more orgs who can/will donate in this range.
So, if anyone has any leads, please do contact me: olaf@perlfoundation.org. If you take a close look at your stack, you'll probably find Perl in there somewhere.
we live in a world where it's easier to burn $10M for a repackaged chatgpt than to have someone wire $10k for a core infrastructure project. sad reality, but reality still.
if you're motivated to do OSS work, the best bet is to figure out how to take VC money to do that and don't end up on some blacklist.
Have you ever counted how many old buildings you see in any large city's center that, if replaced/redesigned/rebuilt, would be both 100x more useful and worth 10x more?
Shouldn't their donation be weighed against the revenues they enjoy using the Foundation's labors? Is Proxmox enjoying particularly strong revenues, and doesn't their product involve much more than what Perl provides? I think the donation is pretty fair. Their success certainly owes more to things beyond Perl itself.
I think the point is that Proxmox's donation is fair, but if their fair donation is notable, there are many more businesses getting much more value from Perl which are not donating at all.
Or, be inspired that it proves there's business value in supporting the foundation? They wouldn't spend thousands of euros if they thought it was going down the drain.
Unpopular opinion that's gonna get downvoted/flagged soon based on my experience here: That just shows you how broke EU tech companies are that even 10k is newsworthy for them.
For context, I only worked for average (not big-tech/unicorn) EU/French/Dutch/German tech companies my whole life, and I was shocked to see how much more the average US tech company spends on frivolities (forget the wages) than the European companies I worked at.
From what I saw, for US tech companies big or small, buying fully decked-out MacBooks and Herman Miller chairs for their SW devs was the norm, while I only saw discount-bin chairs and base-spec HP/Dell/Lenovo laptops wherever I worked here.
Even where I work now, SW engineers get the same crappy HP notebooks that HR uses to read emails, since beancounters love fleets of the same cheap laptops for everyone. My manager has to jump through hoops to get the beancounters to let him order more powerful notebooks for the SW engineers on his team, because, ya know, Docker and VMs eat more resources than Outlook and Excel. This is relatively unheard of for US SW engineers, where little expense is spared on equipment.
They even reduced costs related to facility management like AC runtime and cleaning services, so trash bins are emptied twice a week instead of daily, leading to some funny office aromas in summer, if you catch my drift. And of course, they blame Putin for this.
I always naively assumed (mostly due to the anti-US propaganda we get here in the EU) that, because US labor and healthcare are so expensive, it would be the opposite: US companies would have to cut costs to the bone. But no, despite that, US companies still have way more money to spend than European ones. Crazy.
My explanation is that European companies are run the same way EU governments are run: on austerity and cost cutting. Instead of trying to make money by innovating, investing, and splurging to get the best of everything, they try to increase profits by cutting every possible cost, from purchasing to laborers' wages to offshoring, since I assume that's also what's taught in European MBA schools.
How many US companies are donating more than 10k to Perl? Or even that much?
If your anti-EU rant is right, open source projects should just naturally come into a whole bunch of donations this size from these rich US companies. Does that happen? Especially once we get past the big 3?
Note that Proxmox did not put out any news; the Perl Foundation did, and it is based in the USA, so I'm not sure it really shows what you're trying to suggest.
> I always naively assumed (mostly due to the anti-US propaganda we get here in the EU) that, because US labor and healthcare are so expensive, it would be the opposite: US companies would have to cut costs to the bone. But no, despite that, US companies still have way more money to spend than European ones. Crazy.
It's actually almost the opposite here in the US. Because labor is expensive, especially engineering labor, it doesn't make sense to scrimp on expenses like computer equipment or office chairs that support that labor, as you want to reduce development costs. If a faster laptop costs $500/unit more but reduces toil/wasted time for a $300k/yr developer of 30 minutes a day, it is easily justified that it's worth the cost since labor cost is so much higher.
> European companies are run the same way EU governments are run: on austerity and cost cutting
To a certain extent yes.
But that's because of margins. A large automotive parts vendor or a major biopharma company isn't generating revenue on software, so software engineering is treated as a cost center.
It's the same in the US, but at least there you have tech-first companies that generate most of their revenue from tech.
Europe's tech industry died out because of a mixture of mismanagement and bad luck. Velti [0] used to outcompete AdSense, but their management was incompetent and borderline fraudulent, and that's why the AdTech industry boomed in the US and collapsed in Europe.
The same thing happened to Nokia, Ericsson, and others.
The good European startups like Datadog, Spotify, Databricks, and UIPath all ended up relocating to the US so an entire generation of startups doesn't exist in the EU.
> they try to increase profits by cutting every possible cost, from purchasing to laborers' wages to offshoring, since I assume that's also what's taught in European MBA schools
I've dealt with MBA grads from the EU and US - they aren't taught any of this. IMO, the issue is most leadership in European companies tend to be ex-accountants from KPMG/EY/PWC/Deloitte/Mazars-type companies, because manufacturing industries like pharma, automotive, etc have severely low margins.
Been using proxmox for a home lab and I still can't believe how much value they provide for free.
I use it with Cursor and create vm templates and clone them with a proxmox MCP server I've been adding features to and it's been incredibly satisfying to just prompt "create template from vm 552 and clone a full VM with it".
I use it for DevOps at work and it’s just wonderful. The data center features alone are worth the license fees... but what I like most of all is how easy it makes managing ZFS pools.
To be fair, Proxmox is essentially a UX wrapper around QEMU/KVM, which is free software and the true kernel of value. If you are going the MCP route I wonder if a direct QEMU or libvirt MCP server would be much more powerful and precise.
While UI/UX is, as probably everywhere, a huge topic, we actually have spent most engineering power on the whole management stack. And of that, managing QEMU/KVM, while surely significant, is by far not the biggest part of our (also 100% free and open source) code bases. I'd invite you to try our full feature set, from clustering, to SDN, to integrated Ceph management, to containers, backups including third-party and our own backup server, and many more, all accessible through a REST API with full permission management and modern access control.
And we naturally try to contribute to every project we use however possible, be it in the form of patches, QA, detailed bug reports, ... and especially for QEMU et al. we're pretty content with our impact, at least compared to resources of other companies leveraging it, I think.
If all it'd take is being "just" a simple UI wrapper, it would make lots of things way easier for us :-)
While you could do that, Proxmox offers a lot of value with its UI, which I still fall back to from time to time. With just an API key I generate from Proxmox, I have a wide range of capabilities that I can hook up an MCP server to.
The funny thing is, with Cursor I can just generate a new capability; the clone and template actions, for example, were created after asking Sonnet 4.
Calling Proxmox a wrapper for KVM is hilarious; you're ignoring that Proxmox does all the work to make a functional cluster of VM servers, including things like shared storage, live migrations, and networking. If you only use Proxmox on a single server with local storage then I could see how you would say this, but having a fleet of VMs on a cluster of servers, where you can take down physical hosts to patch transparently, is the "hard problem."
Proxmox has a UI and a bunch of APIs so I don't need to rebuild them myself, and maintains everything quite well (all major upgrades I've done have been pretty seamless). Proxmox is definitely an easy path, and you still have root access for drawing outside the lines.
This is true for the act of launching VMs, but it’s pretty reductive towards the entire suite of important features that Proxmox provides like clustering, high availability, integration with various storage backends, backups, and more that qemu doesn’t.
I mean, that's actually not being fair... It's like saying Windows is just a UX wrapper around a microkernel. There is quite a bit of functionality provided by that wrapper.
What do you use all these VMs for in your homelab? I've dabbled with Proxmox in the past but settled on plain Ubuntu for my home server that I now treat as a pet managed with Ansible.
For me Proxmox is mainly a means to be able to have more than 1 pet (partially for simplicity's sake of not having to make everything play well together in the same install, partially because I have some things which require Windows and some things which require Ubuntu).
I guess I do also sometimes use it for ephemeral things without having to worry about cleaning up after too. E.g. I can dork around with some project I saw on GitHub for an afternoon then hit "revert to snapshot" without having to worry about what that means for the permanent stuff running at the same time.
I personally self-host a bunch of stuff for myself and my household. Nextcloud for my phone, mattermost for in-house communication, private wordpress as a multimedia diary, a bunch of experiments, wekan for organization, network storage, network printer.
I found Turnkey Linux pretty nice. They provide ready to use Linux images for different services. Proxmox integrates with them, so for example to install Nextcloud, all I needed to do is to click around a bunch on the Proxmox interface, and I'm good to go. They have around 80-90 images to choose from.
Most management has no idea of the importance of the open source building blocks that their business rests upon. Similarly they cannot begin to conceive of the benefits of making a donation. Probably the most effective thing you could do is to somehow attempt to copy the "greenwashing" effect of companies being environmentally responsible, and having an environmental section to their annual reports. "The health of the open source ecosystem is essential to sustainable research, development, and operations at ACME Corp. This year we have sponsored the following...for our current and future benefit".
I've long tried to figure out how we can donate to projects. If we were to buy/license those tools it would cost thousands of dollars, but I don't know how to get any money for the free tools we use. When I ask half of management doesn't understand the question, and the rest don't know either.
Perhaps look to your marketing folks rather than engineering.
"Purchasing silver sponsership with [org] as a way to grow our brand awareness" is intrinsically understandable to pretty much any businesses manager.
"Giving away money for something we already have", which is what most technical managers will hear regardless of your actual pitch, is completely inexplicable to many.
It does require that sponsership is even possible, and recurring sponsership may be harder than recurring license fees of course, so its not a sure thing, just an option to try.
Even though Varnish may not be in fashion any more, there were many companies happily using it for free and still demanding security updates.
I like their transparency about who actually supports them, and what the whole community gets for it. I wish other projects would do that, if for no other reason than to make it obvious that FOSS isn't just something that happens.
I remember at a former company, we had a major migration away from Perl 12 years ago. The Perl code base was considered extremely ancient even back then.
I am working for a company maintaining an enterprise grade software system that is primarily driven by Perl 5 and Postgres. It generates about EUR 50 million in revenue every year.
To avoid creating new Perl code from scratch we created a REST API many years ago, which new frontends and middlewares use instead of interacting with the core itself. That has been successful to some extent, as we can have frontend teams coding in JS/TypeScript without them needing to interact with Perl. But rewriting the API's implementation is risky and the company has shied away from that.
Fixing API bugs still requires diving into a Perl system. However, I found it easier to turn Python or JS devs into Perl devs than into DB engineers. So, usually, the DB subsystem bears the greater risk and requires more expensive personnel.
If you let your codebase get into an "ancient" state then that's a problem of your own creation rather than that of the language or system in which it is written.
Perl is still what I reach for when I have a regex heavy task. At my job there's a near 50/50 split between python and perl scripts. I've re-written some of the perl used for general sysadmin tasks in python too, but I haven't seen enough benefit to justify doing more. It works. Plus, in my opinion, perl is more fun to write.
I maintain and develop the back-end of a telecom services provider. It's almost 100% Perl, and once in a blue moon when we add something entirely new we usually stick with Perl.
Yes. I do. Just in case anyone doesn't know, Perl has been in active development and has been adding new features the entire time. It was never 'dead'. This is Perl's new OO interface that was recently added as a stable feature in 5.40:
It's still considered an experimental feature, just as it was in 5.38.x. Also the linked documents aren't 100% reflective of what's been implemented so far, a better URL would probably be https://perldoc.pl/perlclass.
How good are LLMs at writing Perl? I've tried to use Perl a few times, even being pretty conversant with shell scripting, sed, awk, etc. I found Perl to be difficult because it's so full of idioms that you "just have to know", and (to me anyway) TIMTOWTDI ("there is more than one way to do it") actually makes things harder for a new/learning Perl developer.
What I want is TITBWTDI (this is the best way to do it).
Check out Modern Perl. It won’t limit you to one best way for everything. It will give you one best way to do many things, and a smaller set of good ways to do others. https://isbn.nu/9781680500882
Keep in mind though that the current state of Perl includes being in the process of getting a native object model baked into the language core. So that’s still in some flux but it’ll be better than choosing among eight different external object model libraries. It’s also more performant. The docs for that I’m not sure are in a bound paper book anywhere yet, but I’d happily be corrected.
I quite enjoyed Perl Best Practices[0] for the rationales behind every decision, most of which I could get on board with. Plus, if you really like it you can auto-reformat code with perltidy[1] using the "--perl-best-practices" flag or check your files with Perl::Critic[2] policies based on PBP.
It's dear to me because it came along at a time when I needed short breaks from thesis writing.
Not bad. At my last gig (a bank) we had a modernization project where we were converting Perl to Python, and the company invested in their own self-hosted co-pilot.
It would hiccup and rewrite the existing Perl codebase into a hallucinated Python syntax, but that was two years ago.
The TIOBE index says Perl is currently the #11 most popular programming language (up from #30 a year ago). ref: https://www.tiobe.com/tiobe-index/
Now, I don't actually believe this, because that puts Perl way ahead of Rust (currently at #18). So the big thing I'm taking away from this little research post is that I no longer trust the Tiobe index. Too bad - it felt pretty reliable for a long time.
Generally only places that have a large amount of existing Perl and Perl developers. It’s a fine language with a strong ecosystem but even most Perl shops have a second backend language these days. I would happily use it again, but the only compelling reason to write new stuff in Perl is if most of your existing stuff is.
People are going to HATE me for this, but I genuinely think: This feels like beating a dead horse...
I had my Perl phase. I even wrote the first piece of code for my employer in Perl. Well, it was a CGI script, so that was kind of natural back then.
But really, since all the hollow Perl6 stuff I've seen, I've never really read or heard anything about the language in the past, what, 10 to 15 years?
There are tons of languages out there, all with their own merits. But everything beyond Perl 5 felt like someone was trying to piggyback on a legacy. If you invent a new language, that's fine, call it foobar and move on. But pretending to be the successor to Perl feels like a marketing stunt by now...
Raku has Perl DNA running through it … both languages were authored by Larry Wall, and the Raku (Perl 6 at the time) design process was to take RFCs from the Perl community and weave them together.
I do wonder why you consider Raku to be hollow? Sure, it has suffered from a general collapse of the user base and an exodus to e.g. Python. This has slowed the pace of development, but the language has been in full release since Oct 2015 and continues to have a very skilled core dev team developing and improving it.
There are several pretty unique features in Raku: built-in grammars, full-on Unicode grapheme support (in regexes), lazy evaluation, hyper operators, and so on, all of which work well together.
In hindsight, most modern languages come from a C derivative.
Perl isn't possible without C
Python isn't possible without C
Go isn't possible without C
With those languages you can't get any more raw than C.
Meanwhile, languages like Pascal, still modern today but based on ALGOL, are forgotten. Makes me wonder why such older languages were left behind. Just a ramble.
How important it is, how much it's used and the money they get.
It's nice when companies contribute fixes and testing upstream, even when it's not a monetary contribution.
[0] - https://www.businessinsider.com/how-velti-one-of-the-largest...
Also, Proxmox is Austrian.
https://github.com/agentify-sh/cursor-proxmox-mcp
https://community-scripts.github.io/ProxmoxVE/scripts
I like having a local server I can carry with me and manage using just Cursor.
So, basically, the freedom that comes with a homelab, without using the Proxmox UI or SSH.
"Purchasing silver sponsership with [org] as a way to grow our brand awareness" is intrinsically understandable to pretty much any businesses manager.
"Giving away money for something we already have", which is what most technical managers will hear regardless of your actual pitch, is completely inexplicable to many.
It does require that sponsership is even possible, and recurring sponsership may be harder than recurring license fees of course, so its not a sure thing, just an option to try.
I like their transparency about who actually supports them, and what the whole community gets for it. I wish other projects would do that, if for no other reason than to make it obvious that FOSS isn't just something that happens.
https://phk.freebsd.dk/VML/2025/
That's the point of PCRE though - you get Perl's excellent regex implementation in more places.
Perl 6?
It was so breaking they don't even call it Perl anymore.
https://github.com/Perl-Apollo/Corinna
a) there is probably several orders of magnitude more Perl code running out there in the wild than Rust?
b) the TIOBE index was ever meaningful?
Maybe unpopular, true. But hollow?
Maybe take a look at the search results from
https://duckduckgo.com/?q=perl%20conference%202025
and you'll learn that there are ongoing events related to Perl and Raku.
Nowadays the codebase of the Go toolchain and runtime contains almost no C; what remains is just for OS and C-library interoperability.
https://github.com/search?q=repo%3Agolang%2Fgo+lang%3AC+NOT+...