throwaway894345 · 6 years ago
For those who were having deja vu, this is a notification that GitHub completed its acquisition of NPM.
behindsight · 6 years ago
Indeed, and here is the accompanying HN discussion from when they first announced their intent:

https://news.ycombinator.com/item?id=22594549

dang · 6 years ago
Since that's also how the first sentence of the article puts it, we've put that in the title above.
VonGuard · 6 years ago
This is a good thing. When they were independent, NPM was a disaster area. The company spent 100% of its time chasing down social issues and insanity in the community and never figured out how to make money, or at least, it took them FOREVER to figure that out.

Years ago, they introduced "orgs" which they sat there and explained to me with slides and pictures and concepts and business bullshit for an hour. I did not understand a thing they'd said. Finally, they were like "We're selling private namespace in the npm registry for blessed packages for groups or businesses." I understood that. If they'd just said that up front....

They had some great people, some very smart folks like CJ, but they completely biffed every business decision they ever made, and when you'd go in and talk to the leadership, they were always acting as if they had some sort of PTSD from the community. I mean, people were putting spam packages in NPM just to get SEO on some outside webpage through the default NPM package webpages. People were squatting and stealing package names. Leftpad... the community management here is nightmarishly hard, and I was never convinced they'd ever make money on it. MS doesn't NEED to make money on it. They can just pump in cash and have a brilliant tool for reaching UX developers around the world, regardless of whether they use Windows or not.

I feel like the GitHub group at Microsoft is now some sort of orphanage for mistreated developer tool startups. GitHub had similar management issues: they refused to build enterprise features at all for years unless they were useful to regular GitHub.com. And there were other people issues at the top for years. Chris seemed more interested in working with the Obama administration on digital learning initiatives than with running GitHub, for example.

zozbot234 · 6 years ago
> I mean, people were putting spam packages in NPM just to get SEO on some outside webpage through the default NPM package webpages. People were squatting and stealing package names. Leftpad... the community management here is nightmarishly hard

It's not NPM's fault (well, other than wrt. the leftpad thing), it's all about the "community". The Javascript open source community is a dumpster fire.

gedy · 6 years ago
Oh please... I've solved more customer needs and delivered more business value with this "dumpster fire" than with the other "real languages" I've worked with in my career. Give it some credit.
z3t4 · 6 years ago
The JS community is the world's biggest open source community, and that's what they've been buying.
ex3ndr · 6 years ago
No, it is an NPM fault. Their team is highly toxic and selfish.
VonGuard · 6 years ago
Correct, it is not their fault. But they ended up spending all their time on it, and in an unproductive manner, I feel.
andrewaylett · 6 years ago
As an online community gets larger, the probability of a member deserving a comparison involving Nazis or Hitler approaches 1?

That the JS community is huge and overwhelmingly newer to the industry than (for example) me does not excuse bad behaviour. What it _does_ excuse is a lack of awareness of history and historical context. And the solution to that is education.

It pains me to see so many wheel-reinventions, both technical and social. But the flip side is that people who don't know something's not supposed to be possible will _sometimes_ manage to do it anyway.

NPM, on the other hand, seemed very much to be all about package management being a simple problem, and learned the hard way that there's a reason why other systems have more complexity.

Cthulhu_ · 6 years ago
They could have made a LOT of money (I think) by offering a "secure" registry - only packages that were reviewed, verified and signed would end up in there, and they would be sealed and made available forever. Companies could have their developers use only that because there is still a huge risk of a malicious actor pushing a patch version of a library with a security vulnerability in it, and at the moment security is still a reactive action in the Node ecosystem.
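For what it's worth, the client side of that would mostly be configuration; npm already lets you point every install at a single registry via `.npmrc`. A rough sketch (the registry URL here is made up):

    # project-level .npmrc: all installs in this repo resolve against the
    # curated registry instead of the public one
    registry=https://secure-registry.example.com/

    # scoped packages can be pinned to it as well
    @mycompany:registry=https://secure-registry.example.com/

The hard (and expensive) part is everything behind that URL: the review, signing, and keep-it-forever hosting pipeline.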
riyakhanna1983 · 6 years ago
What companies do you see paying for this service? Why would they prefer paying a company for this over a community-driven effort to curate secure packages?
pjc50 · 6 years ago
> I feel like the GitHub group at Microsoft is now some sort of orphanage for mistreated developer tool startups.

That seems... not so bad, really? Microsoft gets to buy them cheaply, they don't get obliterated or acquihired, and they don't seem to have Yahoo'd them into a slow death either.

redthrowaway · 6 years ago
npm is still a disaster, but for other reasons:

    $ time rm -rf node_modules/

    real 1m2.969s 
    user 0m0.409s
    sys  0m15.853s

tracker1 · 6 years ago
Are you using a 5400rpm hdd?

In an existing UI project repo... `npm ci` (which clears node_modules), then installs from the lockfile...

    added 1880 packages in 23.283s

    real    0m24.073s
    user    0m0.000s
    sys     0m0.135s
Still slower than I'd like... but I'm pretty judicious in terms of what I let come in regarding dependencies. That's react, redux, react-redux, material-ui, parcel (for webpack, babel, etc) and a few other dependencies.

For one of the API packages, `npm ci` over an existing install...

    added 1069 packages in 12.911s

    real    0m13.708s
    user    0m0.045s
    sys     0m0.076s
So either you're including the kitchen sink, or you're running on a really slow drive.

searchableguy · 6 years ago
I just use Yarn. v2 enables PnP by default, which speeds up installation by 2-3x.

https://yarnpkg.com/features/pnp
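If you want to be explicit about it (or opt back out), the linker is controlled from `.yarnrc.yml`. A minimal sketch, assuming Yarn 2 ("Berry"):

    # .yarnrc.yml
    # "pnp" is the default in v2; set "node-modules" to get the old layout back
    nodeLinker: pnp

With PnP there's no node_modules tree to write at all, which is where most of the install (and `rm -rf`) time above goes.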

ex3ndr · 6 years ago
What you're saying is like saying you don't have to put a JDK into your Dockerized Java apps. (Pick any language you want that isn't statically compiled.)
gowld · 6 years ago
It's not a bad idea for MS to sponsor this stuff as a PR play, but it needs a strong commitment to ombudsing and community oversight, or it becomes just another Embrace/Extend/Extinguish.
tomnipotent · 6 years ago
It's 2020, please stop bringing up E/E/E. That boat sailed long ago.

Dead Comment

sytse · 6 years ago
Someone asked "Would this have made sense for a company like GitLab if they didn't have the corporate backing of something like MS?" and deleted their comment while I was writing the answer below:

Being the canonical registry for a language (RubyGems) or technology (DockerHub) tends to be a huge expense.

The main expenses are cloud costs (bandwidth and storage) and security (defense and curation).

I've not seen examples of organizations turning this into a great business by itself. For example, RubyGems is sponsored by Ruby Central http://rubycentral.org/ which organizes the annual RubyConf and RailsConf software conferences.

Please note that running a non-canonical registry is a good business. JFrog does well with Artifactory https://jfrog.com/artifactory/ and we have the GitLab Package Registry https://docs.gitlab.com/ee/user/packages/ that includes a dependency proxy and we're working on a dependency firewall.

gramakri · 6 years ago
It's a mystery how DockerHub remains free. The network and storage costs must be massive. Docker also sold its enterprise business, so I am not sure who is paying for all this.
diggan · 6 years ago
If you're running a package registry, you'd be painfully unaware of your own requirements if you chose a host/cloud that charges for bandwidth.

Hosting a package registry on AWS, for example, is suicidal because of the bandwidth costs, unless you figure out a way of seriously reducing the amount of traffic (which seems to run counter to the goal of making a popular registry).

Deleted Comment

beberlei · 6 years ago
This massively depends on the architecture of the package manager, though. Composer (PHP's primary package manager) only serves the metadata and directs the installer to already-packaged ZIPs hosted on GitHub/Bitbucket/GitLab. This has allowed them to serve the whole community as a small team of 2-4 for almost a decade now without any outside funding.
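To make the division of labour concrete, the metadata Packagist serves for a package version is essentially a pointer. Roughly (illustrative, not an exact copy of the real format; the names are placeholders):

    {
      "name": "vendor/package",
      "version": "1.2.3",
      "dist": {
        "type": "zip",
        "url": "https://api.github.com/repos/vendor/package/zipball/abc123",
        "reference": "abc123"
      },
      "source": {
        "type": "git",
        "url": "https://github.com/vendor/package.git",
        "reference": "abc123"
      }
    }

Composer resolves versions against that metadata and then downloads the actual archive straight from GitHub, so the heavy bandwidth never touches Packagist's servers.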
drewbug01 · 6 years ago
> This massively depends on the architecture of the package manager, though. Composer (PHP's primary package manager) only serves the metadata

See also CocoaPods. But we're really talking about two things here as if they're the same, and I think it's because the language we use for them isn't clear: package managers (like CocoaPods and Composer) and full-blown package registries (like RubyGems and NPM). On the surface the difference between them seems slight, and there's overlap; they both resolve package dependencies and sort out problems, for example. But they're very, very different things!

It's not really fair to compare these two approaches and say things about Composer like "this allows them to serve the whole community with a team of 2-4 people": they're just not directly comparable. Those 2-4 people + the entire GitHub staff is what supports the whole community.

Now - don't get me wrong: I think it's fine that GitHub does this! I'm not saying that CocoaPods and Composer are making the wrong choice here at all. In fact, I'm proud to work for a company that provides this service to everyone. But it hand-waves away a lot of stuff as just an "architecture" decision, when in reality it's a decision not to do the hardest part - which is hosting, because it requires gobs of money and dedicated staff and people on-call 24/7.

Framing it as just a different "architecture" implies that registries like NPM are making the wrong choices by hosting their own downloads, which I think is not nearly that simple. Because what happens if GitHub or Bitbucket decided to not allow this type of usage? It would be devastating for the community - so I think it's very appropriate for package managers to make a decision about whether or not they also want to be a registry. They're considering what could happen down the road and what that would mean.

CJ's comments on all of this, re: entropic, are really good: https://www.youtube.com/watch?v=MO8hZlgK5zc

g8oz · 6 years ago
Composer is amazing, and I'd say far better than what I've seen in the JavaScript and Python worlds.
oaiey · 6 years ago
Speaking of... the NuGet registry in GitLab deserves an option to run without authentication from other builds and/or from the "public" accessing it. Currently it does not even handle the build job token, and I have to cross-connect builds using my personal access token (which, again, is accessible to other maintainers of a repo).
tr-gitlab · 6 years ago
GitLab PM here - agreed, that is an important feature and one we have scheduled for milestone 13.1, which will be released in June 2020.

As a workaround, you can add your access token as an environment variable and use that to publish/install packages via CI/CD.

And if you are interested in contributing to GitLab, that issue is a great way to get started.

blyry · 6 years ago
We stopped using GitLab's artifact repositories for this exact reason!
gowld · 6 years ago
Why is non-canonical better? Because they get to charge a fee?
sytse · 6 years ago
Yep, non-canonical registries are used by businesses, and businesses are more likely to pay. For example, our dependency firewall will be a paid feature that delays updates from packages that were recently updated under suspicious circumstances like: author changed, author information changed, different programming language, activity after a long period of inactivity, large change to the code, etc.
kortilla · 6 years ago
Yep, in particular the use case is private packages.
yhoiseth · 6 years ago
There's also https://packagist.com/. I don't know how well the business is doing.
wp381640 · 6 years ago
Can anybody comment on how hosting a package repo for someone large may help with peering? I'd imagine they would see significant upstream traffic globally, but I'm not certain whether that makes an impact.
sytse · 6 years ago
I think hosting a package repo will generally create more downloads (costs money) than uploads (helps with peering but costs storage) because the average number of downloads for a package is more than 1.

Deleted Comment

montroser · 6 years ago
I never quite got a warm-fuzzy feeling from npm -- the tool, the service, the company. This announcement does nothing to help, from my perspective. Is my dependency on this or that JavaScript library something that really needs to be owned by a for-profit company?

I also kind of wonder what is the real value of a centralized repository versus just directly referencing git repos. I haven't used this gpk[0] project yet, but it looks like an interesting alternative, on paper.

[0]: https://github.com/braydonf/gpk

coderzach · 6 years ago
You'd be surprised how often git repos disappear when you have 100s or 1000s of deps.
Waterluvian · 6 years ago
You can still reference repos directly with npm.
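For example, a package.json dependency can point straight at a git host instead of the registry. A quick sketch with made-up package names:

    {
      "dependencies": {
        "some-lib": "git+https://github.com/someuser/some-lib.git#v1.2.0",
        "internal-lib": "git+ssh://git@github.com/mycompany/internal-lib.git#semver:^2.0.0"
      }
    }

You do give up the registry's immutability guarantees (a tag can be force-moved), which is part of why many people still prefer the registry for anything beyond a handful of deps.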
KenanSulayman · 6 years ago
Much better: mandatory vendoring of packages. It can't break, and being forced to push the packages to the repo makes you appreciate the lack of transitive dependencies.
Touche · 6 years ago
Immutability and semver are the reasons
montroser · 6 years ago
Can't we get all that directly from git repos and signed tags?
animalCrax0rz · 6 years ago
This brought to mind the thought that while Deno is still a WIP (for example, packaging of Rust plugins is not yet resolved) and the ecosystem around it barely exists, it was designed to have no dependency on third-party tools like npm and yarn.
jakear · 6 years ago
It also provides none of the benefits of npm/yarn. In my understanding it’s as if every package you used pinned all of their deps.
Tistron · 6 years ago
Wouldn't you just include a file by its minor version like this:

  import * as E from 'https://cdn.jsdelivr.net/npm/fp-ts@2.5/lib/Either.js';
Whenever you `--reload` you'll get the latest 2.5.x release, no?

This should work out recursively for all the deps if they follow the same pattern, no?

animalCrax0rz · 6 years ago
I think there's some confusion here.

NPM is both a massive repository and a package manager.

Deno will (soon?) have a package manager, but it won't be tied to one central repository run by a private company, now part of a massive corporation (Microsoft.)

So let's not lump the package manager and the repository together: you can have a package manager that pulls in all the exact or latest dependencies at build time, but it does not have to be tied to one central repository owned and administered by one company.

Centralize-able yet decentralized.

tracker1 · 6 years ago
I'm a bit mixed... that said, it's only a small step apart from what Rust does for packages...

I would like to see a package repo for deno, if only to ease publishing/finding modules.

okareaman · 6 years ago
The Deno team recently blessed Pika

https://dev.to/pika/introducing-pika-cdn-deno-p8b

mtm7 · 6 years ago
Out of curiosity, what benefits does Microsoft/GitHub get from owning a package registry? I'd be fascinated to learn more about their long-term strategy here.
jawns · 6 years ago
Others have commented on why the acquisition makes strategic sense from a technological perspective, but I think it's also important to consider how it makes strategic sense from a psychological perspective. For a long time, devs loved bagging on Microsoft. I once saw some Microsoft guys demo something cool at a conference, and they had to basically apologize that they were from Microsoft, because dev sentiment toward the organization was so negative, even though they were doing cool stuff.

The acquisition of GitHub was absolutely intended to capture a tool/ecosystem that developers liked using and benefit from that positive sentiment. That's why Microsoft has been so cautious about branding GitHub as a Microsoft property out the gate. It's trying to ease devs into the idea that the company is something devs can like, and I wouldn't be surprised if this psychological strategy is at work with the npm acquisition, too.

DeathArrow · 6 years ago
>I once saw some Microsoft guys demo something cool at a conference, and they had to basically apologize that they were from Microsoft

Microsoft, "the people" were never bad. They did a lot of cool things, they had great coders and scientists. The upper management did bad things, and made poor decisions, guys like Ballmer and business strategists.

With the new management things have changed a lot, and I hope Microsoft will keep doing good things.

Not all things are rosy: desktop development with MS tools sucks, they changed framework after framework, Windows Forms and WPF aren't cross-platform, and C++ desktop programming is still Windows-only and not "Visual" at all. I don't get the naming "Visual C++", since Qt or C++ Builder are much more "visual" than MFC and the Win32 API.

But I guess desktop doesn't bring MS much money, so they don't care enough about it. If that's the case, I don't get why they don't promote and support a third-party, cross-platform development framework like Uno or Avalonia.

janee · 6 years ago
> capture a tool/ecosystem that developers liked using and benefit from that positive sentiment

So...appeal to devs, something something, money??

I really like how MS is improving our tools and embracing open source, I really do. But I've never quite understood how the return on investment in these things justifies the cost. I just struggle to see an obvious big picture here.

I.e., is it incorrect to think of the GH acquisition as mostly an Azure marketing expense?

syshum · 6 years ago
They can, and will have to continue to, thank MS legal for that bad rep.

While upper management has lessened some of their excesses, they still tend to be very aggressive in some areas, and time will tell which half of the company will win: the open-collaboration group or the "we want to control the world" group. There is still an internal struggle there.

Many believe the "we love open collaboration" stance is just a facade and the "real Microsoft" will reveal itself in a few years.

lucidone · 6 years ago
I agree with this take. I'm primarily a JavaScript/TypeScript developer, doing both front-end and back-end web app development. Microsoft being a good steward of TypeScript (and now the JS ecosystem) inclines me to give C# and .NET Core a shot, and probably Azure down the line.
chrisweekly · 6 years ago
VSC has been huge in this regard, too.
gibs0ns · 6 years ago
I for one appreciate that Microsoft is allowing these dev-related acquisitions, such as GitHub, to maintain an individual brand image. I can't think of anything worse than if they decided to reskin/theme these brands to be all Microsoft-y.

Deleted Comment

Deleted Comment

metreo · 6 years ago
Accruing positive sentiment in the dev community through acquisitions of someone else's nice things. Who came up with that crazy idea? GitHub isn't Microsoft. You can't buy brand loyalty.
sneak · 6 years ago
Control.

Getting to decide what goes into npm (the client tool) and what does not allows them to focus the ecosystem onto brands that they own and encourages people to buy their proprietary software and services.

It also provides them the option of cutting off third-party tools (e.g. yarn) in the future if they deem it beneficial.

My prediction is that they will prioritize the platform features that can only be accessed by first-party, branded tools, like the forthcoming GitHub mobile app. Eventually they will stop maintaining support for the APIs that the other tools use, and it'll be Microsoft tools end-to-end. Pushing to GitHub and publishing to NPM from right within VS Code, etc.

Of course, using them on Windows will always work best, and deploying to Azure will always be easiest.

lstamour · 6 years ago
I'm less pessimistic. Microsoft has engineers with a business mindset in charge; for now, that's true. But we saw Node vs. io.js: Microsoft can't afford a fork, or they lose their own control. They won't close off the open source; they'll add subscription or cloud-usage fees for large enterprises. The value, as with the "open core" model, is that the open part is popular while the enterprise features are expensive. Enterprises are being led in by open source now because it's cheap, then sold the value-add that solves the problems they still have around control and oversight, etc.

If anything, I expect basic Visual Studio internals will eventually get open sourced as cheaper to maintain that way than Roslyn rebuild-all-the-things. And VS Code will adopt them and continue to cannibalize VS mindshare.

lioeters · 6 years ago
The first word that came to mind was "integration", but I suppose "control" is another way of putting it.

I doubt Microsoft would intentionally degrade developer experience, like "cutting off third-party tools". Rather, they're seeking to gain market advantage from the tight integration of services. It would make sense for them to encourage third-party tools to play well in that "Microsoft ecosystem".

> Pushing to GitHub and publishing to NPM from right within VS Code

That's exactly what I picture coming soon, if not here already. Also: develop in VS Code, click to build, push to GitHub, deploy to Azure.

saghm · 6 years ago
It's worth noting that TypeScript is a Microsoft product. As TypeScript has become increasingly popular, I'd imagine that Microsoft has also paid increasing amounts of attention to the JavaScript ecosystem.
muglug · 6 years ago
Yup. There’s a great memo from 1995 describing the web as the next big platform: http://pstella.com/reading/THEWEB.pdf

Looking at the massive growth in frontend technologies within the last decade, it’s easy to see why Microsoft wants to be a little ahead of the curve this time.

prosim · 6 years ago
It's all about securing the software supply chain. Mark Russinovich had a keynote on the general topic at RSA 2020, especially the section on package managers from 0:29 onwards: https://www.rsaconference.com/industry-topics/presentation/c...
metreo · 6 years ago
Ironic really given that Microsoft has the most notoriously insecure products I can think of.

Can anyone point to a technology company that has a larger attack surface or a more fundamentally insecure product?

koolba · 6 years ago
Synergy. End-to-end tracing of dependencies. Licensing mirrored subsets to on-premise clients. A single point of flow from code to published package. There are plenty of places to extend the money they're making elsewhere.
thelastbender12 · 6 years ago
Lowering the entry barrier for JS developers increases the plausible TAM if you make money from downstream developer tools/services, sometimes described as "commoditizing your complement". (ref https://www.joelonsoftware.com/2002/06/12/strategy-letter-v/)
DeathArrow · 6 years ago
They get developer mind share and goodwill. That means Microsoft selling more of its services to companies.

The old model of business was like selling pies to dad. Dad might buy or not. Now, you give a free candy to the kid, and dad will buy the pie, too. :)

bobmaxup · 6 years ago
spooky music

Maybe to be able to be "legally compelled" to distribute backdoors onto Linux servers / developer machines running npm, in exchange for being awarded the JEDI contract.

mrscottson · 6 years ago
I'm sure the PRISM/NSA angle is still in play. How easy is it to compromise millions of projects in production when you own both GitHub and npm?

Deleted Comment

metreo · 6 years ago
They want to buy a community of developers.
empath75 · 6 years ago
Think webassembly, not node.
rl3 · 6 years ago
Curious world we live in, where the infrastructure behind so many OSS projects can simply be acquired.

What's preventing the dream of decentralization from taking off? We have the technology.

mjibson · 6 years ago
Money prevents it. It takes money to host things and pay people to work on infrastructure. While people often volunteer to contribute to OSS products because they like or use them, not many are willing to write infrastructure that can handle this kind of traffic in their spare time. Even if you can find someone to donate the time, you'd still need to fund that infra in some way. Having an infra company (say, Google donates a bunch of GCP credits) to cover the hosting costs still puts the project at risk if the host company decides to stop funding.
metreo · 6 years ago
I seed several torrents for OSS, it's not much but that's about all I can do to help.
pluc · 6 years ago
Whatever happened to people hosting things on an old computer in their basement? That used to be more popular back in the days before the cloud came about and before we had these stable broadband connections. Obviously an infra like npm couldn't be delivered with such a setup, but at scale, who knows.
toomuchtodo · 6 years ago
Start a non-profit 501(c)(3). Only a non-profit can acquire the assets of another non-profit, so it's somewhat of a poison pill. Budgets for orgs listed below are anywhere between $100k up to a few million dollars per year. This doesn't mean you can't run on a shoe string; Hacker News and Pinboard run on single servers with hot spares.

Examples: Let's Encrypt (Certs), Internet Archive (Culture), Quad9 (DNS), Wikipedia (Knowledge), OpenStreetMap (GIS), Python Packaging Authority ["PyPI"] (as part of the Python Software Foundation)

EDIT: Seriously, start non-profits whenever considering implementing technology infrastructure you're unlikely to want to extract a profit from and are seeking long term oversight and governance.

Endlessly · 6 years ago
There are laws that non-profit executives & boards must follow, but there is nothing stopping any of the non-profits you listed from selling any of the assets you listed as long as doing so is in the best interest of the non-profit; which is at best a subjective question, and at worst easily dismissed given properly crafted pretext & reasoning.

Generally speaking, much like for-profits, it is the people who run it that decide its culture, not the legal structure.

hprotagonist · 6 years ago
open whisper systems
paxys · 6 years ago
There's nothing stopping someone from pulling code from an alternate package registry, or directly from someone's computer. So all the required decentralization infrastructure already exists. For people to use it, though, it has to be convenient.

The average developer isn't interested in showcasing their social/political views, starting a revolution or building the future of the internet. They just want to get the job done as quickly and effectively as possible and go home.

mlyle · 6 years ago
Are you interested in mitigating risk by reducing central dependencies? I am. But it's too costly right now.

All else being equal, though, we should try to avoid technical systems with unnecessary trust relationships and critical concentrated dependencies.

And if we must have a concentrated dependency, we should do our best to pick very trustworthy ones-- both based on their track record and their likely future interests relating to organizational structures.

ilaksh · 6 years ago
I remember some years ago when npm was having a lot of stability issues because the guy just was not being given adequate time/resources from Joyent or whatever, I made a post on r/node saying now was the time for a fully decentralized package registry.

The post if I remember was mostly ignored, but received a few downvotes and maybe a couple of negative comments.

Based on that, it seems that what's preventing decentralization from taking off is ignorance and apathy.

A day or two later the guy announced npm, Inc. if I remember.

There are actually a lot more developers that have accepted a federated services worldview than a peer-based fully distributed one. But there are package registry projects along both of those lines.

https://github.com/orbs-network/decentralized-npm

https://blog.aragon.one/using-apm-to-replace-npm-and-other-c...

https://github.com/entropic-dev/entropic

https://github.com/axic/mango

But again, ignorance, apathy, and the status quo remain the most popular options.

imtringued · 6 years ago
From my own attempts at developing a decentralized service I've learned that there is no free lunch. Consensus is still unsolved without resorting to a nuclear option like a blockchain. Federation is a far better strategy. Let people host their own instances and let instances interact with each other.
ocdtrekkie · 6 years ago
NPM by its very nature is a centralized repository, not really an agent of decentralization where you'd get code from the authors' sites/servers directly.
briffle · 6 years ago
Ubuntu and Debian have mirrors of their apt repositories hosted all around the world by groups, schools, businesses, etc. There is no single point of failure, you grab from the best mirror near you.
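Concretely, switching to a nearby mirror is just an edit to the apt sources. A sketch (the mirror hostname is an example):

    # /etc/apt/sources.list: point at a local/university mirror instead of the default
    deb http://mirror.example.edu/ubuntu/ focal main universe
    deb http://mirror.example.edu/ubuntu/ focal-updates main universe
    deb http://mirror.example.edu/ubuntu/ focal-security main universe

Any HTTP host with enough disk can play that role, which is what makes the mirror model so resilient.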
xrendan · 6 years ago
Who pays for that decentralized infrastructure though?
ghayes · 6 years ago
This is the dream of the token model for networks like Bitcoin, Ethereum or Filecoin / ipfs.
guerrilla · 6 years ago
it can be the cost of access
DeathArrow · 6 years ago
>What's preventing the dream of decentralization from taking off? We have the technology.

Time and money.

tracker1 · 6 years ago
Discoverability mostly... the social aspect is a second, smaller issue.

I think this is probably a good thing in general, and should maybe lead to some interesting enhancements, and maybe even finally solve the distribution of binary modules at a better level.

arkanciscan · 6 years ago
See: left-pad
doctoboggan · 6 years ago
Question from a new JS developer: Should I be using NPM to manage my dependencies?

I have recently started getting into JS programming. I have thus far avoided NPM, because I've been trying to use CDNs for all my external dependencies.

My thinking is that it saves me bandwidth costs and potentially saves my user's bandwidth as well if they get a cache hit.

I get the downsides are that I don't control the CDN and they could go offline, but honestly I expect I am much more likely to go down from some mistake in my own deployment rather than a well known CDN being offline.

I am wondering if I am missing something though, because absolutely every JS package I read about suggests you use NPM (some also link a CDN, many don't). Should I be using NPM to manage my JS dependencies instead of using CDNs?

giantDinosaur · 6 years ago
IIRC it turns out the cache-hit rate for CDN'd JavaScript files ended up being fairly low and negligible, due to how many different versions there are of everything. Better to just reduce the file size.
doctoboggan · 6 years ago
I feel like Bootstrap and jQuery stand a decent chance of being cached for a large enough portion of the user base.

And even ignoring my user's bandwidth, it would still save me significant bandwidth (depending on the size of my website).

I guess eventually your site might grow such that your dependencies are not a significant portion of your total download size, but I am not currently there.

ehnto · 6 years ago
If you can afford to load an image, you can afford to send a JS file. Compile all your JS into one file and host it yourself. The original sell for JS CDNs was that the library would already be cached from the user visiting another site, and that it would be served from a local edge. It's really not that big of a sell, and it comes with a bunch of risk.

CDNs are far less reliable than my own site, and if my own site is down it's not much help that the CDN is up. Pull in two libraries from CDNs and suddenly you have three points of failure instead of one. Their traffic spikes become your traffic spikes, their downtime is your downtime. And for what? The possibility that maybe the user had that one tiny js library cached, or that the CDN has a node 100ms closer? Not worth it.
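If it helps to see what "host it yourself" looks like in practice, the usual flow is: install the libraries from npm, bundle them into one file, and serve that file from your own origin. A minimal sketch, using esbuild as one possible bundler (the file names are placeholders):

    # pull the deps you'd otherwise load from a CDN
    npm install jquery bootstrap

    # bundle your entry point plus everything it imports into one minified file
    npx esbuild src/app.js --bundle --minify --outfile=dist/app.js

Then a single <script src="/dist/app.js"></script> replaces the handful of CDN tags.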

lioeters · 6 years ago
Personally, I only use external library CDNs during early stages of development, or quick prototypes.

There are advantages to having all needed assets locally. The main point for me is to minimize external dependencies during runtime - fewer points of failure. Also, vendor libraries can be a single minified bundle served from the same domain. In production they can be moved to a CDN, e.g., Cloudflare.

Using NPM makes sense once you start having more than a few dependencies, or a build step.

On the other hand, if you can get by with library CDNs and don't feel the need for NPM - I'd say that sounds fine, to keep it simple and practical.

armatav · 6 years ago
Good luck not using NPM - it's completely pervasive. CDNs should be used in certain contexts; for everything else you're best off using NPM.
doctoboggan · 6 years ago
In which contexts should I use CDNs and why am I otherwise better off with NPM?