racbart · 8 years ago
PSA: Please be cautious, because this is an excellent opportunity for malicious people to take over packages and inject malware.

Example: https://www.npmjs.com/package/duplexer3 which has 4M monthly downloads just reappeared, published by a fresh npm user. They've published another two versions since then, so it's possible they initially republished the unchanged package but are now messing with the code.

Previously the package belonged to someone else: https://webcache.googleusercontent.com/search?q=cache:oDbrgP...

I'm not saying it's a malicious attempt, but it might be, and it very much looks like one. Be cautious, as you might not notice if some packages your code depends on were republished with malicious code. It might take some time for npm to sort this out and restore the original packages.

incogitomode · 8 years ago
I just tested, and it definitely looks like a troll / hack.

> duplexer3@1.0.1 install /Users/foo/Code/foo/node_modules/duplexer3
> echo "To every thing there is a season, and a time to every purpose under the heaven: A time to be born, and a time to die; a time to plant, and a time to pluck up that which is planted; A time to kill, and a time to heal; a time to break down, and a time to build up; A time to weep, and a time to laugh; a time to mourn, and a time to dance; A time to cast away stones, and a time to gather stones together; a time to embrace, and a time to refrain from embracing; A time to get, and a time to lose; a time to keep, and a time to cast away; A time to rend, and a time to sew; a time to keep silence, and a time to speak; A time to love, and a time to hate; a time of war, and a time of peace. A time to make use of duplexer3, and a time to be without duplexer3."

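The quoted lines are the output of an npm lifecycle script, which can run any shell command on the installing user's machine. As a hedged sketch of the mechanism (the package name and echoed text below are illustrative, not the actual duplexer3 contents), a package.json only needs a `scripts` entry:

```json
{
  "name": "innocent-looking-package",
  "version": "1.0.1",
  "scripts": {
    "install": "echo \"any shell command placed here runs on every npm install\""
  }
}
```

Replace the `echo` with something nastier and every consumer of the package executes it automatically, which is why a hijacked name with millions of monthly downloads is so dangerous.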

logantpowell · 8 years ago
I got this today as well! WTF? It showed up today and is preventing me from using npm, node, etc...
maxander · 8 years ago
And all this is happening just after the public release of a serious exploit which allows malicious code to do all sorts of nefarious things once it is somehow installed on the target machine. Hmm.

Given that there are hints, at least, that the problems were caused by some particular developer's actions, I now wonder about the security model for package-managed platforms altogether. If I were a big cybercrime ring, the first thing I'd do would be to get a bunch of thugs together and knock on the front door of a developer of a widely-used package: "help us launch [the sort of attack we're seeing here] or we'll [be very upset with you] with this wrench." Is there a valid defense for a platform whose security relies on the unanimous cooperation of a widely-scattered developer base?

ytpete · 8 years ago
With cases like the current one, or the leftpad incident in 2016, I'm surprised package registries still allow recycling old package names after a package was deleted. Really seems like deleted packages should be frozen forever - if the original author never recreates it or transfers ownership, then people would have to explicitly choose to move to some new fork with a new id.

But your point about pressuring or bribing package authors still stands as a scary issue. Similar things have already happened: for example, Kite quietly buying code-editor plugins from their original authors and then adding code some consider spyware (see https://news.ycombinator.com/item?id=14902630). I believe there were cases where a similar thing happened with some Chrome extensions too...

ioquatix · 8 years ago
You could sign packages and record their signatures along with the version. Which, coincidentally, is basically what https://teapot.nz does, e.g.: https://github.com/kurocha/geometry/blob/master/development-...

Although I've never considered this in the case of an actual attack. It would make sense to fingerprint the entire source tree and record that somewhere too, so when you build it you know you are getting the right thing. Teapot basically defers this to git.

tetha · 8 years ago
> Is there a valid defense for a platform whose security relies on the unanimous cooperation of a widely-scattered developer base?

The defense is staged deployment and active users. This obviously depends on the bluntness of the malicious code.

If I may assume easily noticed effects of the malicious code: a dev at our place - using Java with Maven - would update the library, and his workstation would get owned. This could have impacts, but if we noticed, we'd wipe that workstation, re-image from backup, and get in contact with Sonatype to kill that version. This version would never touch staging, the last step before prod.

If we don't notice on the workstation, there's a good chance we or our IDS would notice trouble on our testing servers or our staging servers, since staging in particular is similar to prod and subject to load tests similar to prod load. Once we're there, it's back to bug reports with the library and contact with Sonatype to handle that version.

If we can't notice the malicious code at all due to really, really smart activation mechanisms... well, then we're in NSA conspiracy land again.

dmitriid · 8 years ago
On top of that, the way countless packages are used everywhere is potentially exploitable: https://medium.com/@david.gilbertson/im-harvesting-credit-ca...
MagicWishMonkey · 8 years ago
Wouldn't you need to install those packages as root for the code to have privileges to take advantage of that exploit?
wybiral · 8 years ago
An IPFS model would help. People would use a strong hash of the package or something.
jannotti · 8 years ago
That's a scary scenario, and all too possible.
weinzierl · 8 years ago
A detailed description of what you could do with a malicious npm package is currently on the front page: "Harvesting credit card numbers and passwords from websites"

https://news.ycombinator.com/item?id=16084575

marpstar · 8 years ago
am I the only one who thinks this could be more than a coincidence?
wybiral · 8 years ago
NPM doesn't make the package names unavailable after removal???

EDIT: That would be a massive security problem!

no29 · 8 years ago
maybe it's time to push for adding signed packages to npm

long discussion here: https://github.com/node-forward/discussions/issues/29

edejong · 8 years ago
I am very surprised that a package manager of this calibre and impact abstains from best practices when it comes to authentication through code-signing. Other package managers are miles ahead of npm. For example, Nix, which uses immutability and hashing to always produce the same artifact from the same sources.
naasking · 8 years ago
Signing won't help unless the end user specifies the signature or certificate that they expect (signing would only help ensure package upgrades are from the same author).

If you're going to have clients specify a signature anyway, then you don't need to sign packages; you just need a strong one-way hash function, like SHA-512. The user executes "pkg-mgr install [package name] ae36f862..."

Either way, every tutorial using npm will become invalid.

Pyrodogg · 8 years ago
I'm surprised there wasn't a global lock-down on new package registrations (or at least on the known names of lost packages) while they were working to restore them.
swang · 8 years ago
didn't npm make some changes where a published package name cannot be republished, at least not without npm intervention?
eropple · 8 years ago
Yes, but the packages disappeared. That people can dupe these suggests that the database was modified.
therein · 8 years ago
I thought so too. I thought they did that after the left pad incident.
csdreamer7 · 8 years ago
How does RubyGems handle a package being removed and replaced by a different (and maybe malicious) actor? Not allow a package to be deleted? Block the package name from being claimed by someone else?
eric_h · 8 years ago
From http://help.rubygems.org/kb/gemcutter/removing-a-published-r...:

> Once you've yanked all versions of a gem, anyone can push onto that same gem namespace and effectively take it over. This way, we kind of automate the process of taking over old gem namespaces.

codetoliveby · 8 years ago
Shit. That's a good point. I downloaded the Heroku CLI during the attack and it uses duplexer3. I got a weird message that seemed "off" during postinstall.
wybiral · 8 years ago
Wait, they both say username = floatdrop [1] for me. What did they say for you?

[1] https://twitter.com/floatdrop

seldo · 8 years ago
Hi folks, npm COO here. This was an operational issue that we worked to correct. All packages are now restored:

https://status.npmjs.org/incidents/41zfb8qpvrdj

yashap · 8 years ago
Were any of the deleted packages temporarily hijacked? It strongly seems like this was the case. If so, please confirm immediately so people who installed packages during this time can start scanning for malware.

Even if the answer is “yes, 1+ packages were hijacked by not-the-original author, but we’re still investigating if there was malware”, tell people immediately. Don’t wait a few days for your investigation and post mortem if it’s possible that some users’ systems have already been compromised.

electric_sheep · 8 years ago
I would also hope for and expect this to be communicated ASAP from the NPM org to its users.

@seldo, I understand that you don't want to disseminate misleading info, but an abundance of caution seems warranted in this case as my understanding of the incident lines up with what @yashap has said. If we're wrong, straighten us out --- if we're not, please sound an advisory, because this is major.

nnutter · 8 years ago
Seems like you should have frozen publishing instead of saying, "Please do not attempt to republish packages, as this will hinder our progress in restoring them." Especially to prevent even temporary hijacking.
thsowers · 8 years ago
Any chance of a technical write-up so that we can all learn from whatever happened?
seldo · 8 years ago
Absofuckinglutely. It's being done as we speak.
xwvvvvwx · 8 years ago
What was the root cause of the issue?
chrisfosterelli · 8 years ago
Yes I'd be very curious to see a debrief on what the technical cause was. Thanks to the npm team for a quick weekend fix, at any rate!
drdrey · 8 years ago
Or rather: what were the contributing factors of the issue?
seldo · 8 years ago
Update: (this is not the post-mortem, this is just more detail) http://blog.npmjs.org/post/169432444640/npm-operational-inci...

rootlocus · 8 years ago
> I was here.

> We made history! Fastest issue to reach 1000 comments, in just 2 hours.

> cheers everyone, nice chatting with you. 17 away from hitting 1000 btw!

> Is GitHub going to die with the volume of comments?

Kind of disappointed the npm community is turning GitHub into Reddit right now.

TeMPOraL · 8 years ago
There's probably a large overlap between the two communities.
xauronx · 8 years ago
Considering almost every human I know uses Reddit in some capacity (technical and non-technical), that's pretty likely.
nukeop · 8 years ago
NPM is extremely vulnerable to typosquatting. Be cautious with what you install. The install scripts can execute arbitrary code. The npm team's response is that they hope malicious actors won't exploit this behaviour. According to my tests, typosquatting 3 popular packages lets you take over around 200 computers in the 2 weeks it takes their moderators to notice.
nukeop · 8 years ago
That's okay, but it's not enough - it's easy to swap two letters and do similar substitutions to fool many users. If a package is downloaded 10,000 times every day, surely once in a while someone will misspell the name somehow.

Other than that, their reaction to similar incidents was to wait for somebody on Twitter to notify them, ban the responsible users, and hope that it won't happen again. It's still extremely exploitable, and there are surely many other novel ways of installing malware using the repository that we haven't even heard of yet. The npm security team is slow to act and sadly doesn't think ahead. They're responsible for one of the largest software ecosystems in the world; they should step up their game.
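The letter-swapping attack described above is exactly what an edit-distance check at publish time could catch. A sketch (the `POPULAR` list and threshold of 2 are illustrative choices, not anything npm actually runs):

```javascript
// Sketch: flag new package names that sit within Levenshtein distance 2
// of a well-known package - a cheap typosquat heuristic a registry
// could apply when a name is registered.
function editDistance(a, b) {
  // Standard dynamic-programming Levenshtein distance.
  const dp = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1, // deletion
        dp[i][j - 1] + 1, // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1) // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

const POPULAR = ["lodash", "express", "request"];

function looksLikeTyposquat(name) {
  return POPULAR.some((p) => p !== name && editDistance(p, name) <= 2);
}
```

A registry could hold such registrations for human review rather than rejecting them outright, since legitimate near-miss names do exist.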

hateduser2 · 8 years ago
Why assume they’ve already seen it? They probably just haven’t
corndoge · 8 years ago
typical JavaScript engineering
nukeop · 8 years ago
Javascript is a very handy language, it's held back by all the gymnastics it needs to do to get over browser/www limitations, and an influx of low skill developers with no diploma.
sergiotapia · 8 years ago
Yikes, what is it about node/npm/javascript that makes it feel like a house of cards?
DougWebb · 8 years ago
> Yikes, what is it about node/npm/javascript that makes it feel like a house of cards?

I think the (short) answer is "node, npm, and javascript".

The longer answer has something to do with the automatic installation of dependencies, and the common use of shell scripts downloaded directly off the internet and executed using the developer's or sysadmin's user account.

I used to use CPAN all the time. CPAN would check dependencies for you, but if you didn't have them already you'd get a warning and you'd have to install them yourself. It forced you to be aware of what you're installing, and it applied some pressure on CPAN authors to not go too crazy with dependencies (since they were just as annoyed by the installation process as everyone else.)

These days I use NuGet a lot. It does the dependency installation for you, but it asks for permission first. The dialogs could be better about letting you learn about the dependencies before saying they're ok. (In general, NuGet's dialogs could be a lot better about package details.)

zbentley · 8 years ago
> CPAN... forced you to be aware of what you're installing

I think CPAN is pretty sweet for variety/wide reach of packages available, but this is flat-out wrong.

CPAN is not a package manager; it is a file sprayer/script runner with a goal of dependency installation. That's perfectly sufficient for a lot of use cases, but to me "package manager" means "program that manages packages of software on my system", not the equivalent of "curl cpan.org/whatever | sh".

CPAN packages can (and do by very common convention) spray files all over the place on the target system. Then, those files are usually not tracked in a central place, so packages can't be uninstalled, packages that clobber other packages' files can't be detected, and "where did this file come from?"-type questions cannot be answered.

Whether CPAN or NPM "force you to be aware of what you're installing" seems like the least significant difference between the tools. When NPM tells you "I installed package 'foo'", it almost always means that the only changes it made to your system were in the "node_modules/foo" folder, global or not. When CPAN tells you "I installed package 'foo'", it means "I ran an install script that might have done anything whoever wrote 'foo' wanted; hope that script gave you some verbose output and told you everything it was doing! Good luck removing/undoing its changes if you decide you don't want that package!"

There are ways around all of those issues with CPAN, and plenty of tools in Perl distribution utilities to address them, but they are far from universally taken advantage of. CPAN is extremely unlike, and often inferior to, NPM. Imagine if NPM packages did all of their installation logic inside a post-install hook; that's more like a CPAN distribution.

bartread · 8 years ago
I had very limited contact with CPAN some years ago but I imagine it was slightly more sane in terms of granularity of dependencies.

Whereas a lot of npm modules are relatively small - some tiny - and have their own dependencies. So a simple "npm install blah" command can result in dozens of packages being installed. Dealing with that manually would, in fairness, be a giant chore.

Now of course there's a discussion to be had about whether thousands of weeny little modules is a good idea or not but, to be honest, that's a religious debate I'd rather steer clear of.

username223 · 8 years ago
> I used to use CPAN all the time...

CPAN has a setting that force-feeds you dependencies without asking, but I don't think it's on by default. Also, CPAN runs tests by default, which usually takes forever, so users get immediate feedback when packages go dependency-crazy. The modern Perl ecosystem is often stupidly dependency-heavy, but nothing like Node.

corpMaverick · 8 years ago
Also. In CPAN there was a culture of having comprehensive unit tests. If something broke, you would likely see it at installation.
foepys · 8 years ago
I have recently taken over an Angular project (with a C# backend, thankfully) at my job. It took two hours to get it to even compile correctly because some dependencies were apparently outdated in package.json and it just ran on the other dev's machine by accident. I don't understand why I need over 100 dependencies for a simple Angular Single Page App that pulls JSON from the backend and pushes JSON back. Meanwhile, the C# backend (a huge, complicated behemoth of software) ran on the first click.
ep103 · 8 years ago
Three developers on my team spent the last 4 years pushing for angular. Four years ago, I was 50/50 on it vs react, so whatever, but if my team's really for it, let's do it.

Fast forward to angular 2, and we're down to two developers who are still for it.

Fast forward to today, I'm down to one angular dev who's still for it, and two of the original three have left for react jobs. Meanwhile, I'm left with a bunch of angular 1 code that needs to be upgraded to angular 2, and a few testing-out-angular-2 projects that are dependency hell.

The only reason I ultimately embraced angular 1 to begin with (above reasons aside), was because it was so opinionated about everything, I could throw it at my weaker developers and say: "just learn the angular way to do it", and there was very little left they could meaningfully screw up. Angular proponents on the team would see it as a point of expertise to teach the "angular way" to more junior devs, and everyone left the day feeling good.

When it comes to JavaScript, 95% of the difficulty with writing good maintainable code is ensuring that your team is all writing to a very exact and consistent quality and style, since there are so many different ways you can write JS, and so many potential pitfalls. And if the team all wants to embrace Google's Angular standard, that works for me. It's far easier to point to an ecosystem with an explicit, opinionated way of writing code than it is to continuously train people on how to write maintainable code otherwise.

But with Angular 2, if you haven't been drinking the Kool-Aid for a while now, it requires so much knowledge just to get running that I can't even have junior devs work on it without a senior dev who's also an Angular fanboy there to make sure everything is set up to begin with. It's absurd. And I'm supposed to sell the business on migrating all my Angular 1 code to this monstrosity? And then spend time every 6 months making the necessary upgrades to stay up to date? Get real.

bartread · 8 years ago
Mark, is that you?

Kidding - but we had exactly the same problem, except with a React app rather than an Angular one just before Christmas.

No joke with this statement though: every time we have a time-consuming build issue to deal with, it comes down to some npm dependency problem. Honestly, if there were a way we could realistically ditch npm (NO, YARN IS NOT ANY BETTER - to preempt that suggestion - it's simply an npm-alike) I'd happily do so, but sadly there isn't.

newfoundglory · 8 years ago
The basic explanation is that the dependencies for the angular app are much smaller, but I’m not sure which bit is confusing you. You don’t understand why an incorrectly written program required work to run when a bigger but correctly written program was easy?
abritinthebay · 8 years ago
You are describing bad development practices.

Not sure why you’re stuck on the number of deps either - as long as they’re small who cares?

tgtweak · 8 years ago
Does that backend use nuget for dependencies?
__s · 8 years ago
Thankfully we now have package-lock.json
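package-lock.json helps here because it pins each dependency to an exact version, a resolved URL, and an integrity hash, so a tarball republished with different bytes fails verification at install time. A minimal illustrative fragment (the hash value is a placeholder, not duplexer3's real digest):

```json
{
  "name": "example-app",
  "version": "1.0.0",
  "lockfileVersion": 1,
  "dependencies": {
    "duplexer3": {
      "version": "0.1.4",
      "resolved": "https://registry.npmjs.org/duplexer3/-/duplexer3-0.1.4.tgz",
      "integrity": "sha512-<base64-digest-recorded-at-first-install>"
    }
  }
}
```

Of course, the lockfile only protects installs that already have one committed; a fresh `npm install` of a hijacked name still resolves whatever the registry currently serves.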
joezydeco · 8 years ago
How about the idea that Node has been a hack from day one?
gkya · 8 years ago
Node was a very interesting thing back when it started. It was a hack, but a nice kind of hack. You could write some efficient servers with it. But then the community formed around it, and with it the project went berserk.
fiatjaf · 8 years ago
What hasn't?

borntyping · 8 years ago
The npm repository is the largest package repository in the world. A lot of the major incidents they've had could have happened to other ecosystems (e.g. PyPI allows a user to delete packages that other packages depend on), but they've either not happened or haven't had as large an impact. When npm breaks, everyone notices, because everyone either uses npm or knows someone who does.
megaman22 · 8 years ago
Largely because JavaScript is so broken by default that you're almost required to depend on a whole slew of packages for functionality other languages provide in their built-in standard libraries. Furthermore, npm dependencies are broken down into stupidly small units and versioned rapidly, and npm enforces very little consistency among transitive dependencies.

Other languages and package management systems don't encourage this kind of insanity.

taurath · 8 years ago
This in particular is a huge trust failure - working with mutable/replaceable libraries is like working with mutable/replaceable APIs.
allover · 8 years ago
Well, they aren't mutable/replaceable, at least not since the left-pad incident, after which npm announced new rules to prevent package unpublishing. It seems this was an operational bug at npm inc.
mamcx · 8 years ago
I wonder how much damage needs to be done with JS/Node before the madness is seriously put to rest. It's absolutely necessary to break backward compatibility and rebuild JS from scratch. With WebAssembly this is doable (no excuses!), and we already have a nice tag to declare which script language to use.

This is not possible, you ask?

In fact, JS/CSS is the most viable of all the stacks to move forward. Let's use the "advantage" that any JS library/ecosystem dies fast and put out enough hipster propaganda declaring the ultimate solution.

Too hard? JS is so bad that fixing it is easy. You only need more than the week it originally took to build it.

lioeters · 8 years ago
As a counterpoint, couldn't any sufficiently complex structure be called a hack and a house of cards, when you really dig down into how it's put together? Mm, maybe not any - as some complex systems are well-tested with solid architecture - but just some, or most..
ken · 8 years ago
"Have you ever noticed that anybody driving slower than you is an idiot, and anyone going faster than you is a maniac?" --George Carlin

I think the software version of this is: any system with more structure than your program is an over-engineered monstrosity, and any system with less structure than your program is a flakey hack.

notatoad · 8 years ago
A "house of cards" implies that you don't have to dig to topple it. If you have to really dig down into how it's put together in order to start pulling it apart it isn't really a house of cards.

I don't use npm or node for anything serious, and I don't really have any knowledge of how npm works, but this isn't the first time I've read this story of a whole bunch of packages disappearing and everybody's builds breaking. If everything is a house of cards, then why don't I hear the same stories about PyPI or gems or crates?

k__ · 8 years ago
Almost all problems I have with JS projects have to do with npm. It got better with lockfiles, but it seems they're inventing new problems...
nanodano · 8 years ago
Left-bad, I mean, the left-pad fiasco should have been the wake up call.
setr · 8 years ago
How do you not feel embarrassed using such low quality insults..?
megaman22 · 8 years ago
Because it is... It's Molochian complexity heaped on top of layers of excrement and duct tape, and we have collectively entered a state of mass Stockholm syndrome about the situation.

I really would love to ditch web dev and all its myriad tendrils, and go back to native desktop software.

Pica_soO · 8 years ago
Somehow I imagine a native C desktop dev and a web developer meeting in no-man's land, each party escaping from its own nightmare with that line on their lips, starting with "Don't run in this direction-"
JepZ · 8 years ago
Btw. for those who don't know:

Yarn (which is an alternative to npm) uses a global cache [1] on your machine, which speeds things up but probably also protects you from immediate problems in cases like the one currently in progress (because you would probably have a local copy of e.g. require-from-string available).

[1] https://yarnpkg.com/lang/en/docs/cli/cache/

ris · 8 years ago
Already counting down the days before yarn is considered old and broken and people are recommending switching to the next hot package manager/bundler...
scrollaway · 8 years ago
yarn is one of those things coming out of the JS world that is actually really well made. yarn, typescript, react; say what you want about js fatigue, these are rock-solid, well-tested projects that are really good at what they do.

A major reason for the high tool churn in that ecosystem is how many of those tools are not designed from the ground up, don't quite solve the things they ought to, or solve them in really weird ways (partly due to the low barrier of entry). But that doesn't mean all of it deserves that label.

hateduser2 · 8 years ago
It baffles me that technologists commonly complain about new technology. As far as I can tell, your complaint boils down to "people should stop making and switching to new things". I find it hard to understand why someone with this attitude would be a technologist of any kind, and I find the attitude really obnoxious.
dmitriid · 8 years ago
Yarn was the only thing that made npm get off their collective asses and do something about their dog-slow issue-ridden CLI and services.
SquareWheel · 8 years ago
And yet yarn's changes directly led to npm making significant improvements of their own...

Do you also insist that Chrome and Firefox shouldn't exist because IE does the job adequately?

k__ · 8 years ago
Already counting down the days before yarn is considered the new de facto standard package manager...
the_duke · 8 years ago
It's useless in cases like this though, where the package is already invalidated in the yarn cache, which is the case right now for many packages.
rmrfrmrf · 8 years ago
You should be using the --frozen-lockfile flag in any production build system.
matte_black · 8 years ago
That’s it. I’m using yarn.
izacus · 8 years ago
Hmm, in the Java world we pretty much always used a local (company-owned) Maven proxy server, which grabbed packages from public repos and cached them locally to make sure builds still worked if public servers were down or slow... or packages disappeared.

This isn't a standard practice in the JS world?
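On the npm side, pointing a build at such a caching proxy is a one-line client config. A hedged sketch (the host name is hypothetical; the `registry` key itself is standard npm configuration):

```ini
; .npmrc - route all installs through an internal caching proxy
; (e.g. a Nexus or Artifactory npm-proxy repository)
registry=https://nexus.internal.example.com/repository/npm-proxy/
```

The proxy keeps serving its cached tarballs even if the public registry is down or a package vanishes upstream.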

tomjakubowski · 8 years ago
I’ve worked at places where the Java devs used Maven Central directly. I’ve also worked at a place where the Node devs use an on-premises copy of dependencies for builds and deploys.

It might not be as standard a practice in the Java world as you think.

brown9-2 · 8 years ago
Where did those Java devs who pulled from Maven central directly publish their artifacts?
bpicolo · 8 years ago
It is at big orgs, but not small shops.
usernam33 · 8 years ago
We do it even at home. Not full dep-server but local backup.
rmrfrmrf · 8 years ago
Yes, if your company has Artifactory or similar. I think most of these outraged folks are just memeing.
ake1 · 8 years ago
I use yarn-offline-mirror and save all tarballs in the repository; it works fairly well.
mjal · 8 years ago
We use Nexus at work and have linked up NPM to it along with several Maven repos. I don't know why anyone wouldn't do this if they were a business.
james-mcelwain · 8 years ago
Yes, of course. A centralized repository for a package manager is always going to be a single point of failure.
8n4vidtmkvmk · 8 years ago
Not "standard", but there's definitely a couple solutions for this. We tried one at my work but it seemed a bit flakey.
javajosh · 8 years ago
Yes, for those who use `yarn`. (Yarn's package caching looks a lot like Maven's.)
fcarraldo · 8 years ago
yarn does local caching in developer laptops. What GP is referring to is having an on-prem private dependency server which acts as a cache and proxy to the centralized public dependency repo.
krzyk · 8 years ago
So they didn't learn anything from the left-pad situation 1.5 years ago?

Packages that are published should be immutable, just like in the Maven repo case.

christophilus · 8 years ago
They don't allow removal of packages. This is likely a cascading storage failure or something along those lines (or else a major hack).
dmitriid · 8 years ago
They swore this would never happen again.

Then it happened again, not two months after left-pad. And now it has happened again.

lotyrin · 8 years ago
I don't really understand any public package repository that fails to have immutable package versions and publisher-namespaced package names.
schmookeeg · 8 years ago
I'm looking forward to envying the blokes who release blockchain-repo and make a gozillion bucks for solving this problem. :)
tbrock · 8 years ago
That’s fine but it doesn’t prevent newer packages from being published by new owners as is happening here.
briandear · 8 years ago
Isn't that how RubyGems works?