That's an issue with the packages themselves though, not with package management as a whole. You and the comment above you are talking about different things. While there's plenty of pain to be had with npm, if you have a project that used to work years ago, you can generally just clone, install and be done, even if on older versions. On Python this used to mean a lot of hurt, often even if it was a fresh project that you just wanted to share with a colleague.
Node/NPM was a poster child of an ecosystem where projects break three times a week, due to having too many transitive dependencies that are being updated too often.
This argument makes no sense. Your dependencies don't change unless you change them; npm doesn't magically update things underneath you. Things can break when you try to update one thing or another, yes, but if you just take an old project and try to run it, it will work.
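To illustrate (a trimmed sketch, not a real project; the package name is just an example and the hash is elided): the lockfile pins every transitive dependency to an exact version, resolved URL, and integrity hash, so nothing moves until you deliberately touch it:

    {
      "lockfileVersion": 3,
      "packages": {
        "node_modules/left-pad": {
          "version": "1.3.0",
          "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
          "integrity": "sha512-<elided>"
        }
      }
    }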
Assuming the downloads still exist? Does NPM cache all versions it ever distributed?
That's always been one of the major things I've seen break old builds: old binaries stop being hosted, forcing you to rebuild them from old source, which no longer builds under current toolchains. That leaves you either downgrading the toolchain, which itself may be tricky to set up, or upgrading the library, which starts a cascade of dependency upgrades.
It's not like Node projects are distributed with their deps vendored; there's too much stuff in node_modules.
> Does NPM cache all versions it ever distributed?
Yes it does, that's the whole point. You can still go and install the first version of express ever put on npm from 12 years ago, or any of the 282 releases that have been published since. A registry wouldn't be useful if things just disappeared at some random point in time.
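For example (the exact version you pick doesn't matter, this is just a sketch), you can list everything the registry still serves for a package and install a years-old release directly:

    npm view express versions    # prints every express version still on the registry
    npm install express@2        # resolves to the newest 2.x release, now over a decade old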
The only packages that get removed are malware and such, and packages which the vendor themselves manually unpublish [0]. The latter has a bunch of rules to ensure packages that are actually used don't get removed; see the link below.
No, I was using this as an argument for why I don't expect Node projects older than a year or two to be buildable without significant hassle.
(Also note that outside the web/mobile space, projects that weren't updated in a year are still young, not old. "Old" is more like 5+ years.)
The two things are related. If your typical project has a dependency DAG of 1000+ packages, a bug or CVE fix somewhere will typically cause a cascade of potentially breaking updates that plays out over multiple days before everything stabilizes. This creates pressure for everyone to always stay on the bleeding edge, and with version churn like this, there are only so many old (in the calendar sense) package dists that people are willing to cache.
This used to be a common experience some years back. Like many others, I gave up on the ecosystem because of the extreme fragility of it. If it's not like that anymore, I'd love to be corrected.
I don't know if it is still as fragile as you remember, but if you just never update your package-lock then it is super stable, as your (transitive) dependencies never change.
The non-trivial exception being if some dependency downloads resources on the fly (e.g. a browser compat list) or calls system libraries (e.g. by running shell commands).
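Roughly, as I understand npm's behaviour (not an authoritative summary), the distinction is:

    npm ci         # installs exactly what package-lock.json records;
                   # fails if it disagrees with package.json, never rewrites it
    npm install    # honours the lockfile too, but may re-resolve and rewrite it
                   # after you edit package.json or run npm update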