Why are we still developing software like we did 40 years ago?<p>Why not import external dependencies into your project at the function/class level rather than at the package level: if you only use FooPackage.BarClass and FooPackage.bazMethod(), you should be able to make your project depend on just those two things, ideally handled completely transparently by your IDE.<p>Managing the full set of packages and their versions then becomes less relevant, because the IDE can check whether a new version has been released in your package manager of choice, check whether those two things have changed, and if they have, ask the developer to look at the code changes and refactor if necessary; otherwise no action is required on their part. Furthermore, if you ever need to migrate to a different package, or rewrite the functionality yourself, you could just look at the few signatures that you actually use, rather than removing the package entirely and seeing what breaks in your IDE. Why can't we have itemized lists of everything that's called and everything that we depend on, as opposed to the abstraction of entire packages?<p>Better yet, why depend on binary blobs or hidden code (ignored by your IDE) in the first place? Why not download the source for every package that you use, and on each update review the code changes much like a regular Git diff?<p>Of course, this would probably require getting rid of reflection and other forms of dynamic code, which I fully support, since those have never been good for much in the first place and only destroy any hopes of determinism and full control flow analysis.<p>As for the possible counterargument that this would be hard to do: it wouldn't really be, with more granular packages. Instead of trying to shove an entire ecosystem inside a single package, why not split it into 10-20 bits of reusable code instead?
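A minimal sketch of the "itemized list of everything we depend on" idea, in Python for brevity (the complaints above are about Java and JS, but the idea is language-agnostic). It walks a file's AST and records exactly which names are pulled in from each package; the `foopackage` names below are hypothetical, standing in for FooPackage above. Whole-package imports show up with an empty itemization, i.e. as the opaque dependencies this post argues against:

```python
import ast
from collections import defaultdict

def itemize_imports(source: str) -> dict[str, set[str]]:
    """Map each imported package to the exact names used from it."""
    used: dict[str, set[str]] = defaultdict(set)
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ImportFrom) and node.module:
            pkg = node.module.split(".")[0]
            for alias in node.names:
                used[pkg].add(alias.name)
        elif isinstance(node, ast.Import):
            for alias in node.names:
                # Whole-package import: nothing itemized, dependency stays opaque.
                used[alias.name.split(".")[0]]
    return dict(used)

src = """
from foopackage import BarClass, baz_method
import os
"""
print(itemize_imports(src))
# {'foopackage': {'BarClass', 'baz_method'}, 'os': set()}
```

A real tool would of course also have to follow attribute accesses and transitive uses, which is exactly where reflection and dynamic code make the analysis impossible.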
Smaller packages would be easier to review and manage.<p>Context: I dislike how Spring and Spring Boot in Java force a huge ecosystem of fragile dependencies upon you, with reflection that ensures your apps will break at runtime, for example when moving from Spring Boot 1.5 to 2.0. Likewise, in the JS world, node_modules folders are probably 10-1000x larger than what's actually necessary to display a webpage with some interactive behaviour.<p>Disclaimer: I have no delusions about any of the above being technically feasible right now. Perhaps "Why not?" would be a good question to ask. In my eyes, the industry is too set in its current ways, and as long as we don't have languages and entire ecosystems that approach established practices from a wildly different angle, no one can actually contest the decades of already-built packages and tools.<p>On the bright side, WireGuard has somewhat displaced OpenVPN, and there are numerous benefits to it being smaller: a good example of getting rid of bloated software. Nix may or may not do the same to traditional approaches to package management, but its usage still remains really low.<p>In my eyes, the only long-term solution is to be able to tell exactly what's necessary for your code to build and work, and to test every dependency update that comes out against that automated process. Static dependencies, but updated at the speed of dynamic ones. Then you'd just wait a day until the devs fix the regressions of a new dependency release before getting a new statically linked app, instead of using dynamic linking and finding out in prod that some dependency broke your app.
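One way to approximate "test every dependency update against what you actually use" today, again sketched in Python: record the signatures of the symbols you depend on at review time, and on each update flag only the symbols whose signature no longer matches. The stale `"(s)"` entry below is deliberate, simulating an update that changed a signature. This only catches interface changes, not behavioural ones, so it complements rather than replaces the diff review described above:

```python
import inspect
import json

def changed_symbols(module, recorded: dict[str, str]) -> list[str]:
    """Compare signatures recorded at the last reviewed version against the
    currently installed version; only changed symbols need developer attention."""
    changed = []
    for name, old_sig in recorded.items():
        current = str(inspect.signature(getattr(module, name)))
        if current != old_sig:
            changed.append(name)
    return changed

# Pretend these were recorded at review time; json stands in for a dependency.
recorded = {
    "dumps": str(inspect.signature(json.dumps)),  # unchanged: no action needed
    "loads": "(s)",                               # deliberately stale: flagged
}
print(changed_symbols(json, recorded))  # ['loads']
```

An update whose release leaves all recorded signatures intact would require no action at all, which is the whole point: the package's version number stops mattering, and only the handful of signatures you use do.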