I think I have decided that I'm done with Makefiles. They are very tempting, because they follow naturally from interactive exploration. You see yourself writing a command a lot, and think "I'll just paste that into a Makefile". Now you don't have to remember the command anymore.<p>But the problem is that building software is a lot more than just running a bunch of commands. The commands represent solutions to problems, and if the solutions aren't good enough, you just make more problems for yourself.<p>The biggest problems I've had with Makefile-based builds are getting the right versions of dependencies to everyone using the repository, and incrementality. A project I did at work involved protos, and it was great when I was the only person working on it. I had a Makefile that generated them for Go and TypeScript (gRPC-Web), and usually an incremental edit to a proto file and a re-run of make resulted in an incremental update to the generated code. Perfect. Then other people started working on the project, and sometimes a simple proto change would regenerate all of the generated code. Sometimes the protos would compile, but not actually work. The problem was the hidden dependency on the versions of the proto compiler, the protoc-gen-(go|ts) plugins, and the language-specific proto API that together control the output of the proto compilation process. Make has no real way to say "when I say protoc, I mean protoc from this .tar.gz file with SHA256:abc123def456..." You just kind of yolo it. Yolo-ing it works fine for one person; even if your dev machine gets destroyed, you'll probably get it working again in a day or two. As soon as you have four people working on it, every hidden dependency destroys a day of productivity. I just don't think it's a good idea.<p>Meanwhile, you can see how well automated dependency management systems work. Things like npm and Go modules pretty much always deliver the right versions of dependencies to you.
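Concretely, a go.mod gives you exactly the "this name means this exact version" ability that Make lacks — a sketch, with a placeholder module path and illustrative version numbers:

```go
// go.mod: the project definition. Module path and versions here
// are illustrative placeholders, not a recommendation.
module example.com/myproject

go 1.21

require (
    google.golang.org/grpc v1.60.0
    google.golang.org/protobuf v1.32.0
)
```

Alongside it, go.sum records a cryptographic hash for every module version, so everyone who clones the repository builds against byte-identical dependencies.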
With Go, the toolchain even updates the project definition for you, so you don't have to manage the files at all. It just works. This is what we should be aiming for everywhere.<p>I have also not had much luck with incremental builds in make. Some projects have a really good set of Makefiles, where an edit usually results in a correctly rebuilt binary. Some don't! You make a change, try out your binary, and see that it decided to cache something that isn't cacheable. How do you debug it? Blow away the cache and wait 20 minutes for a full build. Correctness or speed, choose any 1. I had this problem all the time when I worked on a buildroot project, probably because I never understood what the build system was doing. "Oh yeah, just clean out those dep files." What even are the dep files? I never understood how to make it work for me, even after asking questions and getting little pieces of wisdom that seemed a lot like cargo-culting or religion. Nobody could ever point to "here's the function that computes the dependency graph" and "here's the function that schedules commands to use all your CPUs". The reason is that there is no such function: the logic lives in many different modules that don't know about each other. (Some in make itself, some in Makefiles, some in the jobs make runs... it's a mess.)<p>Meanwhile, I've also worked on projects that use a full build system that tracks every dependency required to build every target. You start it up, and it uses 300M of RAM to build a full graph. When it's done, it maxes out all your CPUs until you have a binary. You change one file, and 100% of the time, it rebuilds only what depended on that file. You run it in your CI environment, and it builds and the tests pass, the first time.<p>I am really tired of not having that. I started using Bazel for all my personal projects that involve protocol buffers or have files in more than one language.
The setup is intense, and watching your CPU stress the neighborhood power grid as it builds the proto compiler from scratch is surprising. But once it starts working, it keeps working. There are no magic incantations. The SHA256 of everything you depend on is versioned in the repository. It works with traditional Go tools like goimports and gopls. Someone can join your project and contribute code by installing only one piece of software and cloning your repository. It's the way of the future. Makefiles got us far, but I'm done. I am tired of debugging builds. I am tired of helping people install software. "bazel build ..." and get your work done.
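For the curious, "the SHA256 versioned in the repository" looks roughly like this in a Bazel WORKSPACE file (Starlark, Python-like syntax; the version and hash below are placeholders):

```python
# WORKSPACE sketch: every external dependency is fetched from a pinned
# URL and verified against a checked-in SHA256 before it is used.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "com_google_protobuf",
    urls = ["https://github.com/protocolbuffers/protobuf/archive/refs/tags/v21.12.zip"],
    strip_prefix = "protobuf-21.12",
    # Placeholder digest: Bazel refuses to build if the download doesn't match.
    sha256 = "abc123...",
)
```

If anyone's download differs by a single byte — a new protoc, a tampered mirror — the build fails loudly instead of silently generating different code.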