Recently, I did a round of optimizations on an old side-project. Two kinds of changes ended up making the bulk of the difference:

1) Rewriting queries that were pulling data out of the DB that wasn't all used

2) Adding in-memory caching for repeated queries

A possible third, which I didn't need but could see being beneficial:

3) Querying data in bulk where possible (e.g. n users each make the same query for their version of some data, which could instead be one bulk query, held in memory and parceled out to each user)
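To make the three concrete, here is a minimal Python sketch (sqlite3 only so it's self-contained; the `users` table, its columns, and the function names are all made up for illustration, not my actual app):

```python
import sqlite3

# Hypothetical schema: a users table with a large profile_blob column that
# most requests never actually need.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, profile_blob BLOB)")
conn.executemany("INSERT INTO users (id, name) VALUES (?, ?)",
                 [(1, "ada"), (2, "grace"), (3, "linus")])

# 1) Select only the columns the request actually uses, instead of SELECT *
#    dragging profile_blob along every time.
# 3) Fetch rows for many users in one IN (...) query and parcel them out,
#    rather than issuing one query per user.
def get_user_names(user_ids):
    placeholders = ",".join("?" * len(user_ids))
    rows = conn.execute(
        f"SELECT id, name FROM users WHERE id IN ({placeholders})", user_ids
    ).fetchall()
    return {user_id: name for user_id, name in rows}

# 2) A tiny in-memory cache for repeated reads, invalidated explicitly on
#    write so stale data isn't served.
_name_cache = {}

def get_user_name(user_id):
    if user_id not in _name_cache:
        row = conn.execute("SELECT name FROM users WHERE id = ?", (user_id,)).fetchone()
        _name_cache[user_id] = row[0] if row else None
    return _name_cache[user_id]

def rename_user(user_id, new_name):
    conn.execute("UPDATE users SET name = ? WHERE id = ?", (new_name, user_id))
    _name_cache.pop(user_id, None)  # invalidate on write

print(get_user_names([1, 2, 3]))  # one bulk query for all three users
print(get_user_name(2))           # hits the DB once...
print(get_user_name(2))           # ...then the cache
```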
In principle, for this simple app, all three of these optimizations could have been made by a tool that looks at user code, request runtime, and a list of all queries being made and the time spent on each one. My process was to make one of the optimizations, rerun some representative workflows, and verify that everything still worked and that runtimes / queries made changed in the way I expected (a rough sketch of that check is at the end of this comment).

This process felt formulaic but time-consuming (mostly in making sure things like cache invalidation were done correctly). There seems to be plenty of tooling out there that can point you at slow queries. Are there any linters / tools that go far enough to drive most of the optimization loop themselves?
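For reference, the "rerun and check query count / runtime" step I kept repeating by hand is roughly this much code (again sqlite3-specific, and `run_workflow` is a stand-in for whatever representative flow you replay):

```python
import time

# Count statements via sqlite3's per-statement trace callback and time the
# whole workflow, so before/after numbers can be compared.
def profile_workflow(conn, run_workflow):
    statements = []
    conn.set_trace_callback(statements.append)  # called once per SQL statement
    start = time.perf_counter()
    run_workflow()
    elapsed = time.perf_counter() - start
    conn.set_trace_callback(None)
    return {"queries": len(statements), "seconds": elapsed, "statements": statements}

# e.g. before and after an optimization:
# print(profile_workflow(conn, lambda: get_user_names([1, 2, 3])))
```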