My AWS tracker has caught a ton of activity - they really do ship new features in that tool every day of the week: <a href="https://github.com/simonw/help-scraper/commits/main/aws" rel="nofollow">https://github.com/simonw/help-scraper/commits/main/aws</a><p>My scraper against the GitHub GraphQL schema catches some interesting details too: <a href="https://github.com/simonw/help-scraper/commits/main/github" rel="nofollow">https://github.com/simonw/help-scraper/commits/main/github</a>
Interesting idea, it’s like running an integration test for someone else’s tool.<p>I did something a bit like this for a few internal tools I built at Airbnb. I built a docs system that indexed all the markdown files in every repo or project’s .infra/docs folder into a central documentation browser website.<p>Then in each tool project, we had a CI job that would loop over all the commands and spit the `--help` output in Markdown format into that folder. That way when someone added commands or changed a command’s args or help text, the searchable docs were automatically updated.<p>We didn’t use this to generate a change log to show to users per se, but it did mean that you could run `git log .infra/docs` to see a log of every change to the tool’s interface.
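The CI job described above could be sketched roughly like this (a minimal sketch, not the actual Airbnb setup; the tool invocation, subcommand list, and output folder are assumptions):

```python
import pathlib
import subprocess

def dump_help_docs(tool_argv, subcommands, outdir):
    """Run `<tool> <sub> --help` for each subcommand and write the
    captured output into one Markdown file per command."""
    out = pathlib.Path(outdir)
    out.mkdir(parents=True, exist_ok=True)
    for sub in subcommands:
        # check=True fails the CI job if any command's --help errors out
        result = subprocess.run(
            [*tool_argv, sub, "--help"],
            capture_output=True, text=True, check=True,
        )
        # Indent the help text so Markdown renders it as a code block
        body = "\n".join("    " + line for line in result.stdout.splitlines())
        page = out / f"{sub.replace('.', '_')}.md"
        page.write_text(f"# {' '.join(tool_argv)} {sub}\n\n{body}\n")

# Hypothetical usage from CI, writing into the indexed docs folder:
# dump_help_docs(["mytool"], ["deploy", "rollback"], ".infra/docs/cli")
```

Committing the generated files (rather than building them on the fly) is what makes the `git log .infra/docs` trick work.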
that's cool, but wouldn't the best way to solve this be to ship the changelog with the CLI itself? Something like `cli --changelog`?
I mean, I don't understand why it's not a general practice among CLI authors.
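For what it's worth, a `--changelog` flag like the one suggested above is cheap to add with argparse (a hypothetical sketch; `mycli` and the embedded changelog entries are made up for illustration):

```python
import argparse

# Hypothetical changelog, embedded as a string so it ships inside the CLI itself
CHANGELOG = """\
## 1.1.0
- deploy gained a --dry-run flag
## 1.0.0
- initial release
"""

def build_parser():
    parser = argparse.ArgumentParser(prog="mycli")
    parser.add_argument(
        "--changelog", action="store_true",
        help="print the changelog and exit",
    )
    return parser

def main(argv=None):
    args = build_parser().parse_args(argv)
    if args.changelog:
        print(CHANGELOG)
        return 0
    # ... normal CLI behavior would go here ...
    return 0
```

Embedding the text in the binary means the changelog version always matches the installed version, with no network call needed.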