There are no benchmark numbers to back the claim that it's fast. At the very least the OP should have done some measurements; I'm interested in anything that will speed up our DB's 300+M record tables.

The article also doesn't link to the official maintainer page/GitHub repo.
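If you want rough numbers for your own tables, a quick wall-clock comparison is cheap to run; this is only a sketch with placeholder database names and paths, and the mydumper flags are the usual -B/-t/-o ones, so check --help on your build:

    # rough wall-clock comparison, placeholder names/paths
    time mysqldump --single-transaction mydb > /backups/mydb.sql
    time mydumper -B mydb -t 8 -o /backups/mydumper/mydb/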
1. Shows password on CLI

2. Segfaults for me

I'll be using mysqldump for now... but doing a flush+lock+copy of the InnoDB data file(s) seems like it would be much, much faster.
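For what it's worth, the password-on-the-CLI problem can be sidestepped with a login path, and a raw copy of the InnoDB files is really only safe cold (FLUSH TABLES WITH READ LOCK doesn't stop InnoDB's background writes) or via a hot-backup tool like Percona XtraBackup. Rough sketch, with placeholder paths and the default datadir assumed:

    # store credentials in ~/.mylogin.cnf instead of passing them on the CLI
    mysql_config_editor set --login-path=backup --user=backup_user --password

    # cold copy of the data directory -- the only really safe "copy the files"
    # approach without a hot-backup tool; paths/service name are assumptions
    systemctl stop mysql
    cp -a /var/lib/mysql /backups/mysql-raw-$(date +%F)
    systemctl start mysql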
Watch out: last time I looked, most of these tools miss some things like stored procedures, though I haven't checked mydumper specifically recently. mysqlpdump certainly did, and I think I tried mydumper.

There are also other things to potentially worry about: events, functions, etc.
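With plain mysqldump at least, those objects have to be requested explicitly; something along these lines (database name and paths are placeholders, credentials assumed to be in ~/.my.cnf or a login path):

    # --routines covers stored procedures and functions; --events and
    # --triggers pick up the rest (triggers are on by default in newer versions)
    mysqldump --single-transaction --routines --events --triggers \
              mydb > /backups/mydb.sql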
One of my colleagues from my time at Percona worked fairly extensively on mydumper and myloader, making the locking semantics better and the segmented files work. It's a pretty amazing tool, since it lets you dump tables in parallel with very granular locks (something mysqldump does not allow).

As a logical backup tool, it's invaluable. When combined with binary logs and regular binary backups, it makes for a nice and comprehensive MySQL backup solution. It even works with managed DB services like RDS.

YMMV of course, but given how easy it is to test for your own use cases, it's absolutely worth taking a look at.
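A minimal dump-and-restore round trip looks roughly like this; flag names are the short ones I remember (-B database, -t threads, -c compress, -o/-d directory), so treat it as a sketch and check --help on your build:

    # parallel, compressed, per-table dump with 8 threads
    mydumper -B mydb -t 8 -c -o /backups/mydumper/mydb/

    # parallel restore of that directory
    myloader -d /backups/mydumper/mydb/ -t 8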
Note that with 5.7 there is a new tool, "mysqlpump", that can do parallel dumps with direct compression.

I haven't tested it yet because mysqldump is enough for our needs.
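For reference, the 5.7 invocation is along these lines (option names as I recall them from the 5.7 docs; the compressed output needs lz4_decompress to read back):

    # parallel dump, 4 threads, LZ4-compressed output
    mysqlpump --default-parallelism=4 --compress-output=LZ4 \
              --all-databases > all.sql.lz4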
What tool would you recommend for easily taking frequent backups and pushing them to, say, S3 Glacier storage? I don't want to spend hours configuring something, as it's only for a small personal server.
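Not a dedicated tool, but for a small personal box a cron'd mysqldump piped to gzip and copied to S3 is usually enough; a sketch with a placeholder bucket name, where Glacier is handled by a lifecycle rule on the bucket rather than anything in the script:

    # nightly from cron; bucket name and paths are placeholders,
    # credentials assumed to be in ~/.my.cnf or a login path
    DAY=$(date +%F)
    mysqldump --single-transaction --all-databases | gzip > /backups/all-$DAY.sql.gz
    aws s3 cp /backups/all-$DAY.sql.gz s3://my-backup-bucket/mysql/
    # a lifecycle rule on the bucket then transitions old objects to Glacier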
Nice for my Linux (and possibly BSD) customers, but some of my customers run MySQL on Windows servers; it would be nice to give them some love every now and then :-)