This article starts to get at an interesting controversy about the definition of value. Is the value of a project equal to the amount of labor time it saves over previously used methods? Or is it equal to the amount of labor time it took to create the project? I think most software devs would intuitively pick the former, which is what the article sides with and also what the market awards to innovations in the short term, but I think there is merit to the latter and people should at least consider it. The article presents an apparent contradiction (the author's side project was coded in a few hours and has arguably had more of a positive impact than their entire day job career's output), but that contradiction is resolved by the latter definition of value, the labor theory of value.<p>The labor theory of value explanation is straightforward. In general, LTV asserts that the value output of some work approximates the amount of work that ordinarily has to go into it - more precisely, the socially necessary labor time (i.e., the time it would take an average worker to do it without slacking off). This is because, if you wanted that work done and didn't care how it got done, the socially necessary labor time would be the real cost of doing it yourself, or of paying someone else enough to support themselves while they do it (before various market dynamics and other distorting factors enter - it is an idealized model). For example, on an assembly line, the cost of a part is the cost of the raw materials plus the cost of the labor added to them. This is straightforward for assembly line work, but less intuitive when the actual work is about making other people's work more efficient, which is the category a lot of software dev falls into.
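To make the contrast between the two definitions concrete, here is a toy sketch (with entirely made-up numbers - the user counts and hours are hypothetical, not taken from the article) of how they can come apart for a small, widely used tool:

```python
# Toy illustration of two competing definitions of a project's value.
# All numbers below are hypothetical, chosen only to make the contrast visible.

def time_saved_value(users: int, hours_saved_per_user: float) -> float:
    """Utility view: value = total labor time the tool saves its users."""
    return users * hours_saved_per_user

def ltv_value(hours_to_reimplement: float) -> float:
    """LTV view: value = socially necessary labor time to (re)produce the tool."""
    return hours_to_reimplement

# A small side project: widely used, but cheap for any generic dev to rebuild.
print(time_saved_value(users=10_000, hours_saved_per_user=2))  # 20000
print(ltv_value(hours_to_reimplement=40))                      # 40
```

The gap between the two outputs is the whole disagreement: the first number can grow without bound as adoption grows, while the second stays pinned to the (small) reproduction cost.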
But if someone simply said to themselves, "I need the functionality of mammoth.js," the core idea still applies: you could replace the original author, hire a generic software dev, and get comparable (or at least good-enough) work for a similarly small amount of labor. Another way to think of it is that mammoth.js might save a lot of people a lot of time, but getting some version of mammoth.js implemented was probably historically inevitable and has a fixed, much smaller cost to actually do.<p>How does this resolve the contradiction in the article? Well, mwilliamson's day job career labor output may well have saved less of other people's time than mammoth.js has. But that day job output could probably only have been replaced by a similarly large amount of time and effort from other developers. Meanwhile, mammoth.js could be reimplemented in a similarly small amount of time by someone else, maybe over a couple of tries to get it right - and if it hadn't been written when it was, perhaps that is exactly what would have happened.<p>None of this is to discount the ingenuity and insight that went into the side project, or the usefulness of something like mammoth.js being in the right place at the right time. But I think it is a more precise way to think about how much value, and what kinds, are added to the world by larger or smaller amounts of effort. In other words, devs shouldn't feel bad about having worked hard on stuff that is less neatly labor-saving than a small widget, as long as that hard work turned out to be useful.