Interesting question. It's a cost most engineers consider when evaluating dependencies, but I doubt many actually track the time lost keeping them up to date.

"Weekly time" is probably not the best metric, because the cost is spiky rather than steady, driven by release schedules and vulnerability disclosures. Personally, I try to keep dependencies minimal and limited to established projects known to follow semantic versioning. That eliminates most *surprises*, but there is still the issue of upgrading when the time comes.

I try to stay within one major version of the latest release, usually only upgrading when it's absolutely necessary. For example, maybe I need a feature in the new version, or maybe I'm about to write a bunch of code that touches the dependency and now is a good time to upgrade. In reality, what that means is a slowly building list of chores in the icebox: "upgrade X to 3.0, upgrade Y to 2.0, etc."
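That "stay within one major version" policy is easy to express as a version constraint. A minimal sketch in Python using the packaging library (the same PEP 440 machinery pip uses); the versions here are purely illustrative:

    from packaging.specifiers import SpecifierSet
    from packaging.version import Version

    # "~=2.4" is the compatible-release operator: allow 2.4 and any later 2.x,
    # but never a new major version.
    constraint = SpecifierSet("~=2.4")

    print(Version("2.9") in constraint)  # True:  minor/patch upgrades flow through
    print(Version("3.0") in constraint)  # False: a major bump stays an icebox chore

The same idea is spelled "^2.4" in npm and Cargo manifests; routine updates land automatically, while major-version upgrades stay a deliberate decision.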