Say there's some worldwide computer glitch and the clocks have to be reset. Is it possible that we lose track of exactly how many seconds have passed since the epoch?
Actual answer: VLBI radio telescopes measure the Earth's rotational position against the International Celestial Reference Frame to about 10 microarcseconds, which is good to within two-thirds of a microsecond of sidereal time-of-day. The positions of the distant quasars that define the frame are very stable. <a href="https://en.wikipedia.org/wiki/International_Celestial_Reference_System_and_its_realizations" rel="nofollow">https://en.wikipedia.org/wiki/International_Celestial_Refere...</a>
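The arithmetic behind that "two-thirds of a microsecond" claim is just unit conversion: the Earth turns 1,296,000 arcseconds per sidereal day, so an angular error maps directly to a time error. A quick sketch (the sidereal day length is the standard value; the 10 µas figure is the one quoted above):

```python
# Length of one sidereal day in SI seconds (standard value).
SIDEREAL_DAY_S = 86164.0905

# Earth rotates 360 degrees = 1,296,000 arcseconds per sidereal day.
ARCSEC_PER_DAY = 360 * 3600

# Time-of-day equivalent of one arcsecond of Earth rotation (~66.5 ms).
time_per_arcsec = SIDEREAL_DAY_S / ARCSEC_PER_DAY

# The quoted VLBI precision of 10 microarcseconds, in seconds of time.
vlbi_time_error = 10e-6 * time_per_arcsec
print(f"{vlbi_time_error * 1e6:.3f} microseconds")  # ~0.665 µs
```

So 10 µas of angular precision is indeed about two-thirds of a microsecond of rotational time.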
Put the question another way: Suppose you want to determine the moment of midnight to within 0.5 s without using any clock more accurate than counting the number of days. You are, however, allowed to use telescopes and reference materials and to perform arbitrary amounts of calculation. And you're allowed to wait for ideal observing conditions.<p>The general kind of instrument you want is the <a href="https://en.wikipedia.org/wiki/Transit_instrument" rel="nofollow">https://en.wikipedia.org/wiki/Transit_instrument</a> -- if you can measure the transit of a star to 1 arcsecond, that's better than 1 second/day. That article says there are transit instruments that work down to .01 arcsecond.<p>Unless you have to go a very long time just counting days (enough for half a leap second of drift to accumulate), it seems pretty clear you could get the correct UTC second.<p>You mention computer glitches; there also exist non-transistorized clocks that are accurate to much better than 1 s/day. In fact, the Wikipedia article about the Elgin Observatory says that using a transit instrument to discipline a mechanical clock was accurate to .01 s <a href="https://en.wikipedia.org/wiki/Elgin_National_Watch_Company_Observatory" rel="nofollow">https://en.wikipedia.org/wiki/Elgin_National_Watch_Company_O...</a> -- though I doubt such a system is in continuous use today.
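The conversion from transit-measurement precision to clock error can be sketched in a few lines. The sky turns 15 arcseconds of hour angle per second of time, and a star away from the celestial equator crosses the meridian more slowly by a factor of cos(declination) (the declination parameter here is just for illustration):

```python
import math

def transit_time_error(angle_arcsec: float, declination_deg: float) -> float:
    """Clock error (in seconds) from timing a star's meridian transit
    to a given angular precision.  The sky moves 15 arcsec of hour
    angle per second of time; off-equator stars move across the
    meridian more slowly by cos(declination)."""
    return angle_arcsec / (15.0 * math.cos(math.radians(declination_deg)))

# A 1-arcsecond measurement of an equatorial star:
print(transit_time_error(1.0, 0.0))   # ~0.067 s, well under 1 s/day
# The 0.01-arcsecond instruments the Wikipedia article mentions:
print(transit_time_error(0.01, 0.0))  # ~0.0007 s
```

That's why a 1-arcsecond transit measurement comfortably beats the 0.5 s target.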
You'd be able to get it back from GPS satellites, but barring that, from the same place synchronized time originally came from: astronomical observations, and records of astronomical observations.
Back down on earth, though, most computer time has drift anyway. Very few devices need super-precise time (GPS does, for example, because of relativity). Your home computer is subject to network time fluctuations of +/- some number of milliseconds; NTP over the public internet is typically only accurate to a few tens of milliseconds.<p>If you have two databases running on two different computers, there's no guarantee their timestamps are in sync relative to each other or relative to a particular atomic clock's epoch. Is it within a second? Probably, on a modern operating system. Same millisecond? Very unlikely.
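To see where those tens of milliseconds come from: NTP estimates a client's clock offset from four timestamps under the assumption that the network path is symmetric, and asymmetric routes turn directly into offset error. A minimal sketch of the standard calculation (the toy timestamps are made up for illustration):

```python
def ntp_offset_and_delay(t1, t2, t3, t4):
    """Standard NTP clock-offset and round-trip-delay calculation
    (RFC 5905): t1 = client send, t2 = server receive,
    t3 = server send, t4 = client receive, all in seconds.
    Assumes the outbound and return network delays are equal;
    any asymmetry shows up directly as offset error, which is
    why internet NTP is only good to tens of milliseconds."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Toy numbers: the client's clock is 30 ms behind the server's.
offset, delay = ntp_offset_and_delay(100.000, 100.050, 100.051, 100.041)
print(offset, delay)  # offset ~0.030 s, round-trip delay ~0.040 s
```

Two machines each synced this way can easily disagree by the sum of their individual offset errors, which is why "same millisecond" across two database hosts is a bad bet.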