I am a professional software developer and have occasionally needed to work with dates and times, including in applications that require a certain level of accuracy (e.g. processing pricing feeds from financial institutions).<p>Since 1970, there have been 27 leap seconds applied. But these don't show up anywhere in any computer system I have ever used (OS, languages, applications, third-party APIs). If I create a date object of 1970-01-01T00:00:00 and repeatedly add 86400 seconds, should I not end up with a date/time that is no longer midnight?<p>I assume that we are collectively just ignoring leap seconds and hoping for the best. Is this OK?
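To make what I mean concrete, here is roughly the experiment I have in mind (a minimal Python sketch; the 20,000-day loop count is just an arbitrary illustration):

```python
from datetime import datetime, timedelta, timezone

# Start at the Unix epoch and repeatedly add exactly 86400 seconds.
t = datetime(1970, 1, 1, tzinfo=timezone.utc)
for _ in range(20000):              # roughly 54 years of days
    t += timedelta(seconds=86400)

print(t)  # 2024-10-04 00:00:00+00:00 -- still exactly midnight, no leap-second drift
```

Every iteration lands exactly on midnight, which is what prompted the question.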
Computers usually report time in UTC but don't actually apply leap seconds. Most computers are pretty bad at keeping time and regularly sync to NTP servers for the current time. NTP servers generally have better timekeeping hardware (or sync with better sources). When a leap second occurs, many NTP servers smear that second out over a period of hours; different NTP providers use different smear schemes. During the smear, each second is slightly longer than usual (or slightly shorter, for a negative leap second). For general computing this doesn't matter: the smear is on a similar order of magnitude to normal clock drift. This isolates the leap second to only the computers that need to track it. UTC/UT1 is designed to be used by humans and is tied to the Earth's rotation and its orbit around the sun (UT1 is nowadays measured against distant quasars).<p>The big reason to isolate this is that leap seconds aren't deterministic. We can predict them to some degree, but ultimately they rely on measurement. Unlike leap days, you cannot safely code a system that accounts for future leap seconds: leap days follow a well-defined formula, leap seconds do not.<p>For time-sensitive applications like navigational computers, TAI is used. TAI currently runs 37 seconds ahead of UTC. TAI is tied to the SI second: while the second was originally derived from the solar day, it is now defined by the transition frequency of the caesium atom.<p>For a more detailed explanation, see <a href="http://mperdikeas.github.io/utc-vs-ut1-time.html" rel="nofollow">http://mperdikeas.github.io/utc-vs-ut1-time.html</a>.<p>For an overview of how Google handles the smear, see <a href="https://developers.google.com/time/smear" rel="nofollow">https://developers.google.com/time/smear</a>
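To make the TAI/UTC relationship concrete, here is a minimal sketch (Python; the hardcoded 37-second offset is a published table value that only holds from 2017 until the next leap second is announced, it is not something you can compute):

```python
from datetime import datetime, timedelta, timezone

# TAI - UTC has been 37 seconds since 2017-01-01. It comes from a published
# table (IERS bulletins), not a formula, because future leap seconds are
# decided by measurement rather than calculation.
TAI_MINUS_UTC = timedelta(seconds=37)

def utc_to_tai(utc_dt: datetime) -> datetime:
    """Rough conversion, valid only for dates after 2017-01-01."""
    return utc_dt + TAI_MINUS_UTC

now_utc = datetime.now(timezone.utc)
print(now_utc.isoformat(), "->", utc_to_tai(now_utc).isoformat())
```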
> Since 1970, there have been 27 leap seconds applied. But these don't show up anywhere in any computer system I have ever used (OS, languages, applications, third party APIs). If I create a date object of 1970-01-01T00:00:00 and repeatedly add 86400 seconds, should I not end up with a date/time that is no longer midnight?<p>The timestamp on your computer is in a timescale (called Unix time) which is defined as not including leap seconds, so no. The advantage of this system is that there is a simple algorithm for converting the integers to points on a calendar and back again, kind of as you mentioned (i.e. just keep adding 86400 to go forward a day, 365 × 86400 for a year, a bit more if it's a leap year, etc.). The downside is that a given timestamp integer can be ambiguous: around a positive leap second it covers two different UTC seconds (that has happened 27 times so far), and if a negative leap second is ever inserted, there will be a timestamp that refers to a UTC second that never happened at all.<p>If you wanted a timescale that counts every real second, you would have a different representation (something like TAI), and you would need to know about leap seconds in order to do the integer-to-calendar-and-back-again calculations.<p>As with all things in engineering, it is a trade-off, and the best choice depends very much on what you are trying to do.
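As a small illustration of how the "no leap seconds" definition plays out (a Python sketch; the 2016-12-31 leap second is a real one, the printed values are just what the arithmetic gives):

```python
import calendar
import time

# Unix time pretends every day has exactly 86400 seconds, so the
# integer <-> calendar conversion is pure arithmetic, no leap-second table.
last_second_2016 = calendar.timegm((2016, 12, 31, 23, 59, 59, 0, 0, 0))
print(last_second_2016)                   # 1483228799

# The very next integer is already New Year's Day. The real leap second
# 2016-12-31T23:59:60 UTC has no Unix timestamp of its own.
print(time.gmtime(last_second_2016 + 1))  # 2017-01-01 00:00:00
```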
POSIX requires that we ignore leap seconds. POSIX had no real alternative to this choice, because the information needed to do time right was not, and still is not, readily available via an authoritative and robust mechanism. No international recommendation has ever required the creation or funding of a mechanism better than "This one agency in Paris will use the post office to send out letters to your national government's time service agency at least 8 weeks in advance of a leap."
Most of us don't need to hope for the best, because leap seconds won't trigger bugs in our applications.<p>In my work I deal with dates and times, but a leap second would have no consequence. A negative leap second would be interesting, because database records could appear to have been created out of order (a possible problem for many apps), but my apps wouldn't care.<p>There are programmers who have to worry about leap seconds; I'm happy I'm not one of them.
A failure rate of roughly once every two years is tiny compared to the rate of failure introduced by other things (from human error on up), and for many time-related tasks being off by a second is irrelevant (or, again, tiny compared to all the other sources of noise in measuring time). So it seems reasonable to me to ignore leap seconds in the vast majority of projects.<p>That said, modern cloud environments do hide this problem for you with leap smearing [1], which seems like the ideal fix. It'd be nice to see the world move to smeared time by default, so that one day = 86,400 seconds stays consistent (as does 1 second = 10^9 nanoseconds), while the length of the smallest subdivisions of time your computer perceives varies, both intentionally and randomly.<p>[1] <a href="https://developers.google.com/time/smear" rel="nofollow">https://developers.google.com/time/smear</a>
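For a feel of what a smear looks like, here is a rough sketch of a 24-hour linear smear (Python; the window edges and the 2016 leap second are assumptions loosely based on the Google page linked above, not a production implementation):

```python
from datetime import datetime, timezone

# 24-hour linear smear around the 2016-12-31 leap second: smeared clocks
# run slightly slow from noon UTC before the leap to noon UTC after it,
# accumulating exactly one extra second over the window.
SMEAR_START = datetime(2016, 12, 31, 12, 0, tzinfo=timezone.utc)
SMEAR_SECONDS = 86400.0

def smear_offset(utc_now: datetime) -> float:
    """Seconds by which smeared time trails true UTC at `utc_now`."""
    elapsed = (utc_now - SMEAR_START).total_seconds()
    if elapsed <= 0:
        return 0.0
    if elapsed >= SMEAR_SECONDS:
        return 1.0
    return elapsed / SMEAR_SECONDS   # grows linearly from 0 to 1 second

# Halfway through the window (midnight UTC), the smeared clock is 0.5 s behind.
print(smear_offset(datetime(2017, 1, 1, 0, 0, tzinfo=timezone.utc)))  # 0.5
```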
I've never worked on systems that required absolute timing accuracy to better than a few minutes. The only date/times were timestamps on human activity, on computers whose clocks were set by humans, not by NTP.
> including applications that require a certain level of accuracy (e.g. processing pricing feeds from financial institutions)<p>Should leap seconds matter then? As long as you are synchronizing with the same source of truth as the financial institution's back-end, both clocks should tell the same time.
Leap smear was invented to make this easier on programmers and others: <a href="https://developers.google.com/time/smear" rel="nofollow">https://developers.google.com/time/smear</a>
I've always wondered that, too. There are fields where exact second-level precision is required, but virtually everyone else is OK with systems being up to a few seconds off here and there.
> If I create a date object of 1970-01-01T00:00:00 and repeatedly add 86400 seconds, should I not end up with a date/time that is no longer midnight?<p>Of course, and that’s why you shouldn’t do that. I don’t know about other languages, but if you want to advance a certain number of days, you don’t just add seconds; any code review would pick that apart.<p>You add (or subtract) date components, meaning you specify a day, week or whatever unit (rough example below).<p>Do I misunderstand the question?
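To illustrate what I mean by date components (a Python example; the US Eastern DST dates are just a convenient case where calendar arithmetic and adding raw seconds diverge):

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")
noon = datetime(2024, 3, 9, 12, 0, tzinfo=tz)   # day before US DST starts

# Calendar arithmetic: advance one day on the wall clock.
next_day = noon + timedelta(days=1)

# Adding raw seconds: convert to UTC, add 86400 real seconds, convert back.
# That is not one calendar day here, because clocks spring forward overnight.
plus_86400 = (noon.astimezone(timezone.utc)
              + timedelta(seconds=86400)).astimezone(tz)

print(next_day)    # 2024-03-10 12:00:00-04:00
print(plus_86400)  # 2024-03-10 13:00:00-04:00
```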