This is (seriously) part of my retirement planning. I'll be mid-50s when this hits, and I have enough low-level system knowledge to be dangerous. In about 15 years, I'll start spinning up my epochalypse consultancy, and I'm expecting a reasonable return on investment from verifying systems as 2038-compliant.
This is not a problem for 20 years from now; I've already had to find and fix 2038 bugs. There was an OpenSSL bug (now fixed) in which cert begin- and end-date math was done in 32 bits: certs with an end date a little after 2038 would fail when compared with the current time.<p>Fortunately for me, a fixed OpenSSL was already available by the time I'd found the bug.
> statx() will serve as the year-2038-capable version of the stat() family of calls<p>Does this seem horrible to anyone else? Why not fix stat()? Does this syscall have to be so highly preserved even when it will be broken?<p>One of the advantages of the OpenBSD approach of being able to make intrusive changes is that their time_t was made 64-bit in the 5.5 release, back in 2014.<p><a href="https://www.openbsd.org/55.html" rel="nofollow">https://www.openbsd.org/55.html</a><p>Admittedly this is much harder for Linux, as they can't make the change and verify it in a single place, due both to the separation of kernel and userland and to the fact that Linux has many distros.
Is there a reason why they decided to store time as seconds from 1970? In a 32-bit integer, no less. It seems like basic logic would have led the original designers to make it at least 64 bits so that you'd never overflow it (with a 64-bit time we'd be good until the year 292277026596).<p>64 bits would also allow you to cover the entirety of history, all the way back to 13.7 billion years ago when the Universe came into existence, but instead the UNIX time format is shackled to within ~68 years of 1970.
Let's move to Urbit's 128-bit system: <a href="https://github.com/urbit/urbit/blob/master/include/vere/vere.h#L762" rel="nofollow">https://github.com/urbit/urbit/blob/master/include/vere/vere...</a><p><pre><code> /* Urbit time: 128 bits, leap-free.
**
** High 64 bits: 0x8000.000c.cea3.5380 + Unix time at leap 25 (Jul 2012)
** Low 64 bits: 1/2^64 of a second.
**
** Seconds per Gregorian 400-block: 12.622.780.800
** 400-blocks from 0 to 0AD: 730.692.561
** Years from 0 to 0AD: 292.277.024.400
** Seconds from 0 to 0AD: 9.223.372.029.693.628.800
** Seconds between 0AD and Unix epoch: 62.167.219.200
** Seconds before Unix epoch: 9.223.372.091.860.848.000
** The same, in C hex notation: 0x8000000cce9e0d80ULL
**
** New leap seconds after July 2012 (leap second 25) are ignored. The
** platform OS will not ignore them, of course, so they must be detected
** and counteracted. Perhaps this phenomenon will soon find an endpoint.
*/</code></pre>
I wonder how much attention this will get from the general public and non-technical managers? After all, programmers predicted doom for Y2K, and then "nothing happened".<p>This is almost the same situation, except, I assume, slightly less understandable to a non-programmer (you have to understand seconds-since-1970 and why we'd do that instead of storing the date as text, powers of two, and the difference between 32-bit and 64-bit).
Speaking of the 2038 bug, I'm impressed with Paul Ryan's rhetoric [0]<p>“I asked CBO to run the model going out and they told me that their computer simulation crashes in 2037 because CBO can’t conceive of any way in which the economy can continue past the year 2037 because of debt burdens,” said Ryan.<p>I love politicians.<p>[0]-<a href="http://www.cnsnews.com/news/article/ryan-debt-track-hit-800-percent-gdp-cbo-cant-conceive-any-way-economy-can-continue-past" rel="nofollow">http://www.cnsnews.com/news/article/ryan-debt-track-hit-800-...</a>
I wonder how DRM anti-circumvention laws will mix with this: you have a locked-down device you use, depend on, and know is defective, but you are not allowed to hack the device to fix it.
Despite the cause being the 32-bit UNIX time range running out in 2038, problems will become apparent much sooner. As with the Y2K issue, in ~2031 (or sooner), systems that track expiration dates or contract terms will start to run into dates past 2038. As 2038 approaches, more systems will be affected (there are relatively fewer expiration dates 7 years out vs. 5 or 3).<p>The effects of this problem are closer than they seem: only 14 years away, or less.
Is 2038 the end of a signed 32-bit int? If so, can't we just make it unsigned and buy ourselves another 68 years or so? I don't know how much of an issue not being able to represent times before 1970 is, but for timestamps that doesn't seem like it would be a problem.
> BSD-based distributions have the advantage of being able to rebuild everything from scratch, so they do not need to maintain user-space ABI compatibility in the same way.<p>I don't understand, not knowing much about BSD. Is this an LTS/support thing?
Can someone explain?
The Newton (Apple's old PDA) had a similar problem in 2010 [0]. In short, while the base system and C++ interfaces used 32-bit unsigned ints with a base of 1904-01-01, NewtonScript uses 30-bit signed ints, with a working base of 1993-01-01, overflowing in 2010. The fix was a binary patch that changed the time bases.<p>0: <a href="http://40hz.org/Pages/Newton%20Year%202010%20Problem" rel="nofollow">http://40hz.org/Pages/Newton%20Year%202010%20Problem</a>
Surprisingly, when Googling for 2038 &amp; FreeBSD, the 2nd highest recommended search result was:<p>2038年問題 freebsd<p>I do not speak, write, or search for things using Chinese characters. It seems this problem must have been heavily Googled by Chinese speakers - why else would it have popped up in my search recommendations?<p>Btw: Google Translate tells me 年問題 means "year problem"<p>Perhaps this information was important for ensuring the safety of Kylin, which started out as a sort of Chinese DARPA-style project to get the state off of MS Windows. Kylin was announced in 2006. It was supposedly based on FreeBSD 5.3.<p>Strange thing is, Kylin later became known to use the Linux kernel (with an Ubuntu influence). Google search recommendations, which should reflect a recent volume of searches, ought then to suggest "2038年問題 linux" rather than "2038年問題 freebsd" - maybe some of those FreeBSD-based Kylin systems are still being heavily used.<p>Or perhaps there are a lot of embedded systems being produced in China which use FreeBSD.
On a side but related note, I don't understand why many programming languages and databases don't have a positive and negative infinity date placeholder/token value that is standardized and cross platform. Negative infinity date is "past", positive infinity date is "future". This would solve the common problem of what date to use when you are trying to talk about unknown date in the future or unknown date in the past, rather than using weird placeholders like 9999-12-31 or 1900-01-01 or other magic numbers.
Using a 64 bit unsigned integer with nanosecond resolution gives us 585 years (2^64/1e9/60/60/24/365) after 1970 to come up with a new format. This, combined with using some other format to describe dates before 1970, seems like a sensible solution to me.
The problem is not with developers or tech-savvy people. Everyone will know about this by then, and solutions will be applied. The problem is with end users, who will only realize it after the shit hits the fan and their fridge goes crazy or there's a car crash.
So what are we calling this one, Y2K38?<p>I've heard people talk about the risk to cars, but what other kinds of embedded systems will still be in use after 20 years? Maybe certain industrial machines?
For the others curious: it seems JavaScript handles dates as 64-bit floating-point counts of milliseconds since the epoch, which have a maximum safe integer of 9007199254740991.<p>The highest date I could make with node+chrome was 'Dec 31 275759', which cozies up pretty close to that (8639977899599000).
Can someone ELI5 this please? The only thing that seems comparable that I know of was the "Y2K bug" - but reading through this it seems like this is actually a big problem - as opposed to the techno-illiterate panic of Y2K.<p><i>That work, he said, is proceeding on three separate fronts</i><p>I can't read that without thinking of the turbo encabulator.
Here I thought it was a post on the singularity. It might be ironic if all the AI starts running into all sorts of 2038 bugs and this ends up being a huge issue.
Call me stupid, but I think computing will be much too different by then to worry about this (single-chip OS, or IoT to the level that each hardware component is separate, or something else...).