<i>Few programmers have read the most important paper in this area, Leslie Lamport's "Time, Clocks, and the Ordering of Events in a Distributed System" (1978)</i><p>If you want to be one of the few:<p><a href="http://research.microsoft.com/en-us/um/people/lamport/pubs/time-clocks.pdf" rel="nofollow">http://research.microsoft.com/en-us/um/people/lamport/pubs/t...</a>
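For anyone who wants the paper's central construction before diving in: Lamport's logical clocks fit in a few lines. A minimal sketch (class and method names are mine, not Lamport's):

```python
# Minimal Lamport logical clock: each process keeps a counter,
# increments it on every local event, and on receiving a message
# sets it to max(own, sender's) + 1. This yields an ordering
# consistent with causality, with no physical clock at all.

class LamportClock:
    def __init__(self):
        self.time = 0

    def tick(self):
        """Local event: advance the counter."""
        self.time += 1
        return self.time

    def send(self):
        """Timestamp to attach to an outgoing message."""
        return self.tick()

    def receive(self, msg_time):
        """Merge the sender's timestamp on delivery."""
        self.time = max(self.time, msg_time) + 1
        return self.time

a, b = LamportClock(), LamportClock()
t = a.send()   # a's clock becomes 1
b.receive(t)   # b's clock becomes max(0, 1) + 1 = 2
```

The key property: if event x causally precedes event y, then x's timestamp is strictly smaller than y's (the converse does not hold, which is what vector clocks later fixed).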
You don't actually need absolute time for distributed systems to work, only independently verifiable order. Independent verifiability can be achieved through Merkle structures and Merkle proofs. In a way, through the proofs you can communicate to anybody your "perspective" on the order of events. If you get several "perspectives" you can therefore independently infer the absolute order of events.<p>You would still be left with race conditions between the communicating nodes, but that is something you can't get around anyway.
"A typical laptop or server, left without any type of external time conditioning, will drift out of sync within minutes and after a few hours may already be several minutes away from good synchronization with other systems."<p>This doesn't match my experience at all. I've had smartphones disconnected from the network for weeks at a time without drifting "several minutes away" from the consensus time. Drift is a thing, but it seems like that estimate is several orders of magnitude larger than anything I've seen in practice.
Kind of what got Einstein started on special relativity: is it possible to exactly synchronize railroad station clocks? And the answer is no. There will always be a causality uncertainty due to the finite speed of signals. Sequencing could depend on the location and velocity of the observer. On Earth the planet-wide uncertainty would be a fraction of a second. But across planets you'd have minutes, and so on.
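The "fraction of a second" and "minutes" figures fall straight out of light travel times, which set a floor on how well separated clocks can agree on "now". The distances below are round numbers:

```python
# Light travel time bounds on simultaneity, with rough distances.
c = 299_792_458             # speed of light, m/s
earth_diameter = 12_742e3   # m
mars_closest = 54.6e9       # m, Earth-Mars at closest approach
                            # (the actual distance varies widely)

across_earth_s = earth_diameter / c
earth_mars_min = mars_closest / c / 60
print(f"across Earth: {across_earth_s * 1000:.0f} ms")
print(f"Earth-Mars:   {earth_mars_min:.1f} min")
```

About 40 milliseconds across the planet, and around three minutes to Mars even at closest approach.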
I was at one of the open tech talks at Xerox PARC that Leslie gave where the discussion of time synchronization came up. Xerox had a naming system called Grapevine[1] and it used timestamps in a number of places. I was working at Sun and dealing with time issues in RPC, and came away from the tech talk understanding that "perfect time keeping" was like "perfect security": if you could assume you had it, a whole host of problems became much easier to solve.<p>The point that the author makes about needing higher and higher precision though got me thinking about ways one might achieve that. I'm wondering if you could actually provide a master clock, a 1 GHz carrier, over network cables that originate from the master clock. If the master clock is synchronized with the bit stream, and you're seeing the bit stream locally, you first calibrate your clock with the master and then drive it from the bit stream, and you should be in sync up to cable and time-of-flight delays.<p>[1] <a href="http://web.cs.wpi.edu/~cs4513/d07/Papers/Birrell,%20Levin,%20et.%20al.,%20Grapevine.pdf" rel="nofollow">http://web.cs.wpi.edu/~cs4513/d07/Papers/Birrell,%20Levin,%2...</a>
Most of what we do in distributed systems is provide the <i>illusion</i> of a consistent, shared memory space. We literally pretend that we can violate the laws of physics, to make the programming for everyone else simpler.
> If an interval time needs to be measured, then rdtsc, or a library wrapped around it, is the best solution, whereas getting the system time for use in log files probably ought to be carried out using clock_gettime() with a FAST option; or, if clock_gettime() is not available, then gettimeofday().<p>If I remember correctly, RDTSC suffers from other issues, like being affected by CPU throttling, and the counter might also differ if your process is re-scheduled onto another core.
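The interval-vs-wall-clock distinction the article draws shows up in higher-level languages too. A sketch using Python's time module, where monotonic_ns() plays the role of the interval clock and time() the log-timestamp clock:

```python
# Interval vs. wall-clock time, mirroring the article's
# clock_gettime()/gettimeofday() distinction.
import time

# Monotonic clock: never steps backwards under NTP corrections or
# sysadmin changes; the right tool for measuring intervals.
start = time.monotonic_ns()
total = sum(range(1_000_000))              # some work to time
elapsed_ms = (time.monotonic_ns() - start) / 1e6
print(f"elapsed: {elapsed_ms:.3f} ms")

# Wall clock: can jump forwards or backwards; fine for log
# timestamps, dangerous for measuring durations.
print("logged at:", time.time())
```

(On Linux, time.clock_gettime(time.CLOCK_MONOTONIC) is also exposed directly, but monotonic_ns() is portable.)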
Related: <a href="https://www.youtube.com/watch?v=wteiuxyqtoM" rel="nofollow">https://www.youtube.com/watch?v=wteiuxyqtoM</a><p>(Showing clearly why simultaneity does not exist in an absolute sense.)
It's even worse. Not only is time an illusion but your observation is being sabotaged by sysadmins that make one of the nodes of your distributed system jump back in time several hours (by changing the time zone info) and other naughty deeds.
It'll be fascinating to watch how time unfolds on the blockchain.<p>Some will strive for absolute standards, while others maximize the net benefits of relativity.
Got fired from my last job for saying this. Such a shame I don't have a PhD like their CSS coder.<p>My understanding is that the root cause of the time problem is poor education.<p>Basic definition of time: time is the accident of the accident; the same causes give the same effects, and since some of those effects are irreversible, they define an ordered direction of events. Time is like temperature: it is measured relative to the pulsation of a harmonic oscillator. A closed, absolute system time does not exist. Since Einstein we also have to decorrelate physical speed from the apparent speed due to geometry (the Cherenkov effect: yes, you can outrun the speed of light in a medium by playing on this).
Since quantum mechanics we also know that time measurements carry an irreducible uncertainty, bounded by the energy-time relation dE·dt >= hbar/2.<p>Hence a lot of problems: with poor rigor and understanding (which amplify the effects above), time becomes a nightmarish physical beast. And you are stuck with idiots who even think that the colour of your skin influences your quality as a coder.<p>So here is my understanding of coders' problem with time. The mindset of the coders and bosses I have met is stuck in the 1800s, where statistical physics, the dual nature of light, quantum mechanics, and Einstein's relativity are known as boring Trivial Pursuit questions, but no one cares about the implications.<p>They are bad at separating geometry from physics, but most of all they are stuck in the wrong physical world.<p>They live in a world of determinism where they would rather compute the position and speed of every molecule in a gas than use the 'impure' but perfect statistical law.<p>For time they are puzzled:
- time is the length of a vector (how much time since);<p>- time is a point, deduced from an implicit zero origin when taking a length;<p>- time is a 1D vector, so it behaves like a scalar, so it must be a scalar (computing a resulting size by adding/subtracting lengths and vectors);<p>- there is a lot of politics involved in measuring time (GMT, time zones, calendars, leap seconds), and politics is buggy, thus it results in bugs;<p>- Heisenberg DOES exist; they never care to measure the error and think doing so is wasted time;<p>- my time as a coder is always free;<p>- time cannot be uncertain, since we have these high-resolution clocks (the exactitude of time is such that we never encounter uncertainty, and our code executes in 0 s);<p>- GPS is a measuring instrument that magically corrects all this, because it is perfect and has no errors, being the USA's godly, Star Trek-y thing in the sky;<p>- acausality cannot happen locally, despite asymmetries in topology (slow vs. fast router, short vs. long path).<p>In short, most coders are insanely crippled by their own culture of ignorance and their self-importance.<p>Common scientific knowledge that is more widely understood by McDonald's employees has still not reached the brains of our elite architects and coders. And time, with frequency, is one of the most important dimensions of all applications.<p>The question I wonder about is "how?". How is it even possible that the mass recruitment of coders is so biased that it selects overconfident thinkers lacking in curiosity so much that they can comfortably blindfold themselves?<p>If the gap in scientific domains is that great, and reflects equally arrogant gaps in other domains ...
then I think of the creeping lack of culture in "business", "ethics", "legal", "cryptography", "probability", "algebra" ...<p>I have provoked enough computer pros, and gathered enough stats, to know for sure that their level of confidence is dangerously inversely correlated with their level of actual knowledge.<p>I am very confident that IT has a corporate-culture bias of valuing arrogant ignorants who "can do it" over careful thinkers who may say "it will never be doable" **<p>** yes, the Cretan paradox revisited