Neal Stephenson. "Mother Earth Motherboard" WIRED Dec 1996: 66 pages. Print.<p><a href="http://www.wired.com/1996/12/ffglass/" rel="nofollow">http://www.wired.com/1996/12/ffglass/</a><p>Without a doubt my favorite issue of any magazine ever. I still have my copy somewhere.
That's a great article ... I was heavily involved with terrestrial fiber systems in the '90s and, while speeds and the isolation of DWDM channels have both improved, it's amazing to me how familiar it all feels. When they talk about sub-sea amplifiers (and many of the terrestrial ones), they're not talking about a device that amplifies the signal electronically. An Erbium-Doped Fiber Amplifier (EDFA) uses a pump laser to amplify the signal optically, which is why the sub-sea cable adds so little latency. If you take the same optical signal, convert it to electrons, amplify it, and convert it back to an optical signal, you'll see the added latency discussed in the article.<p>One small nit ... dark fiber represents unused capacity. In several places the article says something like "dark fibre signals", which is incorrect: dark fiber carries no signal, while lit fiber does.<p>The last thing I'll mention is that these systems are obviously single-mode fiber. The laser power feeding each channel is probably around 12 dBm in the 1550 nm spectrum (per channel). If you look into the end of one of these fibers while it's lit from the other end, you'll end up with burned spots on your retina. Wiggle it around a bit and you'll have squiggle-shaped burns. So if you're ever around fiber equipment, don't look directly at the fiber ends (or into the connectors). You can get laser safety glasses cheaply ... save your sight!
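For a sense of what that 12 dBm per channel means in absolute terms, here's a quick sketch of the standard dBm/milliwatt conversion (the 80-channel count is just an illustrative assumption, not from the comment above):

```python
import math

def dbm_to_mw(dbm):
    """Convert power from dBm to milliwatts: P(mW) = 10^(dBm/10)."""
    return 10 ** (dbm / 10)

def mw_to_dbm(mw):
    """Inverse conversion: dBm = 10 * log10(P / 1 mW)."""
    return 10 * math.log10(mw)

per_channel_mw = dbm_to_mw(12)  # ~15.8 mW per DWDM channel
print(f"12 dBm per channel = {per_channel_mw:.1f} mW")

# With, say, 80 DWDM channels on one fiber (hypothetical count),
# the aggregate launch power climbs fast:
total_dbm = mw_to_dbm(80 * per_channel_mw)
print(f"80 channels = {total_dbm:.1f} dBm total")
```

Roughly 16 mW doesn't sound like much, but concentrated into a ~9 micron core and invisible at 1550 nm, it's more than enough to damage a retina before you know anything happened.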
I'd love to see a similar article on the economics of these systems. The only paying customers are in the datacenter and the end users, but I was under the impression that peering agreements were free as long as the bandwidth is balanced between the two parties. Surely someone must be paying for this massive infrastructure. Is it a system of back-to-back recharging of bandwidth costs?
10 terabits per second on a single strand.<p>Amazing. I think if people back in 2000 had realized the capacity coming along in fiber, a site like YouTube would have been obvious to many more of them. Back then, I think a lot of people thought it would be cool to have a video distribution site... but how the heck would you pay for it? I remain amazed and confused by how Google could somehow afford to embed video on every random website -- becoming the video provider of the entire Internet. But I guess for them it was a simple formula of using their excess capacity for something.<p>Amazing.
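A rough back-of-envelope shows why 10 Tbps per strand changes the video economics (the 5 Mbps per-stream bitrate is my assumption for an HD stream, not a figure from the article):

```python
strand_bps = 10e12      # 10 Tbps on a single fiber strand (from the article)
stream_bps = 5e6        # assumed bitrate of one HD video stream (illustrative)

streams = strand_bps / stream_bps
print(f"~{streams:,.0f} simultaneous HD streams per strand")
```

Two million concurrent viewers per strand, and a cable bundles many strands, which is a very different cost picture from the dial-up-era intuition.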
Slightly meta comment, but this is the kind of native advertising I'd like to see in future. A genuinely interesting article that happens to be sponsored by an ISP.
Seems pretty fast:<p>"Talking of which, John looked up the latency of the two Atlantic cables; the shorter journey clocks up a round trip delay (RTD) of 66.5ms, while the longer route takes 66.9ms. So your data is travelling at around 437,295,816 mph. Fast enough for you?"<p>Too fast. Like breaking the law fast.<p>Edit: oops, my bad. That's mph, and c is about 186,282 miles per second. So this is about 0.65c - nothing to see here, move along!
Formula 1 is mentioned in the article as caring a lot about latency. Why is this so? Is it really essential that race information is distributed quickly?