I certainly wouldn't mind returning to a world where inexpensive, easily labelled optical discs could hold relatively large amounts of data.<p>When HDDs were generally no more than a few gigabytes, I could stick a compressed backup of my PC on 1-2 CDs, write a date on them, and shove them in a box. It was incredibly convenient and offered good peace of mind. In fact, I just recently recovered some important data from the late 90s off one such CD.
<p><pre><code> [...]using a two-light-beam method, with different
colours [...]
The two beams were then overlapped. As the second
beam cancelled out the first in its donut ring, the
recording process was tightly confined to the centre
of the writing beam.
</code></pre>
How do they get the two beams (of different frequency) to cancel each other out?
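My guess (the article doesn't spell it out) is that this works like STED microscopy: the donut-shaped second beam doesn't cancel the first interferometrically at all, it optically inhibits the photo-reaction wherever it's bright, so writing only happens in the donut's dark centre. A toy calculation of the usual STED-style spot-size scaling, with illustrative numbers that are not from the article:<p><pre><code> import math

def effective_spot_nm(wavelength_nm, na, i_over_isat):
    """Effective writing-spot diameter under a donut inhibition beam.

    STED-style scaling: d = lambda / (2 * NA * sqrt(1 + I/I_sat)).
    Raising the donut beam's intensity I relative to the inhibition
    threshold I_sat shrinks the spot far below the diffraction limit.
    """
    return wavelength_nm / (2 * na * math.sqrt(1 + i_over_isat))

# Illustrative numbers only: 500 nm light through an NA 1.4 lens.
print(effective_spot_nm(500, 1.4, 0))    # ~179 nm: diffraction limit alone
print(effective_spot_nm(500, 1.4, 380))  # ~9 nm: with a strong donut beam
</code></pre>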
Sure, you can put 1,000TB on it, but what's the read speed?<p>There is a reason both the PS4 and Xbox One require you to install games to the hard drive even though the Blu-ray disc can hold all that data and more: high-density optical formats are slow to read.<p>At 16x, Blu-ray only reads at 72 MB/s.
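For a sense of scale, a quick sketch (assuming, generously, that a drive could sustain that 72 MB/s across the whole disc):<p><pre><code> capacity = 1000e12                       # 1,000 TB, per the article
rate = 72e6                              # 16x Blu-ray: 72 MB/s
print(capacity / rate / 86400, "days")   # ~160 days to read one full disc
</code></pre>
So even a hundredfold speedup over today's Blu-ray drives would still leave you at about a day and a half per full disc.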
So according to Brewster Kahle's estimates for storing all US phone calls (<a href="http://blog.archive.org/2013/06/15/cost-to-store-all-us-phonecalls-made-in-a-year-in-cloud-storage-so-it-could-be-datamined/" rel="nofollow">http://blog.archive.org/2013/06/15/cost-to-store-all-us-phon...</a>), it would only take 272 of these theoretical DVDs per year!<p>Of course there's a big tradeoff in latency (on the order of tens of seconds to switch DVDs) and throughput (unknown), but properly indexed I'm sure it would still be extremely useful, and extremely cheap.<p>Imagine fitting all of that data in this little box: <a href="http://gizmodo.com/5321357/sony-finally-popping-400+disc-blu+ray-megachanger-so-dont-toss-your-dvds-yet" rel="nofollow">http://gizmodo.com/5321357/sony-finally-popping-400+disc-blu...</a>
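The arithmetic behind that, taking the 272-disc figure at face value (which implies Kahle's estimate works out to roughly 272 PB of calls per year):<p><pre><code> calls_per_year = 272e15       # ~272 PB/year, implied by "272 discs"
disc = 1000e12                # one hypothetical 1,000 TB disc
print(calls_per_year / disc)  # 272.0 discs/year

# The linked 400-disc changer would hold about 1.5 years of this,
# provided the index lives somewhere with better than ~10 s latency.
print(400 * disc / 1e15, "PB per changer")  # 400.0 PB
</code></pre>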
It's too bad they don't address the media question. It's great to be able to write 9-nanometer dots, but if your media fills them back in after a while, well, it's not as useful.<p>I've got media from the '80s (gold-backed) that is still readable with no errors, and some that is aluminum-backed (silver) and is readable only with error recovery.
Using this with archival-grade DVD-Rs would make big science much cheaper, more accessible, and more reproducible. Imagine the LHC fitting all their data into a briefcase, and just sending it to whoever asked for it. Of course, the real problem would be write bandwidth.<p>Using archival-grade media and write redundancy + ECC, you could decrease what you put on a disc to just 10TB and probably hit a sweet spot between massive storage, exceptional reliability, and increased bandwidth.
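To make the redundancy idea concrete, here's a toy single-parity scheme in the spirit of RAID-4, not the Reed-Solomon-style ECC a real disc format would use; at a 100:1 raw-to-payload ratio you could afford far stronger codes than this:<p><pre><code> def xor_parity(blocks):
    """One parity block over k equal-sized data blocks."""
    parity = bytes(len(blocks[0]))
    for b in blocks:
        parity = bytes(x ^ y for x, y in zip(parity, b))
    return parity

def recover(blocks, parity):
    """Rebuild the single missing (None) block from survivors + parity."""
    missing = blocks.index(None)
    acc = parity
    for i, b in enumerate(blocks):
        if i != missing:
            acc = bytes(x ^ y for x, y in zip(acc, b))
    return acc

data = [b"LHC ", b"run ", b"2012", b"...."]
p = xor_parity(data)
damaged = [data[0], None, data[2], data[3]]   # one block rotted away
assert recover(damaged, p) == b"run "
</code></pre>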
Quick back-of-the-envelope:
1 of these ~= 1.33 million CDs (750 MB each) or ~213,000 DVDs (4.7 GB each).
The increase from a CD to a DVD was a factor of ~6.3.
The DVD-to-this jump is ~213,000x, about 34,000 times that earlier factor of increase... but no word on the speed of reading/writing in this article, which I assume will be slow.
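The same sums as a quick sketch, so the factors are easy to check:<p><pre><code> cd, dvd, new_disc = 750e6, 4.7e9, 1000e12  # capacities in bytes, roughly
print(new_disc / cd)                   # ~1.33 million CDs
print(new_disc / dvd)                  # ~213,000 DVDs
print(dvd / cd)                        # ~6.27: the CD -> DVD jump
print((new_disc / dvd) / (dvd / cd))   # ~34,000x bigger jump this time
</code></pre>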
I can store 1024 terabytes on a DVD today, entirely in software, no innovative hardware based on new engineering principles necessary:<p><pre><code> dd if=/dev/zero bs=1024 count=1T | pv -c -W | gzip -c9 | pv -c -W > big.gz
</code></pre>
Strictly speaking, DEFLATE tops out around 1032:1, so big.gz will still be about a terabyte; run it through gzip a second time and it will fit on a DVD with plenty of room to spare.<p>You may need to sudo apt-get install pv if you don't have that incredibly useful utility already. You may also need several hours of CPU time...<p>EDIT: Downvoted within two minutes? HN needs to get a sense of humor...
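Checking the joke's homework, assuming both gzip passes land near DEFLATE's ~1032:1 cap (the second pass roughly does, since the first pass's output is extremely repetitive):<p><pre><code> out = 2**50                    # bs=1024 * count=1T: 1 PiB of zeros
ratio = 1032                   # DEFLATE's best-case ratio, ~1032:1
print(out / ratio / 2**40)     # first pass: ~0.99 TiB, no DVD in sight
print(out / ratio**2 / 2**30)  # second pass: ~0.98 GiB, fits a DVD easily
</code></pre>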