Something about these numbers doesn't quite add up. The reason cited for dismantling the machine is that "it isn't energy-efficient enough to make the power bill worth it." But the supercomputer draws 2345 kilowatts, which at US prices of around 15 cents per kWh works out to about $352/hour in energy costs. By comparison, the $120 million cost of building Roadrunner, amortized over the four years it's been running, comes out to about $3,400/hour. The article makes it sound like the power bill is costing them a fortune, but at roughly $3 million a year it isn't much at all next to the $120 million price tag.
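For anyone who wants to check the arithmetic, here's a quick sanity-check sketch in Python. The 2345 kW draw, $0.15/kWh price, $120M build cost, and four-year lifetime all come from the figures above; everything else is just unit conversion.

    # Sanity check of the cost figures above.
    power_kw = 2345            # reported power draw
    price_per_kwh = 0.15       # rough US electricity price
    build_cost = 120e6         # reported construction cost
    years_running = 4

    energy_per_hour = power_kw * price_per_kwh          # ~$352/hour
    energy_per_year = energy_per_hour * 24 * 365        # ~$3.1M/year
    amortized_per_hour = build_cost / (years_running * 365 * 24)  # ~$3,400/hour

    print(f"Energy: ${energy_per_hour:,.0f}/hour, ${energy_per_year:,.0f}/year")
    print(f"Amortized build cost: ${amortized_per_hour:,.0f}/hour")

Running it gives roughly $352/hour and $3.1M/year for energy against $3,425/hour amortized build cost, which is where the numbers in the comment come from.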
It's interesting to read articles like this, since I came from the bizarro not-so-HPC world where "we" (an academic department) didn't pay for electricity (the university did!), so there was no incentive at all to retire obsolete hardware. Right up until I left many months ago, I was running jobs on a cluster of 84 servers on death's door, each with dual-processor (not dual-core!) Nocona Xeons or Opteron 240s.

"Oh, but there's a cost to supporting obsolete hardware!" Yeah, sure, but the person supporting everything was me, and I was a constant cost to keep around whether I supported crappy obsolete hardware or shiny new hardware.
"At more than one quadrillion floating point operations per second..." - how fast is that when cracking typical passwords or mining bitcoins? If we have an encrypted drive, how long would it take to find the passphrase with it?