For all the sophisticated equipment on this rover, the computer seems surprisingly underpowered. Surely they are going for quality over quantity, but these specifications are somewhat lackluster: 256MB of DRAM and a 300 MIPS processor. Maybe they just spec'd it out for only what they need?

Anyone have insight into this? I'm sure there is a very good reason for it; I'd just like to know.

http://marsprogram.jpl.nasa.gov/msl/mission/rover/brains/

http://en.wikipedia.org/wiki/Mars_Science_Laboratory#Rover
The CPU on board is a RAD750 (http://en.wikipedia.org/wiki/RAD750), a radiation-hardened PowerPC 750. The PowerPC 750 launched in 1997 and the RAD750 in 2002, so radiation hardening took about five years.

The MSL project started in 2004, so the RAD750 was probably the best radiation-hardened processor available for the job at the time.
Kids these days! Do you know how much compute power was on the moon missions? http://en.wikipedia.org/wiki/Apollo_Guidance_Computer Or the Space Shuttle? http://en.wikipedia.org/wiki/AP-101

Strip away all the crap (Linux, Windows, OSX, they're all the same in this respect) and 300 MIPS/256MB would be plenty of power for anyone. The old Amiga delivered a multitasking GUI in 256KB of memory and approximately 1 MIPS!
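To put rough numbers on that comparison (using the approximate figures quoted in this thread, not benchmarked values):

    # Back-of-the-envelope comparison using the figures quoted in this
    # thread; MIPS and memory sizes are approximate, not benchmarks.
    amiga = {"mips": 1,   "ram_bytes": 256 * 1024}         # ~256KB, ~1 MIPS
    rover = {"mips": 300, "ram_bytes": 256 * 1024 * 1024}  # 256MB, ~300 MIPS

    print("CPU:    ~%dx the Amiga" % (rover["mips"] // amiga["mips"]))            # ~300x
    print("Memory: ~%dx the Amiga" % (rover["ram_bytes"] // amiga["ram_bytes"]))  # ~1024x

Three orders of magnitude more memory than a machine that ran a full multitasking GUI is a lot of headroom for control software.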
The reason is typically the need for radiation hardening. Operating in space, or on the surface of Mars, outside the radiation shield that is the Earth's magnetic field, is a very different, and very harsh, environment for electronics.

http://en.wikipedia.org/wiki/Radiation_hardening

Pertinent quote from the above Wikipedia article:

> Due to the extensive development and testing required to produce a radiation-tolerant design of a microelectronic chip, radiation-hardened chips tend to lag behind the cutting-edge of developments.
1. As mentioned by other commenters, radiation hardening takes several years.

2. The government procurement bureaucracy takes some time.

3. Obviously the rover has to be designed, built, and extensively tested on the ground, possibly through several iterations depending on the results of the tests. I'm guessing that since the rover has to be mostly software-driven due to the lightspeed delay, software is a dependency of basically everything the rover does, and since the computing hardware is obviously a dependency of the software, it'd almost have to be selected very early in mission development.

4. Getting stuff to Mars is really expensive. I imagine that most of the expense is R&D for radiation hardening and fitting within the overall volume/weight/power budget. So the amount of memory, for example, is probably just what's needed to get the job done.

5. To me, 256 MB seems like an enormous amount of memory for an embedded system. Most of the memory usage of everyday computers is due to all the graphics and multitasking. (Think about the sheer number of scripts and images in a typical webpage, times the number of tabs you need to open to consume all your memory; rough numbers in the sketch below.) All of NASA's fancy GUIs, like the simulation they showed during landing, run on ground computers, not on the rover.

All in all, I thought the computer was enormously powerful compared to previous space missions.
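To make point 5 concrete, here's a toy memory-budget sketch. The specific numbers (2-megapixel frames, eight buffered images, 10MB per browser tab) are illustrative assumptions of mine, not actual flight-software figures:

    # Toy memory-budget arithmetic for point 5. All figures are
    # illustrative assumptions, not real MSL flight-software numbers.
    MB = 1024 * 1024

    # A raw 2-megapixel greyscale frame at 2 bytes per pixel:
    frame_bytes = 1600 * 1200 * 2        # ~3.7 MB per frame
    buffered_frames = 8                  # pretend we hold 8 frames in RAM
    imaging_budget = frame_bytes * buffered_frames

    # A typical desktop browser tab full of scripts and images (assumed):
    tab_bytes = 10 * MB
    tabs = 25

    print("Imaging buffers: %.1f MB" % (imaging_budget / MB))    # ~29.3 MB
    print("Browser tabs:    %.1f MB" % (tab_bytes * tabs / MB))  # 250.0 MB

Even with generous buffering assumptions, the imaging workload is a small slice of 256 MB, while an ordinary desktop browsing session chews through the whole thing.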
I went to a good school for CS, where some of the professors did work for private industry and interesting research, and the question of why hardware like this can seem so dated came up a few times. One professor did formal verification of hardware and the like (for example, Toyota might want something close to a formal, mathematical proof that the chips and software in their cars can never cause uncontrolled acceleration).

Basically, if you dump a lot of money into making sure something works, and works exactly as intended, then as long as it can do the job, you don't change *anything.* You don't stick in one extra memory chip without redoing your formal verification and tolerance testing. Very expensive people will have to run very expensive processes to prove that even the most minor change doesn't compromise certain key properties, when you can't afford for those properties to be compromised for any reason whatsoever.
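As a toy illustration of that last point (my own sketch, nothing like the actual tooling used for flight or automotive verification): exhaustively check a safety invariant over every reachable state of a tiny made-up controller model. The point is that the proof covers the entire state space, so changing the design at all, even adding one state variable, means re-enumerating and re-checking everything.

    # Toy "formal verification" by exhaustive state enumeration.
    # The model is a made-up two-variable controller, not real hardware.
    from collections import deque

    MAX_CMD = 3
    INITIAL = (0, 0)  # (commanded_throttle, actual_throttle)

    def transitions(state):
        """All possible next states of the toy controller."""
        cmd, actual = state
        moves = []
        if cmd < MAX_CMD:
            moves.append((cmd + 1, actual))   # driver requests more throttle
        if cmd > 0:
            moves.append((cmd - 1, actual))   # driver requests less throttle
        if actual < cmd:
            moves.append((cmd, actual + 1))   # actuator tracks the command upward
        if actual > cmd:
            moves.append((cmd, actual - 1))   # actuator tracks the command downward
        return moves

    def invariant(state):
        """Safety property: actual throttle stays inside its physical range."""
        _, actual = state
        return 0 <= actual <= MAX_CMD

    def verify():
        """Breadth-first search over every reachable state, checking the invariant."""
        seen, queue = {INITIAL}, deque([INITIAL])
        while queue:
            state = queue.popleft()
            if not invariant(state):
                return False, state           # counterexample found
            for nxt in transitions(state):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return True, len(seen)

    print(verify())  # (True, 16): the property holds in all 16 reachable states

Add one more component to the model and the state space multiplies, so the whole (expensive) re-check has to be run again, which is exactly why nobody sticks in that extra memory chip late in the game.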