What kind of computer does the Mars rover use?

August 7, 2012 • 4:57 am

Although we never talk about computers here, I’m absolutely sure that many readers are into them.  And even I was curious to know what kind of computer was onboard the Mars rover Curiosity.  Well, it turns out that, much to my delight (I’ve always been a Macintosh man), it’s an Apple “Airport Extreme”.  A description of both the hardware and software is at Extreme Tech, which says this:

At the heart of Curiosity there is, of course, a computer. In this case the Mars rover is powered by a RAD750, a single-board computer (motherboard, RAM, ROM, and CPU) produced by BAE. The RAD750 has been on the market for more than 10 years, and it’s currently one of the most popular on-board computers for spacecraft. In Curiosity’s case, the CPU is a PowerPC 750 (PowerPC G3 in Mac nomenclature) clocked at around 200MHz — which might seem slow, but it’s still hundreds of times faster than, say, the Apollo Guidance Computer used in the first Moon landings. Also on the motherboard are 256MB of DRAM, and 2GB of flash storage — which will be used to store video and scientific data before transmission to Earth.

The RAD750 can withstand temperatures of between -55 and 70C, and radiation levels up to 1000 gray. Safely ensconced within Curiosity, the temperature and radiation should remain below these levels — but for the sake of redundancy, there’s a second RAD750 that automatically takes over if the first one fails.
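The automatic failover in that last quoted sentence can be sketched as a simple heartbeat watchdog. This is a toy Python illustration only; the actual flight software is VxWorks-based, and the class, method names, and timeout value here are all invented for the example:

```python
import time

class RedundantController:
    """Toy model of the dual-RAD750 arrangement: a backup unit
    takes over when the primary stops producing heartbeats."""

    def __init__(self, timeout):
        self.timeout = timeout          # seconds of silence before failover
        self.active = "primary"
        self.last_heartbeat = time.monotonic()

    def heartbeat(self):
        """Called periodically by the active computer while healthy."""
        self.last_heartbeat = time.monotonic()

    def check(self):
        """Watchdog check: promote the backup if the primary has gone silent."""
        if self.active == "primary" and time.monotonic() - self.last_heartbeat > self.timeout:
            self.active = "backup"
        return self.active

watchdog = RedundantController(timeout=0.05)
watchdog.heartbeat()
assert watchdog.check() == "primary"   # heartbeats arriving: primary stays in control
time.sleep(0.1)                        # simulate the primary locking up
assert watchdog.check() == "backup"    # watchdog fails over to the second RAD750
```

In the real spacecraft the “no heartbeat, promote the spare” logic is of course built into the flight system rather than polled like this, but the idea is the same.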

The piece also describes the instrumentation of Curiosity and how it communicates with Earth (remember that 7-minute delay).

Reader Michael, who found this piece, notes that “the base price for the BAE Systems RAD750 single board computer was $200,000 10 years ago, so I assume it’s nearer $750,000 today. Very tough. Very precisely made.”

The Wikipedia link in the previous paragraph describes the computer:

The RAD750 is a radiation-hardened single board computer manufactured by BAE Systems Electronic Solutions. The successor of the RAD6000, the RAD750 is for use in high radiation environments such as experienced on board satellites and spacecraft. The RAD750 was released in 2001, with the first units launched into space in 2005.

The CPU has 10.4 million transistors, nearly a magnitude more than the RAD6000 (which had 1.1 million). It is manufactured using either 250 or 150 nm photolithography and has a die area of 130 mm². It has a core clock of 110 to 200 MHz and can process at 266 MIPS or more. The CPU can include an extended L2 cache to improve performance.

The CPU itself can withstand 2,000 to 10,000 gray and temperature ranges between –55 °C and 125 °C and requires 5 watts of power. The standard RAD750 single-board system (CPU and motherboard) can withstand 1,000 gray and temperature ranges between –55 °C and 70 °C and requires 10 watts of power.
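As a quick sanity check on those quoted figures (the arithmetic is mine, not from Wikipedia): 266 MIPS at a 200 MHz clock implies the RAD750 retires roughly 1.3 instructions per cycle, plausible for a superscalar PowerPC 750 core.

```python
# Instructions-per-cycle implied by the quoted specs
clock_hz = 200e6   # 200 MHz core clock
mips = 266         # quoted throughput, millions of instructions per second

ipc = (mips * 1e6) / clock_hz
assert round(ipc, 2) == 1.33
```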

The guts:

Credit: Peter Vis

Original photograph from Peter Vis’ site here.

85 thoughts on “What kind of computer does the Mars rover use?”

    1. It ain’t running Apple either. The Operating System is VxWorks from Wind River. It is a specialist proprietary Real Time OS.

      Unix-based operating systems like OSX and Linux do not work well in real time. As for Windows, NASA is not mad.

      1. With a few patches Linux is as good as any other RTOS. QNX is largely POSIX compliant and for the most part behaves like a UNIX system; it has been around far longer than Linux and is still one of the most popular choices for instrumentation.

        1. Even the RTOS-like Linux variants aren’t fully deterministic. But they are good enough for most applications.

          Again, NASA is not mad. Linux variants may suit the ISS, say, but not year-long missions.

        2. It’s a shame QNX and AmigaOS never got together.

          BTW I’m under the impression that QNX is entirely POSIX compliant.

  1. It’s not really an “Apple computer”. The OS is also used on something called the Airport, which is Apple’s name for an access point (i.e. something one connects to a LAN to allow WLAN devices to connect). Nothing really to do with Apple computers as most people think of them.

    1. Agree. Special-built boards have no inherent kinship with commercial boards, other than chips and electricity (electrons). It’s a BAE computer.

  2. Saying it’s an Apple is a bit like saying a dune buggy with a Ford engine is a De Tomaso Pantera because they both use Ford engines.

    I’ll shut up now.

  3. Is there some sort of sophisticated computerology involved in reading that and concluding that it’s an Apple?

    1. Nope, just a very misleading headline.

      There’s no connection in hardware or software to the Airport Extreme.

      It’s actually way slower than the Airport Extreme, which is interesting but unsurprising, as there are very few radiation-hardened CPUs.

      It actually shares much more in common with the GameCube’s main processor, though the GameCube is clocked over 2x as fast.

      http://en.wikipedia.org/wiki/PowerPC_G3#Gekko

      1. Actually, I’m wrong also. The Airport Extreme does use the same OS family, so there is a small software connection.

      2. > interesting but unsurprising, as there’s very few radiation hardened CPUs.

        I recall that a couple of servicing missions ago the Hubble space telescope had a computer upgrade … to a hardened 486 system.

  4. Ha ha, the G3 was the computer I bought when I started undergrad in 1998. That was a great computer.

    1. Oh yeah? I’ve still got a G3 iMac sitting on one of my desks at work. I don’t use it much, but I can’t bear to part with it.

      1. Yep. The OS finally crapped out and my needs (computer music) were best met by Linux shortly thereafter. I think it’s still in my little brother’s room at my parents’ house!

  5. Erm… from the ExtremeTech article, one of the family of the RAD750 (a RAD750 3U) actually looks like this and is not, strictly speaking, an Apple. Apple happen to use a chipset from the same family, but not the same one. Sorry.

    I just checked the BAE catalogue for 2012-2013, no prices unfortunately 🙁

    From the Curiosity spec (press release 2011) the CPUs are 200MHz, and from the BAE catalogue page 8, the 200MHz products are designated as 6U-160s (about 4 times the size of a 3U), so the ExtremeTech article may not have the correct pic either.

    I hope to have reached my nerd-quotient for today 🙂

    1. JC’s post was designed to out the nerds. 😉 Thanks for the pic link. THAT looks a lot more like a hardened computer.

  6. Let’s not forget that the delay in transmission depends on the positions of Earth and Mars, and is constantly changing.

  7. Let’s not forget that the delay in transmission depends on the positions of Earth and Mars, and is constantly changing as each follows its orbit.

      1. A delay in synaptic transmission, I believe. Actually, I had to sign in again, and wasn’t sure if my first post achieved lift-off.

    1. You aren’t a programmer, are you? Maybe look into the phrase “General Purpose Computer”. It may help.

        1. You aren’t an embedded programmer, are you? There are valid reasons FPGAs would be used on a mission like this. I doubt there would be any reprogramming done; too much room for error, I’d think.

        Two points:

        1. How many non-programmers know what a FPGA is?
        2. The MSL does use at least one FPGA in their setup, in the sequencer for the radar altimeter.

        http://ntrs.nasa.gov/search.jsp?R=20110002997

        Some more info on issues encountered using FPGAs on the MSL:
        http://www.nasa.gov/offices/oce/appel/ask/issues/42/42s_challenge_complexity.html

        1. FPGAs are power hounds – not good on interplanetary missions outside Earth orbit.

          That said, I am sure FPGAs solve many development problems vs time schedules on these one-off projects.

          All hail the power of FPGAs! (Somewhat ironically.)

          1. > FPGAs are power hounds – not good on
            > interplanetary missions outside

            Mars Pathfinder used FPGAs. They might use more power, but they’re infinitely more useful for rerouting around problems, like stray particles taking out critical gates.

      2. @Rjw Clearly you don’t know what a FPGA does or how it works. This is the difference between “General Purpose” programmers and engineers.

        Try some facts before getting snarky.

      3. I have a degree in CS :p I am very familiar with the various applications of different computational devices.

        FPGAs have historically been associated with space travel as they are more robust and reconfigurable on the fly.

        As mentioned in the literature “Field Programmable Gate Arrays (FPGA) have been used in many applications to achieve orders-of-magnitude improvement in absolute performance and energy efficiency relative to conventional microprocessors”

        http://www.cs.washington.edu/education/courses/cse591n/11wi/papers/Chung10_CoRAM.pdf

        So yes, I would expect the MSL to use FPGAs, and after a little more digging I have found that the MSL does in fact use FPGAs; they were apparently a significant research and development tool for the MSL team.

        http://www.nasa.gov/offices/oce/appel/ask/issues/42/42s_challenge_complexity.html

  8. “(remember that 7-minute delay)”
    The 7 minutes was the time of descent – signal delay is currently between 14 and 15 minutes.
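That correction is easy to verify with the one-way light-time formula, delay = distance / c. The distances below are rough values I have plugged in for illustration; the real figure varies continuously as both planets move:

```python
C_KM_PER_S = 299_792.458  # speed of light in km/s

def one_way_delay_minutes(distance_km):
    """One-way signal delay for a given Earth-Mars distance."""
    return distance_km / C_KM_PER_S / 60

# Roughly 2.6e8 km around the time of the landing: between 14 and 15 minutes
assert round(one_way_delay_minutes(2.6e8), 1) == 14.5
# Near closest approach (~5.46e7 km) the delay drops to about 3 minutes
assert round(one_way_delay_minutes(5.46e7), 1) == 3.0
```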

  9. Not much Apple in there at all, almost none, except that both used a version of the PowerPC originally developed by IBM, not Apple.
    Software: VxWorks, developed by Wind River Systems; not a smidgen of Apple here either.

  10. According to Wikipedia, the RAD750 is manufactured by BAE, incorporating the PowerPC 750 series CPU. These chips were originally made by IBM, now Motorola (as you can see in the photo). The only link with Apple is that they are also customers for the chip. Sorry Jerry!

    I’ve heard that the Pentium chips have an inherent instability that makes them unsuitable for space missions, and that this was why they never upgraded the Shuttle’s computers beyond something like a 386. Anyone who actually knows about computers care to refute or confirm this?

    1. No firsthand knowledge to offer, but I speculate that as circuit elements get smaller, they become more vulnerable to radiation-induced errors. But this wouldn’t be unique to Intel chips.

      1. That’s probably part of it. As it says in the quoted text, “It is manufactured using either 250 or 150 nm photolithography…”, whereas modern consumer-level processors are fabricated using a 22 nm process with plans for 14 nm in the next couple years. The smaller traces presumably increase vulnerability to interference and errors due to impurities in the silicon, etc, but I don’t really know for sure. 🙂

        1. It’s not electrical traces; if a cosmic ray passes through the transistor or the bulk silicon it can create a cascade of charge (electron–hole pairs) and screw up the computer’s operation. The bigger your transistors, the less sensitive they will be to the deposited charge, because more charge is required to switch the transistor on or off.
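A standard mitigation for exactly this kind of single-bit upset is triple modular redundancy: keep three copies of a value and take a bitwise majority vote. A minimal sketch (my own illustration, not a claim about Curiosity’s specific mechanism):

```python
def majority(a, b, c):
    """Bitwise 2-of-3 vote across three redundant copies of a word."""
    return (a & b) | (a & c) | (b & c)

word = 0b1011_0110
copy1, copy2, copy3 = word, word, word
copy2 ^= 1 << 4  # a cosmic ray flips one bit in one copy
assert majority(copy1, copy2, copy3) == word  # the vote restores the value
```

On rad-hard parts this kind of voting (along with ECC memory) is typically done in silicon; the software version just shows the idea.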

    2. They must be referring to the FDIV bug! 😉 Joking aside, Pentiums (and pretty much all consumer-level chips) aren’t radiation-hardened. I’m sure a consumer-level PowerPC chip would be just as bad. I know there was a project to develop a radiation-hardened Pentium for use in space back in 1998 or 1999, but I don’t know if it was ever finished…

      1. Actually any older chips with larger layer sizes would be more “radiation hardened”. I know that Apple IIes and Amigas flew on shuttles, though I don’t recall their tasks; they might have been bay-related. The Apple was a 1MHz 6502, and the Amiga was an 8MHz 68000 (clocked at 7.16MHz) with coprocessors running at the same speed. The latter was arguably the most hardened and full-featured general purpose computer you could get.

        I’m not sure why the Apple was used; the 1MHz 6502 would have been very resistant to radiation, but it was the worst 65xx MOS-based computer ever made.

    3. The 2 main challenges are the heat dissipation and the radiation. With such small transistors in the later CPUs, the occasional cosmic ray is more likely to cause a glitch in the system; a Motorola 6502 CPU is so much easier to put into space than, say, a modern Intel Pentium-M.

      As for upgrading computer systems it costs a hell of a lot of money, so as long as the current system is working people aren’t going to divert money into an upgrade.

      You can’t simply use consumer/industrial products; aside from the engineering and testing to ensure that the computer will work in its space environment, you have all the component/batch tracking, post-production testing and so on. It’s a laborious process which is reflected in the cost (BAE are not ripping anyone off for a computer similar in capacity to a mobile phone). Personally I suspect the RAD750 will either be the same price or cheaper; I can only imagine it being more expensive if some parts are obsolete and need to be manufactured just for that board.

      1. > a Motorola 6502 CPU is so much easier to
        > put into space than, say, a modern Intel
        > Pentium-M

        True. But Motorola is known for the 6800 and 68000 series processor. MOS was known for the 6502. Commodore acquired MOS.

        Incidentally, electrical engineer genius Chuck Peddle while working for Motorola, designed the 6800 and then he went on to design the 6502. Moto saw no value in low-cost 6502 processors for the masses and so Chuck Peddle left with his 6502 plans (you’d never see that nowadays) and formed MOS. MOS was acquired by Commodore.

        I can’t recall if the KIM-1 motherboard kit (designed by Peddle) was released before or after Commodore bought MOS. But the Commodore PET (designed by Peddle) came out in 1977, and I’m pretty sure Commodore had acquired MOS by then. Why Commodore/MOS didn’t cut off 6502 supplies to Apple and Atari is anyone’s guess. You’d think they would have raised prices at least. More evidence that Commodore couldn’t do anything right, other than engineering.

        Interesting historical note: when Steve Wozniak could not get his Apple prototyping to work (keep in mind that the Apple was little more than a 6502 with RAM, ROM, and a keyboard) Chuck Peddle came over to Steve’s garage and personally showed him the proper way to do things. Otherwise the Apple could have been many months or years behind, or never made it at all.

        Oddly, Wozniak fails to credit Chuck Peddle for his contributions to Apple.

        Some blame it on the airplane accident that damaged Wozniak’s brain and memory. My explanation isn’t that kind, however.

  11. For Curiosity’s sake, I do hope its embedded systems do work more reliably than my Airport Extreme.

    Wind River though, the firm behind VxWorks, the real-time OS that runs on the RAD750, is an interesting outfit. They claim on their website over a billion devices world-wide running their embedded software. I remember a cheerful feature from WIRED many years ago portraying their co-founder, Jerry Fiddler. (A number of very bright folks are named Jerry, apparently…) They dubbed him “Lord of the Toasters”, or something in that vein.

    Question for the computer guys and ladies around here: is there anyone who wouldn’t be scared out of their pants if they had to debug the VxWorks software on the Curiosity project?

    1. Nope, not scared at all. The reason for the lack of fear is due to confidence in the engineering processes and testing, qualification and release procedures that must be followed before upload*. The NASA and other aerospace companies’ processes and procedures for software and hardware engineering are very, very good/comprehensive/anal retentive/pedantic and just the way I like them**

      * There was that little problem of the imperial/metric unit mix-up that doomed the Mars Climate Orbiter. Fortunately that does not happen very often.

      ** There, nerd-quotient exceeded.

    2. I think I’d be nervous as hell during descent, but other than that I don’t think I’d be scared. Why? A device like this isn’t programmed like our desktop computers are. Not only is there a lot of very intentional analysis, design, and code reviews there are tools they use which verify that the code is robust and complete, involving a lot of math.

      If you’re interested in the architecture, check out these slides:

      http://compass.informatik.rwth-aachen.de/ws-slides/havelund.pdf

    3. Curiosity had a long lead time. The number of people and the number of man-months available were probably much more generous than any for-profit situation. In addition, IIRC, the small rovers already on Mars got some re-programming in situ, to overcome some hardware failure and terrain problems (shifting sands??)

    4. Not at all; in fact I’d say that anyone who’s afraid doesn’t understand the software and the hardware and has no business being on such a team. WindRiver have done an excellent job on VxWorks. There’s a hell of a lot of testing etc. which they do, so you generally don’t have to worry about the operating system. VxWorks runs a hell of a lot of safety-critical avionics systems.

      The rover’s custom software will also be carefully tested in so many ways (for example, feeding it simulated sensor data to see how it responds; that would be only one of a barrage of tests). In the event of a hardware or software fault, there is the second computer to take over while the first computer restarts itself. You don’t want anything going wrong on the descent, but with all the work that goes into these projects the chances of anything going wrong are very small.

      I’ve designed instruments to fly and have had numerous arguments with the lead scientists insisting on using cheap(er) components rather than the more reliable components I selected, but even in those cases they’ve been lucky so far and have had no problems (nor can I absolutely guarantee that a system built to my specification wouldn’t have a glitch, but it’s far less likely to).
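The “feeding it simulated sensor data” style of test mentioned above can be as simple as driving a decision routine with a synthetic descent profile and checking every output. A toy Python sketch; the function, its thresholds, and the profile values are all invented for illustration:

```python
def should_deploy_parachute(altitude_m, velocity_mps):
    """Toy descent rule with made-up thresholds: deploy below 11 km
    while the vehicle is still moving faster than 400 m/s."""
    return altitude_m < 11_000 and velocity_mps > 400

# Synthetic descent profile: (altitude in m, velocity in m/s)
profile = [(130_000, 5900), (20_000, 470), (10_500, 450), (8_000, 120)]
decisions = [should_deploy_parachute(a, v) for a, v in profile]
assert decisions == [False, False, True, False]  # deploys exactly once, in window
```

A real test campaign would sweep thousands of such profiles, including fault cases, rather than one hand-picked sequence.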

  12. To be a bit cheeky, posts like this are part of the reason Appleheads have the reputations they do among users of real computers. 😉 To be fair, the original article is the source of the claim, but I’m pretty sure they were making a joke. Which was repeated here in all seriousness. 🙂

    The only thing Apple-related is the PowerPC CPU design, which to a first approximation was designed by IBM, built by Motorola, and sold by Apple for a while in their Macs, but abandoned by them in 2006.

    If you want to credit the original CPU designers and builders, credit IBM and Motorola. Of course the overall integration is presumably the work of BAE.

    (Just my small contribution to the latest skirmish in the culture war against those crazy Apple users. Cue the rebuttal that even if they lack technical savvy, at least Apple users have lives, and style, unlike us nerds. Touché, touché, …)

    1. I freely admit that I know almost nothing about computers, and may have erred in the post. That said, I am not keen on the implication that I’m a moron!

      1. Not at all. 🙂

        I’m not sure how I could have better marked that first paragraph as a playful tease without adding an explicit disclaimer, but I’m sorry if I did a bad job of it!

    2. I don’t think you can give Jerry too hard a time about this- look at the actual article. It’s titled “Inside NASA’s Curiosity: It’s an Apple Airport Extreme… with wheels.” I take it they say that because they both have a PPC CPU and run VxWorks. It’s a stretch, but Extreme Tech is a popular tech news site, not a scientific journal.

    3. It also highlights the value of an advanced degree in physics or geography. One would be hard pressed to get a degree in either discipline without having also acquired extensive programming knowledge. Biology on the other hand, yields numerous specialized skills that are completely useless outside the lab. And while you could teach a high school student to run PCR’s (the frau does it every summer) – you won’t find the same skill level comparison for Java coding.

      1. Where does the advanced physicist get her food or her medicine? Does the advanced physicist with programming skills fare better at turning off the “research pane” in Microsoft Word? And while you could teach a high school student to collect rocks (my neighbor’s kids do it every summer) you won’t find the same skill level comparison for designing a yeast genomics experiment. Just saying.

  13. Apple-compatible computers are a good choice for space missions because, if necessary, they can transmit malware which will shut down an entire alien mothership.

    1. Haha, indeed, they can apparently interface with anything! 😀 And on rare occasions even nondestructively.

      1. Well, if those aliens were stupid enough to bring their entire fleet down into the atmosphere, instead of sitting up at L5 where nothing could reach them and raining down asteroids, it stands to reason their information security would suck too 🙂

    1. Processor speed is not a particularly indicative specification. If the OS is pared down, as this one likely is, and is not performing a number of concurrent tasks, then processor speed is neither a hindrance nor an aid. They probably keep the processor speed down for power and thermal reasons.

    2. Because…

      ** That set up has been used successfully on a dozen space vehicles

      ** It costs an absolute fortune & takes a long time to test the rad-hardness of a new computer

      ** To increase clock speed it’s necessary to miniaturise the hardware & miniaturisation makes the chips more sensitive to ‘noise’

      ** Faster chips run hotter & hotter chips are more prone to failure

      https://en.wikipedia.org/wiki/Radiation_hardening

    3. For a large number of reasons:

      – “Rad hardened” means well tested technology to military specs.

      You don’t do that testing with all circuit designs.

      – “Rad hardened” means well tested chips to military specs; electronics obey “bathtub curve” statistics, circuits are made with parallel methods so their “as is” characteristics follow a statistical distribution, and chemical systems obey an Arrhenius law on activation energy for defects, so weeding out the worst individuals means netting the durable ones.

      You meet the quality target more easily with conservative technology.

      – “Rad hardened” means conservative design, because a) the production process is stable and well characterized, b) larger dimensions means higher tolerances against defects (that activation energy again, and longer mean time before failure), c) higher voltage swings means less problems with noise including radiation induced.

      – “Rad hardened” may mean special material (silicon-on-insulator) and/or technology (GaAs JFETs) and/or encapsulation to withstand extreme radiation and/or temperature.

      Consumer circuits may be CMOS, but pad protection and parasitic elements are bipolar, and bipolars are inherently temperature-unstable as they can suffer thermal runaway. (Increasing temperature lowers EC resistance, which increases current, which increases temperature … oops.)

      All these things cost extra, not least in development.

    4. The speed is low due in part to the large transistor size. Also, engineers will select a speed that ensures they can accomplish what they need to – jacking up the speed beyond that simply wastes energy. Yes, some phones are far more capable computers. ARM will soon be releasing a new multicore CPU for phones (up to 4 cores I think). I have a controller that fits in a small matchbox and uses a mutant cousin of the Motorola 6502; this matchbox-sized computer, which can be powered by a wristwatch battery for almost a month, is far more capable than the once-popular Apple IIe, which ran on the original 6502.
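The “wristwatch battery for almost a month” claim above is easy to sanity-check with assumed figures; the ~225 mAh capacity (typical of a CR2032 coin cell) and ~0.3 mA average draw are my assumptions, not numbers from the comment:

```python
battery_mah = 225   # assumed coin-cell capacity, milliamp-hours
draw_ma = 0.3       # assumed average current for a low-power microcontroller

runtime_days = battery_mah / draw_ma / 24  # hours of runtime, converted to days
assert 31 < runtime_days < 32              # just over a month, as claimed
```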

      1. You can already buy phones with 4-core ARM CPUs by Samsung and Nvidia- the Samsung S III and HTC One X. FYI- ARM doesn’t make CPUs, they license out their own designs to companies like Nvidia, Samsung, TI, STM, Marvell, etc.

        Just out of curiosity, what controller are you talking about?

      2. Which 65xx mutant is in your matchbox?

        > mutant cousin of the Motorola 6502

        Motorola is known for the 6800 and 68000 series processor. MOS was known for the 6502. Commodore acquired MOS.


    5. I think the one fact about the Apollo missions that scared me more than any other, and forever impressed me how much nerve those guys had, was the computing power of the Apollo onboard computer.

      32K of memory. I still shake my head at that.

  14. Jerry,

    I have the feeling that if you are not a computer geek (not that there’s anything wrong with that 😊), when you post stuff like this:

    The CPU has 10.4 million transistors, nearly a magnitude more than the RAD6000 (which had 1.1 million). It is manufactured using either 250 or 150 nm photolithography and has a die area of 130 mm². It has a core clock of 110 to 200 MHz and can process at 266 MIPS or more. The CPU can include an extended L2 cache to improve performance.

    you run the risk of sounding much like Dr. Evil (we demand one million dollars!).

    1. Pfft. He’s just quoting the article. If you think the RAD750 is retro, you should have seen the systems on the Space Shuttle. Built in the 1970s, flown since the 1980s and they were using 1950s technology (and some late ’60s tech) – check out the Wikipedia entry on Magnetic Core Memory. Some museums (such as Chicago’s Museum of Science and Industry) may have some magnetic core memory on display – the MSI certainly did back in the mid 1980s.

      1. Yeah, I remember that computers in spacecraft often seem to be very retro. I’ve always assumed that the choice is purposely extremely conservative – better to have something that is known to be highly reliable than to have the latest and on the bleeding edge.

  15. Wow. I’m adding computers to my list of things that spark long comments threads.

    1. Criticism of religion
    2. Climate change
    3. Computers

    It is interesting that they put a 10-year-old bit of kit in it. I would have just assumed that they would go for the latest and greatest available at the time they built it.

    1. The design was probably pretty-much fixed ten years ago.
      When you cannot perform maintenance or repair, you want reliability over performance.
      (I’d guess that’s part of the reason for having redundant computers.)

  16. Calling this a Mac or an AirPort Extreme is typical of wild Macolyte hyperbole.

    Airport uses VxWorks, but so do many other, non-Mac, systems. The Mac doesn’t run VxWorks. It doesn’t even use the PowerPC any more (and despite what Apple tells you, the switch to x86 was about profit margins, not performance, although the poor availability of low-heat PPC chips was aggravating for Apple).

    The Curiosity also likely contains FPGAs, making it closer to the A-Eon Amiga X1000 in terms of hardware. The realtime OS also makes it more Amiga-like than Mac-like.

    BTW AirPort Extreme does not use the PowerPC either.

  17. In my work area, our products are subject to single-event upsets (SEUs) as well (though exposed to MUCH less radiation). They routinely cause failures (at a very low rate).
