Hi-res image from new Mars rover

From NASA: "This is one of the first images taken by NASA's Curiosity rover, which landed on Mars the evening of Aug. 5 PDT (morning of Aug. 6 EDT). It was taken through a "fisheye" wide-angle lens on the left "eye" of a stereo pair of Hazard-Avoidance cameras on the left-rear side of the rover. The image is one-half of full resolution. The clear dust cover that protected the camera during landing has been sprung open. Part of the spring that released the dust cover can be seen at the bottom right, near the rover's wheel."

NASA's New Mars Rover Sends Higher-Resolution Image [Nasa JPL]



  1. Can this (or any of the other) rover take photos of itself? I know it wouldn’t be much use to science but photos of the whole thing on MARS would be awesome. Do they ever have detachable cameras?

    1. We saw many photos of the top of Spirit and Opportunity in the first days of their missions. But it’s not likely for a detachable camera to be part of such a mission, because there isn’t much additional science to be gained from a portrait of a landing craft.

    2. The design of the main cameras (which are not the ones that took the images we’ve seen so far) means that it should be possible for the rover to include itself in the photos. They’ll be doing the same thing they did with the earlier rovers, stitching together dozens (or more) of individual frames into larger panoramas, and the camera will point down and capture parts of the rover itself in that process.

  2. What will the mainstream media have to report about now – that the thing landed without blowing the freak up…  All I’ve seen, practically speaking, are sensationalist articles about how many ways this could blow the freak up (metaphorically speaking).  Now that it’s landed successfully, I imagine not many outlets will spare much praise for successful, well-thought-out engineering, or for the science that may now proceed.  Harumpf.

    Yours in get-off-my-lawn-yelling,

  3. The Mars Pathfinder’s rover of the late ’90s was visible from a camera on its lander. At the time, the NASA website had a timelapse feed that would let you watch the rover scoot around sniffing rocks.

    Most rovers have some visibility of themselves, which is useful when your wheel gets stuck or you need to see just how much dust is on your solar panels.

    We also got some shots via satellite of one of the last rovers, and one of them even tracked down its parachute (sorry, can’t recall which one).

    I wouldn’t be surprised if Curiosity tracks down the sky crane, but then again, it’s got a short lifespan.

    edit: intended as reply to Andy Howell

    1. That’s pretty cool. I was on the 3D model of the rover this morning and just thought it was missing a trick. Maybe it can go and find one of the other ones once it’s finished with all the science.

      1. I believe the Spirit rover is the closest to Gale crater, at a few hundred km away, but Curiosity’s range is in the tens of kilometers, so it looks like she’s on her own up there.

  4. Did nobody think colour photos would be interesting? Is this one of those decisions made by geeks without a clue: “We’ve got a rock sampler that can tell us to a millionth of a percent what’s in the rock, so who cares what colour it is? Nobody. Right, we can spend even more money by having a special mono CCD made up instead of that off-the-shelf colour one.”

    1. It can do color photography. That was just an image sent as fast as possible to say that the thing had landed. And I believe it still has the lens covers on.

      Give them a day or so – sheesh.

    2. The photo is the rover’s equivalent of a back-up cam. Don’t panic. We’ll get some eye candy.

    3. “Hey everyone, we just landed a one ton rover 350 million miles away with an autonomous landing system incorporating the largest supersonic parachute and a staggeringly innovative “sky crane” rocket landing system and everything looks like it worked perfectly the first time we put it all together!”

      “So, it’s not in color?  Meh.”

    4. In case you weren’t satisfied by the snark your question already received, there are multiple technical reasons why the first images we saw are black and white.

      Besides what the other repliers have already covered, you should know that the main cameras (which may not even have been fired up yet, I’m not sure) are also monochrome. To make a true-color image, three B&W photos are taken with red, green, and blue filters alternately swapped in front of the lens, and then the three frames are combined (I believe the camera’s onboard computer can actually do this itself, but in any case it’s not a super quick process).

      Resolution and fidelity are better with a monochrome sensor, and this approach also lets them put other sorts of filters in front of the lens to pick up other wavelengths.

      Early color photography (and, e.g., Technicolor cinematography) worked the same way, BTW; even modern color film has lower fidelity than old black-and-white film. Digital sensors are the same way, and while color sensors are more than good enough even for professional photographers, in a scientific context it makes sense to do it the old way for better fidelity.
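      The filter-wheel technique described above can be sketched in a few lines: take three monochrome frames shot through red, green, and blue filters, and stack them as the channels of one color image. This is a minimal illustration using numpy, not NASA’s actual processing pipeline; the arrays and function name are made up for the example.

      ```python
      import numpy as np

      def combine_filtered_exposures(red, green, blue):
          """Stack three same-sized 2-D monochrome frames into an HxWx3 RGB image."""
          if not (red.shape == green.shape == blue.shape):
              raise ValueError("all three exposures must share the same dimensions")
          return np.stack([red, green, blue], axis=-1)

      # Toy 2x2 frames: a pixel bright only in the red-filtered exposure
      # comes out red in the composite, and so on for green and blue.
      r = np.array([[255, 0], [0, 0]], dtype=np.uint8)
      g = np.array([[0, 255], [0, 0]], dtype=np.uint8)
      b = np.array([[0, 0], [255, 0]], dtype=np.uint8)

      rgb = combine_filtered_exposures(r, g, b)
      print(rgb.shape)  # (2, 2, 3)
      ```

      Note that this simple stacking assumes the three exposures are already registered (aligned); in practice, anything that moved between the three shots will show up as color fringing.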

Comments are closed.