Emily Lakdawalla • Aug 21, 2012

Curiosity sol 15 update: Wheel wiggles, arm flexes, and bad news about REMS

As I write, it's just past sunset for Curiosity on its 15th full day of Mars operations. There was a press briefing earlier today, summarizing the weekend's activities. The brief version: everything is going great, including the ChemCam shots and wheel steering tests, with one exception: apparent permanent damage to one of the two wind sensors in the REMS meteorological package.

Mike Watkins reported that most checkouts continue to go extremely well and on schedule. There have been ChemCam shots at rock and soil targets on sols 13, 14, and 15. Ashwin Vasavada said that ChemCam was working "better than hoped," a comment he didn't elaborate on much, except to say that the ChemCam laser travels a long path through the various parts of the instrument and mast, and there are ways the quality can be degraded along that path. He said that the first target, that little pyramidal rock, looked like a fairly typical Mars basalt in the ChemCam data.

Other successful checkouts include DAN in its active mode and some of SAM's internal mechanical systems. The wheel wiggle test also went well, as you can see in the animation below, which means that Curiosity is "go" for its first drive tomorrow, sol 16.

Curiosity's first wheel wiggle
On the 15th day after its landing on Mars, Curiosity exercised its wheels for the first time. First it rotated all wheels out (pointing radially away from the rover's center), then in (the direction they must point for a turn in place), and finally straightened the wheels, making the rover ready to drive the following sol. Image: NASA / JPL / animation by Emily Lakdawalla

Tomorrow's drive will be very simple: forward about 3 meters (just over one rover length), then a turn in place of 90 degrees to the right, then a 2-meter drive backwards; so in total the rover will wind up a bit more than 3 meters away and to the left of its current position, and facing south instead of east. The whole operation will take about half an hour, at 3 or 4 p.m. local solar time, which will be roughly 9ish PDT / 16ish UT tomorrow.
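Just to sanity-check that arithmetic, here's a quick back-of-the-envelope sketch in Python. The coordinate frame and the assumption that the rover currently faces due east are mine, for illustration only:

```python
import math

# Back-of-the-envelope check of the sol 16 drive geometry.
# Assumption (mine, not JPL's): rover starts at the origin facing east,
# with +x = east and +y = north.
x, y = 0.0, 0.0

# 1. Drive forward about 3 meters while facing east.
x += 3.0

# 2. Turn 90 degrees to the right in place: now facing south (position unchanged).

# 3. Back up 2 meters while facing south, i.e. move 2 meters north --
#    to the left of the original east-facing heading.
y += 2.0

print(f"Final offset: {x:.1f} m east, {y:.1f} m north of the start")
print(f"Straight-line distance: {math.hypot(x, y):.2f} m")  # about 3.6 m
```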

They also deployed the arm for the first time, and that went very well. The arm deployment test included "basic motion functionality" of all the mechanical devices in the sample handling system, including the turret-mounted devices and the inlet covers on SAM and CheMin. Louise Jandura (lead engineer on the sampling system) said there was still a lot of work to do to calibrate the motions of the arm, now that it's experiencing Martian rather than Earth gravity. Watkins said it felt great to see that arm and turret with Mars in the background.

Curiosity deploys the robotic arm for the first time, sol 14
Image: NASA / JPL / Damia Bouic

Igor Mitrofanov and Javier Gomez-Elvira were on the panel to report on the operations of the DAN active neutron scanner and REMS weather instruments. Mitrofanov showed that DAN was working great in active mode. Gomez-Elvira showed measurements of pressure and air and ground temperature over a few sols.

Then came the bad news for REMS. Ashwin Vasavada talked about how REMS has two sensor-studded booms that sprout from the "neck" of the rover's mast, visible in this photo. Both booms have sensitive wind speed sensors that include a sort of circuit board-like material exposed at the surface (the orange rectangles near the ends of the booms). One of those two wind speed sensors is not functioning properly and is now thought to be permanently damaged. It's the one on the side-mounted boom (the one facing left in the photo). Vasavada said we may never know exactly why it's not working, but they have a hypothesis.

The sensor was working fine in cruise, so the damage happened during landing. They think that the culprit may have been that unexpected flying gravel that now litters the rover deck. The side-mounted boom's wind speed sensor was facing skyward during the landing, so this certainly seems plausible. I assume that they will eventually take a photo of it with the MAHLI camera to see if it's visibly damaged.

It's inconvenient, for sure, but at least there is a second sensor. The reason there were two is that they helped triangulate the wind and improved the accuracy of wind speed measurements when one of the booms was aimed windward or leeward. So the quality of the wind speed data will suffer, but there will still be wind speed data.

There is a televised news briefing scheduled for tomorrow at 11:30 PDT / 18:30 UTC. I assume they plan to present the results of the first drive.

I thought some of you might be interested in a play-by-play of how I go about following Curiosity's mission in pictures and making animations like the wheel wiggle one above. It's not intended to be a detailed instruction guide, just a sort of road map that you can follow if you're keen to get started with raw image processing for yourself.

I get my raw images from one of the two enthusiast-developed raw image listing websites, Joe Knapp's curiositymsl.com or Ludo Stellingwerff's msl-raw-images.appspot.com. Today I used Ludo's.

First you have to notice that there are new images. You can do this by checking the site yourself, but I usually find out via Twitter or at unmannedspaceflight.com. There's always somebody who's paying closer attention than I am!

Today I saw that the new images showed wheel motions, so I had to make an animation, which meant downloading all of the frames. Ludo's site makes this easy: I told it to show me full-frame images from sol 15 from all the engineering cameras (the mast-mounted Navcams and body-mounted Hazcams, each of which has a right-eye and a left-eye camera). I used the little checkbox to select all the images. Clicking on "export" brought up a text listing of the URLs of 26 images. I used wget to grab them from JPL's server.
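If you don't have wget handy, a few lines of Python will do the same job. This is just a sketch of my workflow, and it assumes you've saved the exported URL list to a text file (which I'm calling urls.txt here), one URL per line:

```python
import os
import urllib.request

# Download every image URL listed in urls.txt into a folder named sol15/.
# The names urls.txt and sol15/ are placeholders for this example.
os.makedirs("sol15", exist_ok=True)

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    filename = os.path.join("sol15", url.split("/")[-1])
    print("Fetching", filename)
    urllib.request.urlretrieve(url, filename)
```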

I use Photoshop CS6 for image processing. It has a handy feature in the file menu: File > Scripts > Load Files into Stack. When you load the images this way, Photoshop puts each one on a separate layer, naming each layer with its image's filename.

The raw images from Curiosity tend to be quite dark, so the next thing I did was to add a Levels adjustment layer to brighten all the images. Adjustment layers are wonderful because you can make just one that will get applied to all of the layers, and because you aren't actually changing the pixel values in the original images, so you can fiddle with the levels repeatedly without data loss.
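If you'd rather script the equivalent of those two Photoshop steps, here's a rough sketch using the Pillow library instead. The file pattern and the black/white points are my own placeholders, which you'd tweak by eye just like the Levels sliders:

```python
import glob
from PIL import Image

# Scripted stand-in for "Load Files into Stack": one image per "layer",
# collected into a list. The sol15/*.JPG pattern is a placeholder.
paths = sorted(glob.glob("sol15/*.JPG"))
frames = [Image.open(p).convert("L") for p in paths]

# One Levels-style stretch applied identically to every frame.
# Pixels at or below `black` map to 0, at or above `white` map to 255.
black, white = 10, 120  # guesses; adjust until the frames look right

def levels(value):
    return int(max(0, min(255, (value - black) * 255 / (white - black))))

brightened = [frame.point(levels) for frame in frames]
```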

Now, how to organize the images into a coherent story? Animations are all about time, so the next step was to sort the images according to their time stamps. The nine-digit number in each image's filename is a clock counter, in seconds. I rearranged the images and it became clear that there was a set of "before" full-frame images from Hazcams and left Navcam, followed by some subframed Hazcams and full Navcams that showed the wheel rotate out, then wheel rotate in, then wheel straighten.
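If you're scripting instead, the same sort is easy once you pull that nine-digit counter out of each filename. The filenames below are made up, just to show the pattern:

```python
import re

# Made-up example filenames; the nine-digit run is the spacecraft clock
# counter, in seconds, so sorting on it puts the frames in time order.
filenames = [
    "NLA_398380484EDR_EXAMPLE.JPG",
    "FLA_398380270EDR_EXAMPLE.JPG",
    "RRA_398380712EDR_EXAMPLE.JPG",
]

def clock_count(name):
    """Return the nine-digit clock counter embedded in a raw-image filename."""
    match = re.search(r"\d{9}", name)
    return int(match.group()) if match else 0

for name in sorted(filenames, key=clock_count):
    print(clock_count(name), name)
```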

Because the camera mast is offset to the right side of the rover, its cameras can only see the right-side wheels. So the Navcams documented the right rear and right front wheels. The Hazcam images were needed to see the left-side wheels: the left front Hazcam to see the left front wheel and the right rear Hazcam to see the left rear wheel. That meant I could throw out the right front and left rear Hazcam images, as they weren't necessary to tell the story. There is one missing frame, the left Navcam image that shows the right front wheel on the rotate-in step. I plan to add that in and update the animation when it comes down.

The subframed Hazcam images were much lighter than the full-frame ones, so I set their output levels to peak at a lower value to make them blend in with the "before" photos.

Then I used the Timeline to make a frame animation, with one frame each for before, rotate out, rotate in, and straight conditions.
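Outside of Photoshop, Pillow can stitch those same four frames into an animated GIF in a couple of lines. The filenames and the 800-millisecond frame delay here are just my placeholders:

```python
from PIL import Image

# One prepared image per animation step (placeholder filenames).
steps = ["before.png", "rotate_out.png", "rotate_in.png", "straight.png"]
frames = [Image.open(name) for name in steps]

# Save a looping animated GIF, one frame per wheel position.
frames[0].save(
    "wheel_wiggle.gif",
    save_all=True,
    append_images=frames[1:],
    duration=800,  # milliseconds per frame
    loop=0,        # 0 means loop forever
)
```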

Piece of cake! Right?
