Unmanned Spaceflight.com _ MSL _ MSL Images & Cameras

Posted by: CosmicRocker Aug 16 2012, 11:05 PM

I'm still trying to figure out a number of things about the new images we are trying to work with. Assuming others are likewise trying to learn, I thought I would open this thread to create a place for such discussions.

I'd like to start out with a comment about raw image contrast. There have been several postings in the main threads about whether or not the MSL raw images have been stretched like those from the MER missions. I am certainly no expert on this, but it looks to me as if the MSL images have not been stretched at all. I haven't tried to analyze all of the image types, but the hazcams and navcams have pixel brightness histograms that are very different from their MER counterparts.

This attached image compares MER and MSL navcams along with their luminosity histograms.



The MSL images clearly are not using the entire available range of brightness values, whereas the MER raws do. For this reason, the MSL raw images can usually be nicely enhanced simply by stretching the distribution of brightness across the full 256-value range.
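For anyone who wants to try it, here's a minimal numpy sketch of that kind of full-range linear stretch (the function name and the stand-in test frame are mine, not from any mission pipeline):

```python
import numpy as np

def stretch_full_range(img: np.ndarray) -> np.ndarray:
    """Linearly map the image's min..max brightness onto the full 0..255 range."""
    lo, hi = int(img.min()), int(img.max())
    if hi == lo:                      # flat image: nothing to stretch
        return np.zeros_like(img)
    scaled = (img.astype(np.float64) - lo) * (255.0 / (hi - lo))
    return np.round(scaled).astype(np.uint8)

# A stand-in low-contrast frame occupying only values 40..130,
# roughly like the narrow MSL navcam histograms discussed above:
frame = np.linspace(40, 130, 256 * 256).reshape(256, 256).astype(np.uint8)
out = stretch_full_range(frame)       # now spans the full 0..255 range
```

This is essentially what an image editor's levels tool does when you drag the endpoints in to the limits of the histogram.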

Posted by: fredk Aug 17 2012, 02:52 PM

I've noticed the same thing. It means that for some of these images we're effectively getting 7-bit images. But on the other hand, the MSL images don't seem to lose detail in the whites the way MER images do.

I don't know if the MSL images have had any nonlinear pixel value transformations done, such as a logarithmic lookup table. I am curious what the images would look like with pixel value linearly related to the true scene intensity, and with the zero point correct.

Posted by: CosmicRocker Aug 18 2012, 05:53 AM

I'm curious about something I am seeing in the mastcam color images (M-34) and hope someone here can explain it. I'm not sure if this is the correct term, but I am seeing what appears to be color banding in most of the color mastcams. It is more apparent in some than others, and if you split the color image into its red, green, and blue channels it becomes even more apparent.

In the attached example I have put the red channel greyscale alongside the color image. I have contrast stretched both images to make the banding effect more apparent.



My best guess is that this may be caused by the raw images having had their bit depth reduced at some point. Or, could this be caused by some kind of image compression?

Posted by: um3k Aug 18 2012, 06:00 AM

That's the JPEG compression having an aneurysm. smile.gif

Posted by: Cargo Cult Aug 18 2012, 10:45 PM

QUOTE (um3k @ Aug 18 2012, 07:00 AM) *
That's the JPEG compression having an aneurysm. smile.gif

Something I've been wondering about - are the Mastcam, MARDI and MAHLI JPEG images being recompressed on Earth, or is this JPEG data originally produced on Mars?

(I read somewhere on these fine forums that the Navcams use some fancy wavelet compression, but the colour cameras use good old JPEG. I'd love it if some of the raw images were literally that - identical data to that produced on another planet. Being repackaged and recompressed into a more web-friendly form is much more likely, alas...)

Pointlessly, I did check an image for EXIF tags, just in case - unsurprisingly there's nothing exciting.

CODE
Spiral:Desktop afoster$ jhead 0003ML0000124000E1_DXXX.jpg
File name : 0003ML0000124000E1_DXXX.jpg
File size : 132864 bytes
File date : 2012:08:18 15:35:06
Resolution : 1200 x 1200
Comment : NASA/JPL-Caltech/Malin Space Science Systems


Transmitting data like the manufacturer's name, camera serial number, focal length and whatever, over and over again, could be deemed an unnecessary use of interplanetary bandwidth? Shame. I imagine useful metadata takes a different route. wink.gif

Posted by: um3k Aug 18 2012, 10:50 PM

QUOTE (Cargo Cult @ Aug 18 2012, 06:45 PM) *
Something I've been wondering about - are the Mastcam, MARDI and MAHLI JPEG images being recompressed on Earth, or is this JPEG data originally produced on Mars?

At least some of the images are compressed in-rover, but I suspect these glaring artifacts are the result of recompression for the web, especially since the data is almost certainly transmitted as a grayscale image with an intact Bayer pattern. It'd be rather silly to debayer it on Mars and triple the amount of data that needs to be sent.

Posted by: ugordan Aug 18 2012, 11:04 PM

QUOTE (um3k @ Aug 19 2012, 12:50 AM) *
It'd be rather silly to debayer it on Mars and triple the amount of data that needs sent.

... except that's pretty much exactly what the cameras do onboard: de-Bayering is applied and the result is JPEG-compressed, as that's the lossy compression scheme of choice. Probably every single color image returned so far used this approach. It's not 3x the amount of data because it's compressed in a lossy way and the chrominance channels are subsampled. In fact, this approach of exploiting the visual similarity between what would otherwise be three similar b/w images encoded separately might by itself reduce the bits-per-pixel requirement for a given image quality, even with no chrominance subsampling.

Posted by: um3k Aug 18 2012, 11:16 PM

I suppose that sort of makes sense. It makes the photographer in me cringe, and explains the sub-par demosaicing, but I understand the logic behind it. As someone who frequently deals with video encoding, I'm well aware of the relative compression efficiency of redundant data.

Posted by: mcaplinger Aug 18 2012, 11:48 PM

QUOTE (um3k @ Aug 18 2012, 04:16 PM) *
I suppose that sort of makes sense. It makes the photographer in me cringe, and explains the sub-par demosaicing, but I understand the logic behind it.

Gee, thanks for the ringing endorsement. smile.gif ugordan has it all correct. A complete description of MMM compression can be found in the Space Science Reviews paper on MAHLI -- http://rd.springer.com/article/10.1007/s11214-012-9910-4 -- see section 7.5 "Image Compression".

We can return uninterpolated frames if we want to pay the downlink volume penalty.

I haven't verified this but I suspect that the web release images are going through a decompress/recompress cycle.

Posted by: um3k Aug 18 2012, 11:55 PM

I mean no offense, in fact I greatly admire you and the rest of the MSSS team. You've (collectively and individually) accomplished some amazing things. I just have a rather negative gut reaction to lossy compression. tongue.gif

EDIT: Also, about the demosaicing: I'm quite certain you know what you're doing. I'm referring to the anomalous horizontal lines in the MARDI images, which I assume (perhaps mistakenly) have to do with the demosaicing, the quality of which I (again) assume is limited by the limited processing power/time of the rover. I also fully realize that more complex, non-linear interpolation algorithms aren't very conducive to scientific analysis. Perhaps my excessive assumptions are correlated with the fact that the MSSS team has cameras in space, while I have a desk cluttered with scavenged optics and a spectrograph made of LEGO bricks and a mutilated DVD-R. tongue.gif

Posted by: Winston Aug 19 2012, 02:51 PM

The signal to noise ratio and information content of this forum is outstanding. Let me contribute a small bit by providing links to technical documents about some of the MSL cameras, documents I found in the process of researching the MSL's computer system and internal network (about which I found virtually nothing):

The Mars Science Laboratory Engineering Cameras
http://www-robotics.jpl.nasa.gov/publications/Mark_Maimone/fulltext.pdf

THE MARS SCIENCE LABORATORY (MSL) NAVIGATION CAMERAS (NAVCAMS)
http://www.lpi.usra.edu/meetings/lpsc2011/pdf/2738.pdf

THE MARS SCIENCE LABORATORY (MSL) HAZARD AVOIDANCE CAMERAS (HAZCAMS)
http://www.lpi.usra.edu/meetings/lpsc2012/pdf/2828.pdf

THE MARS SCIENCE LABORATORY (MSL) MARS DESCENT IMAGER (MARDI) FLIGHT INSTRUMENT
http://www.lpi.usra.edu/meetings/lpsc2009/pdf/1199.pdf

THE MARS SCIENCE LABORATORY (MSL) MARS HAND LENS IMAGER (MAHLI) FLIGHT INSTRUMENT
http://www.lpi.usra.edu/meetings/lpsc2009/pdf/1197.pdf

----------
Some interesting heat shield documents:

MEDLI System Design Review Project Overview
http://www.mrc.uidaho.edu/~atkinson/SeniorDesign/ThermEx/MEDLI/MEDLI_SDR_Project_Overview.pdf

Advances in Thermal Protection System Instrumentation for Atmospheric Entry Missions
http://www.mrc.uidaho.edu/~atkinson/ECE591/Sp2008/Presentations/Fu.ppt

A relatively short but very interesting document about the engineering challenges of landing on Mars which discusses the advantages and disadvantages of the various possible methods:
http://www.engineeringchallenges.org/cms/7126/7622.aspx

As I said, I found virtually nothing about the Rover Compute Element (RCE) and the network(s) used within the lander to control MSL hardware (anyone know a good source, the more technical the better?), but I did find a tiny bit within this document, starting on p. 41, where the electronics architecture is discussed. The bus used is redundant 2 Mbps RS-422. SAM uses BASIC keywords for its command language!:

The Sample Analysis at Mars Investigation and Instrument Suite
http://www.springerlink.com/content/p26510688kg4q808/fulltext.pdf

Excerpt:
The (SAM) CDH (Command and Data Handling) module (Fig.16) includes a number of functions. The CPU is the Coldfire CF5208 running at 20 MHz. The CDH (module) communicates with the Rover via redundant 2 Mbps high speed RS-422 serial bus along with a discrete interface (NMI).

In the SAM software description starting on p47, I found this interesting tidbit:
SAM’s command system is a radical departure from prior spaceflight instrumentation. SAM is a BASIC interpreter. Its native command language encompasses the complete set of BASIC language constructs—FOR-NEXT, DO-WHILE, IF-ENDIF, GOTO and GOSUB—as well as a large set of unique built-in commands to perform all the specific functions necessary to configure and operate the instrument in all its possible modes.

SAM’s commands, which are BASIC commands with SAM-specific commands built in, are transmitted in readable ASCII text. This eliminates the need for a binary translation layer within the instrument command flow, and makes it possible for operators to directly verify the commands being transmitted. There are more than 200 commands built in to the SAM BASIC script language.


----------
And even though this is just a NASA Press Kit, it is satisfyingly detailed technically on various MSL systems:
Mars Science Laboratory Landing
http://mars.jpl.nasa.gov/msl/news/pdfs/MSLLanding.pdf

And just for the heck of it, here's NASA's Viking Press Kit. How very far we have come, even with press kits:
http://mars.jpl.nasa.gov/msl/newsroom/presskits/viking.pdf

Posted by: nprev Aug 19 2012, 04:29 PM

Welcome, Winston, and thanks for a terrific first post! smile.gif We may archive some of those links in the MSL FAQ thread; much appreciated!

Posted by: mcaplinger Aug 24 2012, 08:02 PM

QUOTE (Holger Isenberg @ Aug 24 2012, 11:48 AM) *
Now the question is: which is the actual transmission curve of the IR cutoff filter? The one (black dotted line) in 2541.pdf or the one (thick dark green line) on the JPL Mastcam web page?

The plot in 2541.pdf is not a transmission curve, it's a response curve and thus modulated by the detector QE. The JPL web page looks pretty close.

Posted by: Holger Isenberg Aug 24 2012, 08:30 PM

QUOTE (mcaplinger @ Aug 24 2012, 10:02 PM) *
The plot in 2541.pdf is not a transmission curve, it's a response curve and thus modulated by the detector QE. The JPL web page looks pretty close.


Thanks for pointing that out! So the overall effect is like adding a strong UV filter to a normal digital camera, like an LP 420, which actually helps to reduce visual haze. But what then explains the decrease of the blue response maximum at 470 nm to almost 30%, with the green response as the 100% reference, on the black dotted line in 2541.pdf?

Posted by: mcaplinger Aug 24 2012, 09:25 PM

QUOTE (Holger Isenberg @ Aug 24 2012, 01:30 PM) *
But what then explains the decrease of the blue response maximum at 470 nm to almost 30%, with the green response as the 100% reference, on the black dotted line in 2541.pdf?

I'm not sure what the dashed line is really supposed to mean. The IR-cutoff response is shown in blue, green, and red, but all the filters have been normalized to their peaks, so you can't, for example, compare the R2/L2 narrowband response to the IR-cutoff blue response, or the IR-cutoff blue to the red. This plot was just intended to give an outline of where the bandpasses are.

Posted by: ugordan Aug 25 2012, 11:29 AM

This is just for fun: I tried to implement an adaptive correction for the JPEGged (ugh!) raw Bayer images, to get rid of artifacts in image areas that are smooth in appearance. The artifacts come from the JPEG algorithm trashing the Bayer pattern, introducing this kind of pattern:



After correcting for that, the smooth areas became more smooth as illustrated by this comparison, although obviously this approach can't ever come close to an image already returned from the spacecraft in de-Bayered, color form:



I had to make the algorithm adaptive in picking which DCT blocks it will apply this to, because if I apply that correction invariably across the image, some uniform-color areas which originally already looked good had these artifacts introduced afterwards...
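ugordan didn't post his code, but the general idea might be sketched as follows: visit each 8×8 tile (the JPEG DCT block size) and, wherever all four Bayer phases are nearly uniform, flatten each phase to its tile mean. The layout, threshold, and block size here are my illustrative guesses, not his actual parameters.

```python
import numpy as np

def deblock_smooth(bayer: np.ndarray, block: int = 8, thresh: float = 2.0) -> np.ndarray:
    """In each block x block tile, if every Bayer phase is nearly uniform,
    flatten each phase to its tile mean (removes JPEG blotches in flat areas)."""
    out = bayer.astype(np.float64)    # astype makes a working copy
    h, w = bayer.shape
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            tile = out[y:y + block, x:x + block]
            phases = [tile[dy::2, dx::2] for dy in (0, 1) for dx in (0, 1)]
            if all(p.std() <= thresh for p in phases):   # visually flat tile
                for p in phases:
                    p[...] = p.mean()
            # busy tiles are left untouched - the "adaptive" part
    return out

# Demo: a perfectly flat Bayer mosaic should pass through unchanged
flat = np.zeros((16, 16))
flat[0::2, 0::2] = 100; flat[0::2, 1::2] = 110
flat[1::2, 0::2] = 120; flat[1::2, 1::2] = 130
cleaned = deblock_smooth(flat)
```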

Posted by: Airbag Aug 25 2012, 06:32 PM

Here is a poor man's way of de-Bayering Mastcam images using the Gimp (or similar tools), for those wanting to experiment a little - obviously not intended for the experts here! I realize it is not as sophisticated as proper implementations, but in the absence of a Gimp plugin this approach has the advantage of simplicity, at the expense of ending up with a half-resolution image. It does not, for instance, use just the green pixels for luminosity, and it performs the chroma filtering by the simple expedient of scaling the image down by a factor of 2 (thus merging each set of red, blue, and two green pixels together).

  1. Open the Mastcam Bayer pattern (1536x1152) image in the Gimp
  2. Change the mode to "RGB"
  3. Drag the attached Bayer pattern color map onto it (forming a new layer)
  4. Change the mode of that new layer to "multiply"
  5. Flatten image
  6. Scale image to 50% of original size
  7. Colors->Auto->Normalize (or use Levels or Curves tool) to brighten the image.
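Assuming an RGGB cell layout (red at top-left - my assumption; the actual Mastcam phase positions may differ), the recipe above amounts to averaging each 2×2 cell, which is a few lines of numpy:

```python
import numpy as np

def halfres_debayer(bayer: np.ndarray) -> np.ndarray:
    """Collapse each 2x2 RGGB cell into one RGB pixel (half-resolution
    result, just like the scale-to-50% step in the Gimp recipe)."""
    r = bayer[0::2, 0::2].astype(np.float64)
    g = (bayer[0::2, 1::2].astype(np.float64) + bayer[1::2, 0::2]) / 2.0
    b = bayer[1::2, 1::2].astype(np.float64)
    return np.dstack([r, g, b])

# A 4x4 test mosaic produces a 2x2 color image
pattern = np.array([[10, 20, 10, 20],
                    [30, 40, 30, 40],
                    [10, 20, 10, 20],
                    [30, 40, 30, 40]], dtype=np.uint8)
rgb = halfres_debayer(pattern)
```

The multiply-by-color-map-then-downscale steps do the same merging, only inside the image editor.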

Airbag

Bayer pattern color map image:


Sample result:

Posted by: fredk Aug 25 2012, 08:54 PM

Here's the FFT of the JPEGed Bayer patterns; the upper two are a patch of smooth sky, and the lower two a patch of ground:


In the ground image, you only see 2-pixel-scale (rough) periodicity, corresponding to the Bayer pattern, which shows up as broad peaks at the edges of the FFT. In the sky image, you also see peaks at the 2-pixel scale at the edges of the FFT, but they are sharp now since the sky is smooth. And you can also see FFT peaks halfway and a quarter of the way to the edges, corresponding to 4- and 8-pixel periodicity. But of course there should be no 4- or 8-pixel periodicity in a smooth Bayer image! So clearly those peaks are the result of JPEGging.

So I tried to filter out those peaks in the power spectrum. Here's the result on the same image as ugordan used:

Very similar result! The Fourier space filtering beautifully gets rid of the blotchy pattern on large smooth areas, but breaks down at the edges of those areas since the periodicity breaks down there. But there was no need to make the algorithm adaptive here - it works in one simple step.

Here's the horizon shot:

Again, a great job on the sky, but very little improvement on the not-so-periodic blotches over the mound.
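fredk didn't post his script, but this kind of Fourier-space notch filtering can be sketched like so (the notch positions and widths are my own illustrative choices):

```python
import numpy as np

def notch_filter(img: np.ndarray, periods=(4, 8), halfwidth: int = 1) -> np.ndarray:
    """Zero small neighbourhoods of the 2-D FFT at the frequencies of
    p-pixel periodicities (and their conjugate mirrors), leaving the rest
    of the spectrum - including the 2-pixel Bayer peaks at N/2 - alone."""
    h, w = img.shape
    F = np.fft.fft2(img.astype(np.float64))

    def freqs(n, p):
        return [n // p, n - n // p]   # fundamental bin and its mirror

    ty = [f for p in periods for f in freqs(h, p)]
    tx = [f for p in periods for f in freqs(w, p)]
    for u in [0] + ty:
        for v in [0] + tx:
            if u == 0 and v == 0:
                continue              # keep the DC (mean brightness) bin
            F[max(u - halfwidth, 0):u + halfwidth + 1,
              max(v - halfwidth, 0):v + halfwidth + 1] = 0
    return np.real(np.fft.ifft2(F))

# Demo: a smooth frame contaminated by a pure 4-pixel ripple comes out flat
x = np.arange(64)
noisy = 100 + 10 * np.tile(np.sin(2 * np.pi * x / 4), (64, 1))
clean = notch_filter(noisy)
```

As fredk notes, this works beautifully on large smooth areas but can't help where the periodicity breaks down, such as along edges.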

Posted by: ngunn Aug 25 2012, 09:35 PM

I'm totally fascinated by the various approaches to de-Bayering being tried out and discussed here, and that's from somebody not normally interested in the details of image processing techniques. I've learned a lot from the many posts, especially ugordan's and now fredk's. I love Airbag's slicing of the gordian knot for a quick short cut. Who says airbags are no use on this mission?

Posted by: Art Martin Aug 26 2012, 04:25 PM

I hope this is the right section for this. I'm actually replying to a post in another thread but, since it's about image processing, thought it better to put it here.

Amazing comparing the HiRISE anaglyph to what we actually see on the ground. One thing that seems obvious to me, though, is that there is a real exaggeration of relief in the HiRISE 3D effect (mesas are taller, canyons are deeper), most likely created because the left and right images are taken a great deal farther apart than the human eyes. If I understand correctly, they are simply images taken at different points in the orbit, not by two cameras side by side as on the rovers. While that relief is stunning and produces this amazing view of the surface from above, it is not really a true representation of what a human observer would see from orbit.

In these days of phenomenal image and video processing software, where a program can build intermediate frames of a video by analyzing the pixels of each surrounding frame, I wonder if someone hasn't devised a way of correcting the relief of a 3D anaglyph if one knows the actual separation of the two images. I can certainly picture the code process in my mind, and it doesn't seem complicated if one works with image comparison coding. I'm a computer programmer, but it's been years since I did anything where I was manipulating pixels, and my relearning curve would be extensive or I'd tackle something myself.

Sure seems that anaglyphs have been around long enough that someone would have figured this out by now. Any thoughts?

QUOTE (walfy @ Aug 26 2012, 02:21 AM) *
I borrowed Fred's excellent rendition to compare with HiRISE anaglyph of the prime science region around the inverted riverbed. It's a very narrow angle of view. If I marked some features wrong, please let me know.

[attachment=27716:msl_science_target.jpg]


Posted by: Phil Stooke Aug 26 2012, 04:35 PM

I don't do anaglyphs so I can't get technical here, but basically the 3-D map created by a stereo pair can be displayed with any vertical exaggeration you like. Typically they are made with some exaggeration because most scenes are rather bland without it. For a given stereo pair there may be some default value that is normally used but it could be changed if desired.

Phil


Posted by: ngunn Aug 26 2012, 05:09 PM

QUOTE (Art Martin @ Aug 26 2012, 05:25 PM) *
I wonder if someone hasn't devised a way of correcting the relief of a 3D anaglyph


We've discussed this here a while ago, but rather than try to dig back for that, here are a couple of salient points. Anaglyphs don't have a constant intrinsic exaggeration factor. The apparent relief you see depends on the size of the image and the distance you view it from. Adjust those and you could in theory view any anaglyph without line-of-sight exaggeration. It's true that in many cases you'd have to enlarge the image enormously and sit very close to it!

One solution I've suggested is the inclusion of a small virtual cube in the corner of each anaglyph to serve as a three dimensional scale bar, so if the cube looks too tall you know you're seeing the scene exaggerated by the same amount.

Posted by: fredk Aug 26 2012, 05:17 PM

I think what Art's suggesting is adjusting the apparent relief while keeping the viewing size/distance constant. I could imagine doing that, for example by morphing one frame part way towards the other to reduce relief. But that would be hard and would involve some degree of faking for the intermediate viewpoints.

For now, ngunn's approach can at least help reduce exaggerated relief.

Posted by: john_s Aug 26 2012, 05:55 PM

QUOTE (ngunn @ Aug 26 2012, 11:09 AM) *
It's true that in many cases you'd have to enlarge the image enormously and sit very close to it!!


Actually, does enlarging the image have any effect on the apparent vertical exaggeration? I wouldn't expect so, because there should be no vertical exaggeration when the convergence angle of your eyes matches the convergence angle of the original image pair [convergence angle = angle between the two lines of sight in the stereo pair, measured at the surface location being viewed]. The convergence angle of your eyes depends on their distance from the image, but doesn't depend on the image magnification.

John

Posted by: Pando Aug 26 2012, 06:59 PM

Here's my attempt at creating a 3d anaglyph image of the distant hills...



 

Posted by: ngunn Aug 26 2012, 07:49 PM

QUOTE (john_s @ Aug 26 2012, 06:55 PM) *
Actually, does enlarging the image have any effect on the apparent vertical exaggeration? I wouldn't expect so


You are correct of course. It's the viewing distance alone that does it. I was confusing anaglyphs with cross-eyed pairs where the size does have an effect because it changes the angles too.

QUOTE (Pando @ Aug 26 2012, 07:59 PM) *
Here's my attempt at creating a 3d anaglyph image of the distant hills.


Excellent! smile.gif

Posted by: Art Martin Aug 26 2012, 08:09 PM

Yes, that's exactly what I was wondering about. It would very much involve faking one of the images, based on an analysis of an anaglyph created with the wider separation of views, and then rebuilding the anaglyph with one original and the "faked" image. I guess derived image would be a more PC term, much like when smoothing a video shot at a low FPS and having the computer generate the intermediate images for a standard video frame rate, based on a best guess of how motion and scaling would occur in each frame.

When I've created anaglyphs in the past, the two original images are lined up vertically first, and then the images are aligned horizontally for the most comfortable view. This results in a blue and a red tinted image combining both of the originals. When you view it without the glasses you can see distinct blue and red tinted objects, with close-up ones having more horizontal distance between those objects and the far-away ones having very little distance, or they're essentially right on top of one another. When the left and right images are taken, let's say, hundreds of miles apart, those distances get very exaggerated when viewed by human eyes.

What the program would do is figure out how much each pixel in, say, the right image is shifted to the side from its corresponding pixel in the left image, and bring it back closer in the proportion between a standard eye viewing angle and the actual image angles. I'd think that would be fairly easy to do on a long-distance aerial shot but very tough on something close up, because objects could block other ones from left to right.

So I guess this is a challenge to the programmers who have written the wonderful anaglyph software out there, which pretty much assumes the original shots are taken at standard eye distance. They're already really doing the processing when they build the final red/blue image. You'd just need one more parameter in there: the distance between the original images. Instead of simply combining the shots together, the blue portion would be derived and then combined.

One advantage of having this feature would be that you could also intentionally exaggerate the relief by creating the derived image at a much wider distance than it was originally shot to more readily spot depressions and things jutting up.

QUOTE (fredk @ Aug 26 2012, 10:17 AM) *
I think what Art's suggesting is adjusting the apparent relief while keeping viewing size/distance constant. I could imagine doing that, for example by morphing one frame part ways towards the other to reduce relief. But that would be hard and would involve some degree of faking for the intermediate viewpoints.

For now, ngunn's approach can at least help reduce exaggerated relief.


Posted by: ElkGroveDan Aug 26 2012, 08:43 PM

QUOTE (ugordan @ Aug 25 2012, 04:29 AM) *
This is just for fun, I tried to implement an adaptive correction for the JPEGged (ugh!), raw Bayered images

Very ingenious thinking, Gordan -- and it seems to have worked well.

Posted by: Roby72 Aug 27 2012, 08:22 PM

A few remarks about the near focus of the mastcams:

So far, the pictures of the sundial taken by the Mastcam-100 are not in focus.
I suspect that the near focus of this camera is a little bit farther out - for example, the cable running to the left of the dial is sharp:

http://mars.jpl.nasa.gov/msl-raw-images/msss/00013/mcam/0013MR0002011000E1_DXXX.jpg

In contrast, the M-34 has taken sundial pictures that are in sharp focus:

http://mars.jpl.nasa.gov/msl-raw-images/msss/00013/mcam/0013ML0002010000E1_DXXX.jpg

Does anyone know the distance from the mastcams to the dial?

Robert

Posted by: mcaplinger Aug 27 2012, 08:55 PM

QUOTE (Roby72 @ Aug 27 2012, 01:22 PM) *
Does anyone know the distance from the mastcams to the dial?

The Marsdial is roughly 7.6 cm square and one side is 296 pixels long in the 34mm image. The IFOV of the 34mm is about 218 microrads, so the distance is roughly 1.2 meters.
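The small-angle arithmetic checks out:

```python
# distance = size / (pixels * IFOV), with the figures quoted above
size_m = 0.076        # Marsdial edge, ~7.6 cm
pixels = 296          # edge length in the M-34 image, in pixels
ifov_rad = 218e-6     # M-34 IFOV, ~218 microradians per pixel
distance_m = size_m / (pixels * ifov_rad)   # ~1.18, i.e. roughly 1.2 meters
```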

Posted by: PDP8E Aug 28 2012, 12:12 AM

Back from a little vacation, and catching up on all the threads.
After several attempts to circumvent the JPEG induced color/pattern bleeds in the bayers, I decided to just make a good B&W image. It has several hacks: rank order adaptive noise reduction, reduced resolution to 50%, and multiple stacks.
(and it's not that good) huh.gif



... maybe MSSS will take pity on the collective suffering and hacks found here on these threads and authorize a posting of a few PNG raw bayers (just to see if all of the UMSF debayer programs are working properly of course!)


Posted by: mcaplinger Aug 28 2012, 12:49 AM

QUOTE (PDP8E @ Aug 27 2012, 05:12 PM) *
... maybe MSSS will take pity on the collective suffering and hacks found here on these threads and authorize a posting of a few PNG raw bayers...

What would be more in keeping with the spirit of the data release policy (IMHO) would be for us to demosaic the data and then JPEG that, but I can't do either one without permission. If raw data acquisition becomes common I'll ask, but I don't think it's worth it for a relative handful of images.

Posted by: ElkGroveDan Aug 28 2012, 04:06 AM

Sounds like a great task for a trusted college intern.

Posted by: mcaplinger Aug 28 2012, 04:44 AM

QUOTE (ElkGroveDan @ Aug 27 2012, 09:06 PM) *
Sounds like a great task for a trusted college intern.

I have no idea what your point is. It would take me about a minute to process all these images and post them. It's the permission to do so that's lacking.

Posted by: JohnVV Aug 28 2012, 07:30 AM

the jpg issue is mostly solved
http://imgbox.com/g/9pzCUh6YaZ
------------
http://imgbox.com/abeAl3KG http://imgbox.com/acpJ0mWK http://imgbox.com/abtrvaZU http://imgbox.com/acoiHAx9
-------------
not "white balanced"
the code used

CODE
gmic  0017MR0050002000C0_DXXX.jpg -bayer2rgb 3,3,3 -pde_flow 5,20 -sharpen 2 -o 0017MR0050002000C0.png


a PDE flow to remove most of the artifacts

Posted by: ugordan Aug 28 2012, 07:57 AM

QUOTE (mcaplinger @ Aug 28 2012, 06:44 AM) *
It's the permission to do so that's lacking.

I can understand that posting raws as PNGs would be iffy because we'd be getting the same quality product as you, but why would de-Bayering on the ground and JPEG-ing present an issue?

Posted by: ugordan Aug 28 2012, 07:59 AM

QUOTE (JohnVV @ Aug 28 2012, 09:30 AM) *
the jpg issue is mostly solved

John, that looks very smooth, but there's an obvious loss of sharpness in the resulting images.

Posted by: RegiStax Aug 28 2012, 09:40 AM

QUOTE (ugordan @ Aug 28 2012, 09:59 AM) *
John, that looks very smooth, but there's an obvious loss of sharpness in the resulting images.

I've attached a wavelet-sharpened version of one of the PNGs; it seems it's possible to resharpen them without too many of the debayer artefacts showing up.

cheers
CorB

Posted by: jmknapp Aug 28 2012, 10:08 AM

I wonder what the effective bits per pixel of the MASTCAM raw images is. The cameras use the http://www.stargazing.net/david/QSI/KAI-2020LongSpec.pdf sensor. Is it sampled with a 12-bit ADC like MARDI and NAVCAM? One reference for NAVCAM gives an SNR of >200 for certain conditions. So that would mean effectively 7-8 bits per pixel for that camera anyway?

It was interesting that yesterday Mike Malin referred to stacking images.

Posted by: mcaplinger Aug 28 2012, 01:48 PM

QUOTE (jmknapp @ Aug 28 2012, 03:08 AM) *
I wonder what the effective bits per pixel of the MASTCAM raw images is.

The MAHLI paper I've referred to several times before is an excellent source of this kind of information. From section 7.5.1:

"Acquired as 12-bit images, MAHLI data are converted onboard the instrument, without loss
of information, to 8-bit images using a square-root companding look-up table."
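The quoted passage describes the idea rather than the exact table, but a plausible square-root companding LUT is only a couple of lines (the 255/sqrt(4095) scaling here is my assumption, not necessarily the flight table):

```python
import numpy as np

# Map 12-bit DNs (0..4095) onto 8-bit codes via a square-root curve.
dn12 = np.arange(4096)
lut = np.round(np.sqrt(dn12) * 255.0 / np.sqrt(4095.0)).astype(np.uint8)

# The step size grows with signal: adjacent 12-bit codes stay distinct
# near zero, while roughly 32 codes share one output value near full
# scale - mirroring the way shot noise grows as sqrt(signal).
```

Decoding on the ground squares the value again, so the companding is invertible up to that signal-dependent rounding.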

Posted by: mcaplinger Aug 28 2012, 02:03 PM

QUOTE (ugordan @ Aug 28 2012, 12:57 AM) *
why would de-Bayering on the ground and JPEG-ing present an issue?

I am not authorized to put any data out, in any form. You might want to take a look at http://blogs.agu.org/martianchronicles/2012/07/28/blogging-msl/ -- Ryan discusses in general terms the rules that we operate under. I am constantly evaluating each of my posts to make sure that it is only derived from public information.

If you don't like it, complain to my boss.

Posted by: Airbag Aug 28 2012, 02:39 PM

QUOTE (mcaplinger @ Aug 28 2012, 08:48 AM) *
"Acquired as 12-bit images, MAHLI data are converted onboard the instrument, without loss
of information, to 8-bit images using a square-root companding look-up table."


I read that a while back and was puzzled by how one could do any such conversion without a loss of information unless the original data already fit in 8 bits? Any non-trivial processing operation results in a loss of information somewhere (usually with the purpose of increasing apparent information somewhere else).

I understand companding and all that, but that is not a two-way operation without loss of information. The resulting 8 bit data cannot be converted back into the original 12 bit data, surely, as multiple values in the 12 bit sample get mapped to the same 8 bit value?

Just trying to understand what exactly was meant in the MAHLI document.

Airbag

Posted by: Floyd Aug 28 2012, 03:05 PM

mcaplinger: We all really appreciate the fantastic images released, and I think the misunderstanding was that you are bound by the image release policy - not being the one doing the releasing. After the appropriate time, the eager ones here will have access to the raw data.

Airbag: I'm not an expert, but I think it means without loss of spatial resolution. Clearly there is a minor loss of brightness precision per pixel going from 12 to 8 bits (and back again). However, visually, an 8-bit square-root encoding gives the full dynamic range and preserves the low-light/dark range preferentially over the saturated white end.

Posted by: ugordan Aug 28 2012, 03:09 PM

Airbag, also look up photon shot noise to see why even 12 to 8 bit conversion is "sorta" lossless.

Posted by: Airbag Aug 28 2012, 04:17 PM

Ugordan,

But photon shot noise is less of an issue with larger numbers of photons (i.e. higher CCD signal levels), and that is exactly where a square-root compander "does its thing" the most, so I don't see how photon shot noise would be relevant to my original question?

I could understand it if the CCD read noise was a dominant factor, and thus the 12 bits of data are really only "effectively" worth around 8 bits, but that would imply a huge noise contribution!

Airbag

Posted by: mcaplinger Aug 28 2012, 04:22 PM

QUOTE (Airbag @ Aug 28 2012, 09:17 AM) *
But photon shot noise is less of an issue with larger number of photons...

shot noise = sqrt(signal), so shot noise is higher at higher signal levels. The square root encoding is coarser at higher levels, finer at lower levels.

You could have a philosophical debate about what this means, but that's the way our cameras work.
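A rough numeric check of that, with assumed numbers (12-bit ADC, ~30,000 e- full well, idealized scaled square-root table; none of these are flight values):

```python
import numpy as np

# Assumed numbers, not flight values: 12-bit ADC, ~30,000 e- full well,
# idealized scaled square-root companding table.
full_well = 30000.0
gain = full_well / 4095.0                 # electrons per 12-bit DN

def encode(dn):                           # 12-bit DN -> 8-bit code
    return np.round(255.0 * np.sqrt(dn / 4095.0))

rows = []
for dn in (100, 500, 1000, 2000, 4000):
    code = encode(dn)
    step = (4095.0 / 255.0 ** 2) * (2 * code + 1)  # 12-bit span of one code
    quant = step / np.sqrt(12.0)                   # rms quantization noise, DN
    shot = np.sqrt(dn / gain)                      # shot noise, DN
    rows.append((dn, quant, shot))
    print(dn, round(float(quant), 1), round(float(shot), 1))

# quantization noise stays below shot noise at every level checked
print(all(q < s for _, q, s in rows))
```

So with these assumptions the coarser steps at the bright end are still buried under the shot noise, which is the usual argument for calling the encoding "sorta" lossless.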

Posted by: Airbag Aug 28 2012, 06:30 PM

I was looking at shot noise from a S/N point of view but I get the hint and will skip any debates - thanks for the clarifications everybody!

Airbag

Posted by: jmknapp Aug 28 2012, 07:47 PM

QUOTE (mcaplinger @ Aug 28 2012, 08:48 AM) *
The MAHLI paper I've referred to several times before is an excellent source of this kind of information.


http://rd.springer.com/article/10.1007/s11214-012-9910-4/fulltext.html

From http://www.lpi.usra.edu/meetings/lpsc2009/pdf/1197.pdf paper....

QUOTE
MAHLI shares common electronics, detector, and
software designs with the MSL Mars Descent Imager
(MARDI) and the two Mast Cameras (Mastcam).


Ahh, good to know. Wonder what the (optimum) SNR is.

Posted by: mcaplinger Aug 28 2012, 08:25 PM

QUOTE (jmknapp @ Aug 28 2012, 12:47 PM) *
Wonder what the (optimum) SNR is.

I often tell people who are fond of such numeric metrics as SNR, MTF, ENOB, etc, that no matter how optimal those numbers make your camera sound, anybody can tell a good image from a bad image. I think the MMM images hold up pretty well.

That said, you could work out the best possible SNR we could get from the Truesense datasheet. You can't ever do better than sqrt(fullwell) for a single measurement due to shot noise and these sensors have a fullwell of 20K-40K electrons so 140:1 to 200:1 is as good as it could be.
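A quick check of those numbers:

```python
import math

# sqrt(full well) is the shot-noise ceiling on single-frame SNR
for full_well in (20000, 40000):
    print(full_well, round(math.sqrt(full_well)))  # 141 and 200
```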

Posted by: jmknapp Aug 29 2012, 12:32 AM

QUOTE (mcaplinger @ Aug 28 2012, 04:25 PM) *
these sensors have a fullwell of 20K-40K electrons so 140:1 to 200:1 is as good as it could be.


200:1 is about 46 dB SNR which according to the method on http://www.ctecphotonics.com/cdd.html gives effective number of bits (ENOB) of (46-2)/6 = 7.3 bits. That could be improved if desired by stacking?

The images are very impressive to see--just wondering what the limits are when the images are considered as abstract data, perhaps to be analyzed by machines teasing out the last bit of information.
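Sketching that ENOB rule of thumb plus the stacking idea (assuming ideal, uncorrelated frames, so SNR grows as sqrt(N)):

```python
import math

# 200:1 from the fullwell estimate above; ENOB rule of thumb (SNR_dB - 2)/6
snr = 200.0
snr_db = 20 * math.log10(snr)       # ~46 dB
enob = (snr_db - 2) / 6             # ~7.3 bits

# Averaging N independent frames improves SNR by sqrt(N),
# i.e. about half a bit of ENOB per doubling of N.
for n in (4, 16):
    enob_n = (20 * math.log10(snr * math.sqrt(n)) - 2) / 6
    print(n, round(enob_n, 1))
```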

Posted by: Airbag Aug 29 2012, 06:44 PM

That explains it all Joe - thanks! This made me wonder if the public "raw" images we see are uncompanded (and rescaled into 8 bits) again or not?

As an experiment, I used some full-size ML and MR color images of the sundial, which has gray rings of 20%, 40% and 60% reflectivity, and measured the corresponding grayscale pixel values using "sample merged" data points of appropriate radii in the Gimp. Making various assumptions about JPEG accuracy, lighting, dust, etc. (i.e. ignoring them!), the resulting data shows that the JPEGs we see appear to be quite linear in response to the different grayscale rings. This suggests that the images we see have been decompanded - unless my measurements are flawed (for instance, the extrapolation of the trend does not go through the origin), or the CCD is not really linear in response and the MSL-side companding has now made it appear linear?
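For what it's worth, here's the shape of the fit, with made-up stand-in numbers rather than my actual ring measurements:

```python
import numpy as np

# Stand-in numbers, NOT actual measurements: mean pixel values read
# off the three gray rings (20/40/60% reflectivity).
reflectivity = np.array([0.20, 0.40, 0.60])
pixel_value = np.array([55.0, 105.0, 158.0])   # hypothetical samples

slope, intercept = np.polyfit(reflectivity, pixel_value, 1)
print(round(slope, 1), round(intercept, 1))
# A good straight-line fit with a near-zero intercept would support a
# linear (decompanded) product; a big offset hints at haze/dust or bias.
```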



Comments welcome, as if I could stop them anyway :-)

Airbag

Posted by: jmknapp Aug 29 2012, 07:01 PM

QUOTE (Airbag @ Aug 29 2012, 02:44 PM) *
... the JPEGs we see appear to be quite linear in response to the different grayscale rings.


Interesting use of a test pattern. I gather the square law companding from 12-bit to 8-bit would be something like this:



The image you used ranged from about 100-160--not sure if you'd notice the nonlinearity per the graph above. I.e., does it only matter if there's a lot of dynamic range in the image, otherwise it can be pretty much fixed up with linear levels adjustments?

Posted by: Airbag Aug 29 2012, 07:29 PM

Joe, I think you may have the wrong "law" - it should be square root, not square, right? Doesn't change your plot though, and maybe that is just a semantics thing.

You may well be correct that the dynamic range of my samples (from 5 different images) is not sufficient to show whether the resulting data is truly linear or not. Arguably, even a little bit of dust would cause more forward scattering, and that could be why my data points don't go through the origin. I could have included some (presumed) "full white" and "full black" (the gnomon?) points too, but I have no data for actual reflectivity values for anything other than the gray rings.

Airbag

Posted by: fredk Aug 29 2012, 07:37 PM

This is interesting. A suggestion: look at the calibrated PDS sundial images from the earliest MER sols - presumably we know that those 10-bit files have linear response?

Posted by: mcaplinger Aug 29 2012, 07:48 PM

QUOTE (Airbag @ Aug 29 2012, 11:44 AM) *
This made me wonder if the public "raw" images we see are uncompanded (and rescaled into 8 bits) again or not?

As far as I know they aren't decompanded or stretched. Of course you can't tell if they were commanded with sqroot or some flavor of linear (it's an option) so I caution everyone against trying to do photometry.

You could look at ftp://pdsimage2.wr.usgs.gov/cdroms/Mars_Reconnaissance_Orbiter/MARCI/mrom_0243/document/marcisis.txt to see the companding table used for MRO/MARCI. Why would they be different?

Posted by: RoverDriver Aug 29 2012, 07:54 PM

I don't know, but is it possible these images use a gamma value other than 1?

Paolo

Posted by: ugordan Aug 29 2012, 09:02 PM

QUOTE (RoverDriver @ Aug 29 2012, 09:54 PM) *
I don't know, but is it possible these images use a gamma value other than 1?

If the MSSS cameras use sqrt encoding, that's pretty close to an inverse of the 2.2 display gamma so while the encoded DNs don't scale linearly with scene brightness, the apparent brightness scaling on the screen should follow the actual scene pretty closely.

The navcams and hazcams on the other hand look to me like they use a linear 12 to 8 conversion (even though sqrt should be an option for them as well), because they look rather contrast-enhanced.
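A quick numpy comparison of the two curves (normalized scene intensities, exponents 0.5 vs 1/2.2):

```python
import numpy as np

# Normalized scene intensity: square-root encoding vs a 1/2.2 gamma curve
x = np.linspace(0.01, 1.0, 100)
sqrt_curve = x ** 0.5
gamma_curve = x ** (1 / 2.2)

# The two never differ by more than a few percent of full scale
print(round(float(np.max(np.abs(sqrt_curve - gamma_curve))), 3))
```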

Posted by: jmknapp Sep 8 2012, 12:12 AM

Anyone know why the "full frame" raw MASTCAM images are sometimes 1648x1200, usually 1536x1152, rarely 1600x1200? I believe the specified image size for the camera is 1648x1200. Are the smaller sizes scaled or cropped?

Posted by: Deimos Sep 8 2012, 12:49 AM

QUOTE (jmknapp @ Sep 8 2012, 01:12 AM) *
Anyone know why the "full frame" raw MASTCAM images are sometimes 1648x1200, usually 1536x1152, rarely 1600x1200? I believe the specified image size for the camera is 1648x1200. Are the smaller sizes scaled or cropped?

True full frame is as you say. 48 columns are dark or null (see the MAHLI paper that has been linked several times). The corners are quite vignetted, so for some purposes 1600x1200 may not be what is desired. For other purposes, it is handy to have multiples of 64 for the size; multiples of 8 are required. Those come from JPEG compression of both the thumbnail and the image. I'd speculate that once we're well past CAP2, you'll see NxM * 128 most often, and 1648x1200 more rarely, and others quite rarely. But you never know.

Posted by: jmknapp Sep 8 2012, 01:37 AM

QUOTE (Deimos @ Sep 7 2012, 08:49 PM) *
But you never know.


Thanks... I've been working out a method to unwarp the images based on the CAHVOR camera specification for ftp://naif.jpl.nasa.gov/pub/naif/MSL/kernels/ik/msl_ml_20120731_c03.ti. Hard to say how to apply it to the cropped images though. I guess scale 1536x1152 up to 1600x1200 and then add the 48 dark columns back in.

It's easier with the NAVCAM images because they at least come in at 1024x1024 as specified. I'm wondering if stitches would work a little better using such rectified images. Here's a before and after of a NAVCAM shot--it's a real subtle correction:
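For anyone curious, here's the general flavor of the correction in numpy - just a generic radial model with placeholder coefficients, not the actual CAHVOR math from the kernel file:

```python
import numpy as np

# Generic radial-distortion correction sketch - NOT the actual CAHVOR
# model from the kernel file; k1/k2 below are placeholder coefficients.
def unwarp(xy, center, k1=-1e-7, k2=0.0):
    """Map distorted pixel coords (Nx2) to corrected coords, radially."""
    d = xy - center
    r2 = (d ** 2).sum(axis=1, keepdims=True)
    return center + d * (1 + k1 * r2 + k2 * r2 ** 2)

center = np.array([512.0, 512.0])
pts = np.array([[512.0, 512.0], [0.0, 0.0]])
corrected = unwarp(pts, center)
print(corrected)   # the center stays put; the corner moves inward
```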


Posted by: iMPREPREX Sep 9 2012, 07:04 PM

Hey folks. I see a lot of controversy here in regards to de-bayering.

What's the story in a nutshell? Are we allowed to de-bayer them? And if so, what is the easiest and best way? I wish I knew why this was an issue. But that's probably due to my own ignorance in the matter.

Posted by: fredk Sep 9 2012, 07:33 PM

QUOTE (iMPREPREX @ Sep 9 2012, 07:04 PM) *
Are we allowed to de-bayer them?
Yikes! They're public, so we can deBayer them, stretch them, Philovision them, unsharp mask them, solarize them, swap R and G channels, and finally raise each pixel value to the pi'th power, if it suits us.

The basic deal is that they're transmitting lossless, Bayered images, apparently for calibration purposes. We probably won't get very many of these in the future, since they take a lot of bandwidth. JPL jpegs the original images and makes them public, as they've always done. Unfortunately in this case the jpegging introduces blocky artifacts when we deBayer. If JPL deBayered them before jpegging, we wouldn't see the blocks. So we either hope that changes or wait for the full data release in six months.
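For reference, a bare-bones bilinear deBayer in numpy, assuming an RGGB layout (a guess - the actual sensor layout may differ, and this is nothing like the mission pipeline):

```python
import numpy as np

def conv3(img, k):
    """3x3 convolution with edge padding (avoids a scipy dependency)."""
    p = np.pad(img, 1, mode='edge')
    out = np.zeros(img.shape)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def debayer_rggb(mosaic):
    """Bilinear demosaic assuming an RGGB Bayer pattern (assumption)."""
    h, w = mosaic.shape
    k = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]])
    masks = np.zeros((3, h, w))
    masks[0, 0::2, 0::2] = 1                         # red sites
    masks[1, 0::2, 1::2] = masks[1, 1::2, 0::2] = 1  # green sites
    masks[2, 1::2, 1::2] = 1                         # blue sites
    rgb = np.empty((h, w, 3))
    for c in range(3):
        # weighted sum of known samples / weighted count of known samples
        rgb[..., c] = conv3(mosaic * masks[c], k) / conv3(masks[c], k)
    return rgb

# a flat gray mosaic should come back as flat gray
print(np.allclose(debayer_rggb(np.full((8, 8), 100.0)), 100.0))
```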

Posted by: iMPREPREX Sep 9 2012, 08:29 PM

All I have is a pattern I can't get rid of. Would look cool if it could be removed.

 

Posted by: EdTruthan Sep 10 2012, 05:23 AM

Here's a HUMONGOUS ANAGLYPH for anyone so inclined to peruse the Mt. Sharp foothills in a reasonably perceptible quantum of 3D... and surprisingly, there's some depth there!

After testing a few stitched frames to see whether the 12 meters or so of parallax difference between Sol 19's and Sol 23's color panos allowed any 3D depth to be seen at the Mt. Sharp foothills, I decided it warranted a BIG anaglyph just to check. Sol 19's pano has all the pictures down, but unfortunately there are still 4 or 5 that have never arrived from Sol 23, so there are a few missing frames in this version (the reddish areas). Now, at first glance there doesn't seem to be much depth to perceive at the foothills, but zoom in a bit and slowly pan around, and sure enough... it's there alright. The more you zoom in (better in the FULL version), the more evident it becomes. Sol 23 was used as the left eye and Sol 19 as the right. Sweet.

Here's a medium sized version (6500px x 922px):




...and here's THE FULL VERSION (19738px x 2100px): http://www.edtruthan.com/mars/Sol19-and-23-Pano-Anaglyph-19738x2100.jpg

EDIT: This image has been updated! (missing frames are down). http://www.unmannedspaceflight.com/index.php?s=&showtopic=7426&view=findpost&p=191812


Posted by: Nix Sep 10 2012, 05:32 PM

woaw, very nice work..


Posted by: fredk Sep 10 2012, 05:47 PM

I second that. Very nicely done.

Let's hope for another M100 pan with a greater baseline to really pop out those buttes and mesas... blink.gif

Posted by: morganism Sep 10 2012, 10:47 PM

The web interview with our camera operator !


http://blogs.agu.org/magmacumlaude/2012/08/28/danny-krysak-an-out-of-this-world-geologist-accretionary-wedge-49/

Posted by: EdTruthan Sep 11 2012, 05:26 PM

QUOTE (fredk @ Sep 10 2012, 10:47 AM) *
Let's hope for another M100 pan with a greater baseline to really pop out those buttes and mesas...


Oh my, I agree, and soon I hope! Using one of the top portions of the Sol 32 MC100 robotic arm photos as the left eye in an anaglyph test, though the horizon is frustratingly out of focus, it was enough to verify that the baseline is now quite effective for imaging the base of Mt. Sharp with plenty of eye-popping depth. The test below is a little wonky to the eye because the red spectrum is so darned out of focus, but it was enough to test the baseline shift's effectiveness. I'd just love to do another full pan anaglyph from a location somewhere well before Glenelg, before the baseline widens too much for a good Sol 19 pairing. Please give us another MC100 full pan soon!




Posted by: elakdawalla Sep 11 2012, 05:59 PM

Oh my. I'm actually not bothered by the fuzziness of the red channel at all; those buttes pop into spectacular depth-rich focus, and the yardang material above them is surprisingly spiky.

Posted by: fredk Sep 11 2012, 07:07 PM

Now that's being resourceful! Very cool.

Posted by: fredk Sep 11 2012, 08:04 PM

And here's my attempt at combining sol 13 mastcamL (from James's mosaic) and sol 32 navcam into a long-baseline view. Navcam gives similar resolution to those out-of-focus mastcams, but covers a wide field. The distortions of navcam produce pretty severe headachiness at the sides of this anaglyph, so be warned:


Posted by: EdTruthan Sep 11 2012, 11:31 PM

Very nice! I find it interesting to note how little input the brain seems to require to construct a reasonably well defined binocular portrait, even with such compromised data being received from the other half of the equation. And the general baseline distance from around Bradbury to the present location is really quite nice now for rendering the depth perspectives well without too much exaggeration. Now we just need a few crisp new long shots. Will definitely be hitting "F5" a bit more than usual...

Posted by: Eyesonmars Sep 12 2012, 07:35 PM

I've been wondering if it is technically feasible to use the Mastcam Video capability to search for and record dust devil activity. Can find nothing on the board forums or the internet on this. I know they plan on doing videos of phobos eclipses so this isn't too much of a stretch.

Let's say we point the mastcam wide angle in the general direction of the dark dunes for 30 min around solar noon. The strong albedo contrast along the dune boundary would be a preferred place for dust devil development, I would guess.

What are the memory/bandwidth constraints in implementing something like this? I suppose frame-comparison software uploaded to the rover would be smart enough to return only video segments with changes between frames, i.e. dust devil/blowing dust activity.

If this makes sense so far, why not make this the default mode for Mastcam when it is not being used otherwise?

Posted by: mcaplinger Sep 12 2012, 08:30 PM

QUOTE (Eyesonmars @ Sep 12 2012, 12:35 PM) *
I've been wondering if it is technically feasible to use the Mastcam Video capability to search for and record dust devil activity.

IMHO, not really.

1) Even the 34mm Mastcam has a fairly narrow FOV (about 15 degrees.) Dust devil searching on MER was done with Navcam, see http://trs-new.jpl.nasa.gov/dspace/bitstream/2014/41748/1/08-0444.pdf

2) There's no motion detection capability in the Mastcam hardware and doing it in software would be limited to a frame rate of maybe 1/4 to 1/10 fps at best.

3) There are power limitations and running the camera all the time isn't possible.

Posted by: Eyesonmars Sep 12 2012, 09:08 PM

Thanks mcaplinger for the response.

Perhaps I read too much into the list of objectives on the Mastcam website
http://msl-scicorner.jpl.nasa.gov/Instruments/Mastcam/

One of the bullet items under the list of objectives is:

"Document atmospheric and meteorological events and processes by observing clouds, dust-raising events, properties of suspended aerosols (dust, ice crystals), and (using the video capability) eolian transport of fines"

Posted by: fredk Sep 12 2012, 09:10 PM

Maybe software could use sparsely timed images (like Spirit's DD sequences) to detect a DD with software and then trigger high frame rate video (without motion detection). But I don't know if there's much we'd learn from DD video.

It may depend on what we see - if there are lots of DDs, we'll catch some by accident.

Posted by: mcaplinger Sep 12 2012, 09:22 PM

QUOTE (fredk @ Sep 12 2012, 02:10 PM) *
Maybe software could use sparsely timed images (like Spirit's DD sequences) to detect a DD with software and then trigger high frame rate video (without motion detection).

Certainly it's conceivable that we could run a Navcam sequence looking for dust devils, find one, slew the Mastcam to it and start a video acquisition. That capability doesn't exist right now, though, and I don't know if the science value would be worth the effort.

Posted by: Deimos Sep 12 2012, 10:02 PM

QUOTE (fredk @ Sep 12 2012, 09:10 PM) *
It may depend on what we see - if there are lots of DDs, we'll catch some by accident.

Pathfinder, Spirit, Opportunity, and Phoenix returned their first images of dust devils serendipitously. All but Phoenix also had dust devil campaigns prior to the first detection of a dust devil in an image. I would expect "exploratory" observations to have a different priority and implementation from "monitoring" observations. I think that with previous missions, the frequency of intentionally looking for dust devils vaguely correlates with the amount of orbital evidence for dust devils in the area. M34 has some neat advantages over Navcam if you have reason to believe its FOV is sufficient--it would have been great up in the Columbia Hills--but Navcam's FOV is better for exploratory surveys, as with MER. Of course, exploration frequently follows many paths.

Posted by: climber Sep 13 2012, 11:50 AM

I must admit there's something I don't understand.
I understood that, once the covers of the Hazcams have been removed, they were supposed to kind of hang somewhere (I mean, not on the ground).
Looking at the bellypan or whatever you call it, I can't see those covers.
Any thoughts?

Posted by: dvandorn Sep 13 2012, 12:32 PM

QUOTE (climber @ Sep 13 2012, 06:50 AM) *
I understood that, once the covers of the Hazcams have been removed, they were supposed to kind of hang somewhere (I mean, not on the ground).
Looking at the bellypan or whatever you call it, I can't see those covers.
Any thoughts?

If you look closely at the bellypan, underneath and offset from each of the hazcam lenses there is a spring. We know that the hazcam lens covers were designed for a one-time deployment; the springs were supposed to move them completely out of the way of the lenses, for good. My guess is that the lens covers are out of sight on the bottom of the lens housing, one each attached to each spring visible in the MAHLI images.

-the other Doug

Posted by: pospa Sep 13 2012, 12:38 PM

QUOTE (climber @ Sep 13 2012, 01:50 PM) *
I must admit there's something I don't understand. I understood that, once the covers of the Hazcams have been removed, they were supposed to kind of hang somewhere (I mean, not on the ground). Looking at the bellypan or whatever you call it, I can't see those covers. Any thoughts?

This video explains it all : http://mars.jpl.nasa.gov/msl/multimedia/videos/index.cfm?v=53

Posted by: Tesheiner Sep 13 2012, 12:50 PM

There were several posts about this topic on the landing or pre-landing thread.
You can see the springs (cyan arrows) and the covers (red arrows) on this crop image.


Posted by: centsworth_II Sep 13 2012, 01:00 PM

QUOTE (pospa @ Sep 13 2012, 07:38 AM) *
This video explains it all : http://mars.jpl.nasa.gov/msl/multimedia/videos/index.cfm?v=53
Rotating the video image shows how the cover may be hidden behind the spring.


Posted by: climber Sep 13 2012, 01:52 PM

Thank you ALL!
Believe me, I read ALL the posts, but I can't remember seeing where they'd end up. Yes, the springs are very evident to me, and I thought the covers had to be somewhere, but I couldn't see them. The video also helps.

Posted by: mhoward Sep 15 2012, 04:06 PM

QUOTE (fredk @ Sep 15 2012, 08:56 AM) *
But all of the smudginess is artifacts - the sun should be very smooth.


I've noticed that they're really compressing the living daylights out of MSL images before putting them on the web - much more so than the MER images are compressed. Or are we talking about a different kind of artifacting?

Posted by: paraisosdelsistemasolar Sep 15 2012, 04:26 PM

I think the main problem behind most of the artifacts is JPEG compression. I think they should try PNG or a less aggressive compression scheme.

QUOTE (mhoward @ Sep 15 2012, 04:06 PM) *
I've noticed that they're really compressing the living daylights out of MSL images before putting them on the web - much more so than the MER images are compressed. Or are we talking about a different kind of artifacting.


Posted by: fredk Sep 15 2012, 05:14 PM

The problem with the sun image is that the public images are lossless -> jpeg rather than lossless -> debayer -> jpeg or just lossless. That's the same problem that gives the green splotches on landscape shots.

QUOTE (mhoward @ Sep 15 2012, 04:06 PM) *
they're really compressing the living daylights out of MSL images before putting them on the web - much more so than the MER images are compressed.
I haven't noticed that - could you give an example of that? Or did you mean the bayering problem?

Posted by: mcaplinger Sep 15 2012, 05:29 PM

QUOTE (mhoward @ Sep 15 2012, 09:06 AM) *
I've noticed that they're really compressing the living daylights out of MSL images before putting them on the web...

I think they're just using a fixed quality (75, maybe?). Of course it doesn't help that there is sometimes a decompress/recompress and that they JPEG-compress Bayer-pattern data. The final compression ratio seems to be around 8:1 to 9:1. Are the MER images really a lot lower compression ratio?

For the 100mm sun image, since the filter cuts out all of the pixels besides blue anyway, you'd be better off just tossing the other Bayer positions and then upsampling as desired. But it's still going to be a round slightly-fuzzy circle with a bite out of one side.

Posted by: ugordan Sep 15 2012, 05:35 PM

QUOTE (mcaplinger @ Sep 15 2012, 07:29 PM) *
Of course it doesn't help that there is sometimes a decompress/recompress

I thought the cameras basically return JPEG-compliant data, so there shouldn't be a need to decompress/recompress the stream again. Are they really doing that for color images?

Posted by: mcaplinger Sep 15 2012, 06:03 PM

QUOTE (ugordan @ Sep 15 2012, 10:35 AM) *
I thought the cameras basically return JPEG compliant data so that there shouldn't be a need to decompress/recompress the stream again. Are they really doing that for color images?

http://pds.nasa.gov/tools/policy/ExplicitPDSArchiveRequirementsStatement.pdf
QUOTE
PDS Archives must comply with the following
• All EDR image data delivered for archiving must consist of simple raster images with PDS labels

Now, you could claim that this doesn't have to apply to public release images and I wouldn't argue with you, but it would require delivery in two forms unless there was a decompress/recompress cycle.

Posted by: mhoward Sep 15 2012, 06:19 PM

QUOTE (mcaplinger @ Sep 15 2012, 10:29 AM) *
Are the MER images really a lot lower compression ratio?


I've been looking at mostly Navcam images so far, so I guess I can't and shouldn't try to say anything about Mastcam. But for MER Navcam images, the JPGs on the web usually are about 200K or more in size; MSL Navcam JPGs on the web seem to be usually around 100K or less. And there just seem to be more and stronger JPG artifacts, at least in images I've looked at of the deck. That could be factors other than compression for the web; for all I know they're being compressed more heavily on the rover in some cases. All I really know is, trying to take a MER Navcam image and make it about the same file size as the MSL Navcam images we're seeing on the web, I have to use a JPEG 'quality' setting of about 40%, which is high compression.

Here's what I typically see (random section of the image that happens to be in front of me):


 

Posted by: mhoward Sep 15 2012, 06:50 PM

QUOTE (fredk @ Sep 15 2012, 10:14 AM) *
I haven't noticed that - could you give an example of that?


Yes, an example - good idea.

Again, this is Navcam; I admit the Mastcam images are probably completely different. And I don't know that this is because of heavy JPEG compression for the web. That's just my guess.

 

Posted by: xflare Sep 15 2012, 07:09 PM

QUOTE (mhoward @ Sep 15 2012, 07:50 PM) *
Again, this is Navcam; I admit the Mastcam images are probably completely different. And I don't know that this is because of heavy JPEG compression for the web. That's just my guess.


Damn.... that looks bad.

Posted by: mcaplinger Sep 15 2012, 07:23 PM

QUOTE (mhoward @ Sep 15 2012, 11:19 AM) *
But for MER Navcam images, the JPGs on the web usually are about 200K or more in size; MSL Navcam JPGs on the web seem to be usually around 100K or less.

Well, that speaks for itself. Assuming full 1024x1024 frames, 200K would be about 5:1 and 100K would be about 10:1. I did a quick spot check of some recent Navcams and they were more like 120K, but close enough.

I don't know how they chose the JPEG quality for MER and I don't know how they chose it for MSL, but I would think that 10:1 would be about quality 50 and 5:1 would be about quality 75. My own personal opinion is that 75 would be a more appropriate choice, but nobody's asking me.

As for your example, at that scale they both look pretty crummy (the MSL one has more JPEG artifacts, clearly), but I don't think zooming the image is really a fair test. That said, I wouldn't pick a fight with anyone who says the web release images are compressed too much, but I'm not sure I would use the phrase "compressing the living daylights out of" -- YMMV.
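For the record, the arithmetic behind those ratios (assuming full 1024x1024 frames at one byte per pixel, as in the post above):

```python
# Back-of-envelope compression ratios for full 1024x1024 frames at
# one byte per pixel (the file sizes are the ones quoted above).
raw_bytes = 1024 * 1024
for jpeg_kb in (200, 120, 100):
    ratio = raw_bytes / (jpeg_kb * 1024)
    print(jpeg_kb, round(ratio, 1))   # 5.1, 8.5, 10.2
```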

Posted by: fredk Sep 15 2012, 07:36 PM

For the navcams, here's my guess. We've discussed before (image thread?) that the MSL public navcams appear to be stretched/lut'ed/delut'ed differently from the MER navcams. In practice it looks like the histograms are typically concentrated into the lower half (0-128 or so) of the full 8-bit range. So there's less information to begin with than in MER navcams. But then the details on MSL navcams tend to be dark with low contrast, so maybe jpeg doesn't capture the details as well. Then you need to stretch them to show the detail, and you end up enhancing the jpeg artifacts. You could test this by jpegging an original image, and then halving the intensity of the original, jpegging, and then doubling the intensity, and comparing the two.

Or they may be compressed more heavily on MSL. MER uses various compression levels depending on the need.
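The restretch half of that test can be sketched in numpy without the jpeg step; squeezing an 8-bit image into the lower half of the range and stretching it back already costs precision before compression even enters:

```python
import numpy as np

# Squeeze an 8-bit ramp into the lower half of the range, then stretch
# it back - the quantization cost exists even before any jpeg step.
orig = np.arange(256)
squeezed = orig // 2                  # histogram confined to 0-127
restored = squeezed * 2               # stretched back to 0-254

print(int(np.abs(restored - orig).max()))  # 1 DN lost to quantization
print(len(np.unique(restored)))            # only 128 distinct levels left
```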

Posted by: ugordan Sep 15 2012, 07:42 PM

QUOTE (fredk @ Sep 15 2012, 09:36 PM) *
Or they may be compressed more heavily on MSL.

Nah, navcams use wavelet compression, but these are blocky artifacts suggesting JPEG compression on the ground is destroying data.

Posted by: mhoward Sep 15 2012, 07:47 PM

QUOTE (mcaplinger @ Sep 15 2012, 12:23 PM) *
As for your example, at that scale they both look pretty crummy (the MSL one has more JPEG artifacts, clearly), but I don't think zooming the image is really a fair test. That said, I wouldn't pick a fight with anyone who says the web release images are compressed too much, but I'm not sure I would use the phrase "compressing the living daylights out of" -- YMMV.


Fair enough. Here's a more typically-sized comparison, as good as I can make it at the moment. The MSL version still seems lossy to me. But if it is compression that's being done on the rover to get the images back faster, I'm perfectly happy with that - I'm just so happy to be seeing images at all. I only bring it up because I care.

Normally I would JPEG-compress these images, but well, this is a special case for obvious reasons.

 

Posted by: mcaplinger Sep 15 2012, 07:52 PM

QUOTE (fredk @ Sep 15 2012, 12:36 PM) *
We've discussed before (image thread?) that the MSL public navcams appear to be stretched/lut'ed/delut'ed differently from the MER navcams.

I don't know about MER, but as far as I can tell there is no stretching being done on any of the MSL images. They are typically autoexposed, which by its nature might be a little on the dark side, but they shouldn't really need much processing.

Posted by: mcaplinger Sep 15 2012, 08:06 PM

QUOTE (mhoward @ Sep 15 2012, 12:47 PM) *
The MSL version still seems lossy to me.

At this point I'm tempted to give into my bias and say that all Navcam images look bad to me. (Sorry, Justin. smile.gif Not all, but some.)

I've always been a bit surprised that the MER Navcams were so grainy-looking. This might be dark current from the relatively long readout time. They're a bit blurry but what can you do with only four elements and fixed focus?

As for wavelet compression, having ICER artifacts interact with JPEG artifacts isn't going to improve the images.

But to recap -- are the public release images on MSL being compressed on the ground more than for MER? Could well be.

Posted by: Eluchil Sep 16 2012, 06:54 AM

When comparing the left and right Mastcams of the sun I see a couple of things that I don't understand. The first is the large artifact at the bottom of the Mastcam 100 images. Is it a lens flare? The second is the color. The Mastcam34 thumbs are grey but the Mastcam100 thumbnails show as blue. I figured that this might have to do with the neutral density (sun) filters being different wavelengths, but looking at http://www.msss.com/msl/mastcam/MastCam_description.html the Mastcam34 sun filter is 440 nm which should be blue/violet and the Mastcam100 is 880 nm which should be near infrared. Sorry if this is obvious to others, but it seemed strange to me.

Posted by: djellison Sep 16 2012, 07:04 AM

Blue filters on a bayer filter can often be nIR transparent.

Posted by: mcaplinger Sep 16 2012, 02:57 PM

QUOTE (Eluchil @ Sep 15 2012, 11:54 PM) *
looking at http://www.msss.com/msl/mastcam/MastCam_description.html the Mastcam34 sun filter is 440 nm which should be blue/violet and the Mastcam100 is 880 nm which should be near infrared.

The web page is in error. http://www.lpi.usra.edu/meetings/lpsc2012/pdf/2541.pdf is correct.

Posted by: Eluchil Sep 17 2012, 07:06 AM

QUOTE (mcaplinger @ Sep 16 2012, 03:57 PM) *
http://www.lpi.usra.edu/meetings/lpsc2012/pdf/2541.pdf is correct.


Thanks. That paper cites Malin, M.C. et al. (2010) LPS XLII, Abstract #1533 as having more detail regarding the near-IR characteristics of the Bayer pattern, but I can't find that abstract, and the diagram in http://www.lpi.usra.edu/meetings/lpsc2010/pdf/1123.pdf is hard to read. Still, I learned a bit about how narrowband imaging will work (or rather is already working) with the MastCam. Thanks for pointing me in the right direction.

Posted by: centsworth_II Sep 17 2012, 12:07 PM

QUOTE (Eluchil @ Sep 17 2012, 03:06 AM) *
...the diagram in http://www.lpi.usra.edu/meetings/lpsc2010/pdf/1123.pdf is hard to read....

The diagram is easier to read at http://msl-scicorner.jpl.nasa.gov/Instruments/Mastcam/

Posted by: Ant103 Sep 18 2012, 01:57 PM

Oh nooo, what have they done ? sad.gif

Auto-white balance of the navcam images sad.gif

http://mars.jpl.nasa.gov/msl/multimedia/raw/?s=42&camera=NAV_LEFT_A

I liked the previous version a lot more, because it was like working with a true raw image, only degraded by the JPEG compression. But now… I hope this is just some kind of test, or something, because to be honest, I don't like it.

Posted by: MahFL Sep 18 2012, 02:02 PM

I don't know if it's just me, but the MSL navcam images don't seem to be as good as the MER ones were. Any thoughts, anyone?

Posted by: dvandorn Sep 18 2012, 02:30 PM

As one of the first people to mention that the navcams looked tremendously murky without a lot of processing, I have to say I like the new stretch. Previously, as mhoward has noted, the histograms on the navcam images were all piled up in the dark half of the available dynamic range.

As someone who wants to just look at the images and doesn't have an automated pipeline from the website into Photoshop, I'd rather be able to look at and enjoy the images directly, rather than feeling the need to save them and run them through contrast, brightness, and gamma enhancements just to have a reasonably non-murky image in which my old, tired eyes can actually pick out good details.

That could be just my own reaction, though. As always, YMMV. smile.gif

-the other Doug

Posted by: mhoward Sep 18 2012, 02:49 PM

Well, the Navcam image quality on the web improved dramatically for sol 42. The JPEG compression artifacts are gone; in fact I would describe the images as 'pristine', even better than we get from MER, now.

Posted by: Ant103 Sep 18 2012, 03:09 PM

Well, yes - dark sand and overexposed rocks?
http://mars.jpl.nasa.gov/msl-raw-images/proj/msl/redops/ods/surface/sol/00042/opgs/edr/ncam/NLA_401232988EDR_F0041632NCAM00427M_.JPG

Sorry, but I have to disagree. Try devignetting pictures like this, with so much difference between frames… Maybe it's clearer for people who are just browsing, but working with them? Okay, then I guess I'm in for some nightMER panoramic adjustments.

I hope that I'm not the only one to have this point of view…

Posted by: fredk Sep 18 2012, 03:14 PM

QUOTE (dvandorn @ Sep 18 2012, 02:30 PM) *
I like the new stretch... I'd rather be able to look at and enjoy the images directly, rather than feeling the need to save them and run them through contrast and brightness gamma enhancements

I agree completely. The new stretch makes it so much easier to quickly see if there's anything interesting in the new images. Then we can always compress the histogram back down into the lower 7 bits if we like.
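The two stretches being discussed here can be mimicked with a few lines of numpy. This is a toy sketch only, not JPL's actual pipeline, and the function names are made up; it just shows why re-compressing a stretched 8-bit image into the lower half of the range costs the odd brightness levels:

```python
import numpy as np

def stretch_full_range(img8):
    """Linearly stretch whatever range the image occupies out to 0-255
    (roughly what the new navcam releases look like)."""
    lo, hi = int(img8.min()), int(img8.max())
    if hi == lo:
        return img8.copy()
    scaled = (img8.astype(np.float32) - lo) / (hi - lo) * 255.0
    return scaled.round().astype(np.uint8)

def compress_to_lower_half(img8):
    """Map a full-range 8-bit image back into the lower ~7 bits (0-127),
    mimicking the look of the old, unstretched releases."""
    return (img8.astype(np.float32) / 255.0 * 127.0).round().astype(np.uint8)

# Round-tripping collapses pairs of levels onto one: the "7 bit" loss.
dark = np.arange(0, 128, dtype=np.uint8)   # old-style image, lower half only
stretched = stretch_full_range(dark)       # new-style full-range version
```

Note that neither direction can recover values already clipped to 0 or 255, which is the point Ant103 makes below.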

Posted by: Ant103 Sep 18 2012, 03:41 PM

No, Fredk. What has been lost, has been lost. If you have a black area or a white area, you can't "lower" the contrast to get back details in those areas.

Check this : http://mars.jpl.nasa.gov/msl-raw-images/proj/msl/redops/ods/surface/sol/00042/opgs/edr/fcam/FLA_401232778EDR_F0041632FHAZ00302M_.JPG

Around the Mastcam shadow there is a flat white area. There were details in there.

I'm a little bit angry in my words, but for me it's a total mistake to stretch pictures like this. And come on, the previous ones were not so dark…

Posted by: Doc Sep 18 2012, 05:05 PM

I think you're right, Ant. I work with B/W sonography images in my studies, and it's all about balancing the need for gain and compression. Once you have captured the image with those settings, you can't regain the details by toying with the histogram.

Posted by: fredk Sep 18 2012, 06:14 PM

You can reproduce the overall look of the old images. But I agree completely that where the whites are clipped in the new images, you can't recover that. It's the same with MER, but at least there we have Powell's evernote source, which appears to be stretched/lut'ed more like the old MSL navcams. I checked, and Powell's evernote MSL images changed their stretch today and are now identical to the jpl site's. So no luck there.

Posted by: djellison Sep 18 2012, 07:31 PM

QUOTE (Ant103 @ Sep 18 2012, 08:41 AM) *
but for me it's a total mistake to stretch pictures like this. And come on, the previous ones were not so dark…


Given that the primary purpose of these images is for people to look at them as they are, without photoshop to stretch them - they have done the right thing.

Many of the previous images were too dark.

http://mars.jpl.nasa.gov/msl/multimedia/raw/?rawid=NLA_401051733EDR_F0040916NCAM00424M_&s=40
http://mars.jpl.nasa.gov/msl-raw-images/proj/msl/redops/ods/surface/sol/00029/opgs/edr/ncam/NLA_400071866EDR_F0040000NCAM00418M_.JPG
http://mars.jpl.nasa.gov/msl-raw-images/proj/msl/redops/ods/surface/sol/00029/opgs/edr/ncam/NLA_400071177EDR_F0040000NCAM00302M_.JPG
http://mars.jpl.nasa.gov/msl-raw-images/proj/msl/redops/ods/surface/sol/00026/opgs/edr/ncam/NLA_399807408EDR_F0030530NCAM00417M_.JPG

These images would be 'better' for people to look at with the newer stretch. Yes - we get clipping at each end as a result, but the image occupies a larger part of the histogram and, combined with lower compression, is far better for people to actually look at.

Posted by: Ant103 Sep 18 2012, 08:23 PM

Then I can only think that "people" have some trouble with their screen brightness, because for me it's not too dark.

But I guess you're right, and I will continue my battle against windmills.

Posted by: EdTruthan Sep 18 2012, 10:43 PM

Hopefully what we're seeing has been temporarily utilized for a specific purpose, but if not...

QUOTE (Ant103 @ Sep 18 2012, 08:41 AM) *
What has been lost, has been lost.....

Aaarg so true.... over or under saturation cannot be recovered, period. But it's the disruption of the gray-level relationships that's most lamentable to me, because that's where much of the textural subtlety, and thus potentially valuable scientific comparison, resides. I'm not a trained geologist, but even from an armchair perspective, for a particular rock or patch of soil, slight differences in the average nominal gray levels are often the key to identifying similar or disparate specimens, and they help in theorizing about their inherent characteristics and relationships. And of course the extreme halo effect will make seamless-looking mosaics a nightmare.

If the reasoning behind this is for the usability of the general public as suggested, perhaps some tweaking is part of the plan and we're seeing the first "extreme" test, after which I would hope that "toning down and tuning in" the level adjustment would be the next move. Hopefully there will be someone in the know here soon enough who can explain the rationale or temporal nature of what we're seeing a bit more.

Posted by: fredk Sep 18 2012, 11:33 PM

FWIW, my guess is that jpl received masses of emails from the general public saying "these images are too dark". That's why the change. But now that the MSL engineering images look very similar to MER (I'm guessing it's the same algorithm), I doubt very much we'll see further "iterations". It's the masses screaming "too dark" that they're reacting to; I'd be surprised if they bent over backwards to tweak the stretching because a few of us complained about clipped whites. Remember that we were also losing information in the old MSL images, them being effectively around 7 bits instead of 8.

QUOTE (EdTruthan @ Sep 18 2012, 10:43 PM) *
over or under saturation cannot be recovered, period
...until the images appear on PDS! It won't be that long...

Posted by: markril Sep 19 2012, 01:19 AM

I'll just comment that I like the change (improved contrast) because now I don't need to process the images to get a decent cross-eyed view anymore. All I need to do is pop open a left navcam or hazcam image in a browser window and place it on the right side of my screen and vice versa. Then, cross eyes and voila! smile.gif

Posted by: Ant103 Sep 19 2012, 01:48 AM

About 3 months, if I'm correct? Excuse me, but while we will certainly get our hands on the color pics, for the Navcams I don't think so. I didn't see much processing of the MER Navcam PDS pictures in the past, and I don't think that will change with those coming from Curiosity. Losing information between 7 bits and 8 bits is, in my opinion, less important than losing information to auto-adjusting the histogram.

But I can say whatever I want; it won't change the fact that the Navcams will now look like this, and I'll probably have to deal with it. I have sent feedback via the "feedback" link at the bottom of the Curiosity website, so who knows?

Thankfully, they didn't apply such processing to the color pictures from Mastcam (the general public can find them too flat, or too red tongue.gif)!


Posted by: ronald Sep 20 2012, 07:05 AM

QUOTE (Ant103 @ Sep 18 2012, 05:09 PM) *
I hope that I'm not the only one to have this point of view…


Definitely not! The "new" ones are really harder to work with, and I guess we will see more navcam panoramas with not-so-good blending now.
Having worked more than ten years in image postproduction, my opinion is that raw images should stay unaltered.
Also, the darker ones mentioned above better convey that it is Mars we are looking at, and not Earth. smile.gif

Posted by: xflare Sep 20 2012, 08:54 AM

QUOTE (Ant103 @ Sep 19 2012, 02:48 AM) *
Thankfully, they didn't apply such processing to the color pictures from Mastcam (the general public can find them too flat, or too red tongue.gif)!


I was hoping they would increase the jpg quality of the color mastcam shots like they did with the navcams, but the latest batch looks even more highly compressed.

Posted by: ronald Sep 20 2012, 07:48 PM

Couldn't find an answer so far - can someone please explain why the calibration target on MSL does not have a dust cover or some sort of dust-removal technology? What do they calibrate against when all the grey and coloured patches are covered in dust? Even here on Earth a photographer has to buy a new grey card now and then (at least he should, because the colours change)...

Thank you! rolleyes.gif

Posted by: mcaplinger Sep 20 2012, 08:57 PM

QUOTE (ronald @ Sep 20 2012, 12:48 PM) *
can please someone explain why the calibration target on MSL does not have a dustcover or some sort of dust removal technology?

http://www.nbi.ku.dk/english/research/phd_theses/phd_theses_2011/line_drube/
QUOTE
Permanent ring-magnets have also been built into the calibration target of the Mars Science Laboratory (MSL), the same type of ring-magnet used in the Sweep magnet experiment on the Mars Exploration Rovers (MERs). Unfortunately, on MSL the ring-magnets were included at a very late stage in the development of the target (actually the target was a flight spare unit from the MER mission). This resulted in the ring-magnets being positioned at a depth of 0.8-1.0 mm below the surface instead of the 0.4 mm used on the MERs and Phoenix. From preliminary computer simulations this didn't appear to make a significant difference, other than in the size of the magnetically protected area. However, wind tunnel experiments using Salten Skov dust have now demonstrated that this relatively small difference in depth causes the "protected" area to disappear, so that with this new configuration the ring center will accumulate more dust than the reference areas free of influence from any magnetic field. With no clean area at all, magnets in this configuration will have the opposite effect to what they were intended to provide, attracting significant amounts of dust and retaining it on areas that are meant to be used as "dust-free" calibration standards.


Posted by: ronald Sep 21 2012, 06:09 AM

Thank you for the link. Two more:

http://www.nbi.ku.dk/forskningsgrupper/mars/english/research/missions/phoenix/caltarget/

http://www.agu.org/pubs/crossref/2008/2007JE003014.shtml

Quote from the second link:

QUOTE
From this we have concluded that essentially all particles in the Martian atmosphere are magnetic in the sense that they are attracted to permanent magnets.

Too bad it didn't work now on MSL.

Edit:
http://www.nbi.ku.dk/sciencexplorer/foredrag/mbmadsen2/video/ from http://en.wikipedia.org/wiki/Morten_Bo_Madsen unfortunately in Danish rolleyes.gif

Posted by: udolein Sep 22 2012, 01:17 PM

How can I demosaic the Bayer-filtered Curiosity raw images? Is there any appropriate software known?

Regards, Udo

Posted by: Nix Sep 22 2012, 01:47 PM

You can download GIMP (http://www.gimp.org/downloads/) and the plugin G'MIC (http://gmic.sourceforge.net/gimp.shtml)...

then, after opening the file in GIMP: Filters > G'MIC (new window) > Degradations > Bayer reconstruction
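If you'd rather script it than click through GIMP, a minimal bilinear Bayer reconstruction fits in a few lines of numpy/scipy. This is a sketch only: the RGGB layout is an assumption here, not a documented fact about the MSL cameras, so check the camera documentation for the real pattern before trusting the colors.

```python
import numpy as np
from scipy.signal import convolve2d

def debayer_bilinear(raw, pattern="RGGB"):
    """Tiny bilinear demosaic. `raw` is a 2-D array straight off a Bayer
    sensor; only the (assumed) RGGB layout is implemented."""
    raw = raw.astype(np.float64)
    h, w = raw.shape
    offsets = {"RGGB": {"R": (0, 0), "G1": (0, 1), "G2": (1, 0), "B": (1, 1)}}[pattern]
    kernel = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]]) / 4.0
    out = np.zeros((h, w, 3))
    for chan, keys in [(0, ["R"]), (1, ["G1", "G2"]), (2, ["B"])]:
        mask = np.zeros((h, w))
        for k in keys:
            dy, dx = offsets[k]
            mask[dy::2, dx::2] = 1.0   # where this channel was actually sampled
        # normalized convolution: weighted sum of known samples / sum of weights
        num = convolve2d(raw * mask, kernel, mode="same", boundary="symm")
        den = convolve2d(mask, kernel, mode="same", boundary="symm")
        out[..., chan] = num / den
    return out
```

G'MIC's "Bayer reconstruction" filter offers fancier algorithms than this, but for quick experiments on the raw JPGs the result is similar.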

Posted by: udolein Sep 22 2012, 03:43 PM


Thanks-a-lot. Works !

Regards

Udo

Posted by: jmknapp Sep 22 2012, 08:10 PM

How would the night sky look at Gale Crater? Would the Milky Way be a magnificent sight, or might there be too much dust aloft to have a good view?

I was wondering if the mastcam az/el motors are up to the task of tracking a long exposure shot of the night sky.

Posted by: ngunn Sep 22 2012, 08:20 PM

I'll go with 'magnificent'. Dust isn't a problem if there's nothing lighting it up. Maybe the stars are only half as bright as from out in space (not a big difference really) but the background is still black. I wish the view from my back yard was that good.

Posted by: Joffan Sep 22 2012, 08:42 PM

QUOTE (jmknapp @ Sep 22 2012, 02:10 PM) *
I was wondering if the mastcam az/el motors are up to the task of tracking a long exposure shot of the night sky.

I was wondering whether - in another year or so - a long-exposure shot of Jupiter with the M100 would look good. Of course, we can get superb shots of Jupiter from other hardware, so it would be a bit of a vanity exercise, but interesting nonetheless. And it might give some indirect information about the state of the Martian night atmosphere or some other subtle data.

Posted by: fredk Sep 22 2012, 09:40 PM

There was lots of night imaging from Spirit of course. They didn't image the Milky Way, but did image the LMC. One potential problem with MSL is bandpass. The good results from Spirit used the L1 (open) pancam. I believe that has considerably wider bandpass than L0/R0 on mastcam, which has an IR cutoff (cutting above around 700 nm). I'm not sure how important the IR is to imaging the night sky, but that could mean you'd need substantially longer exposures with mastcam. But the efficiency of the detector and speed of the optical system also matters of course. And look at the Spirit images - CCD noise becomes important on long exposures.

As far as tracking, I think that's out. Even if they could manage to move the mast at the extremely slow rate you'd need while taking an exposure, it's an altazimuth mount so frame rotation quickly becomes a problem. Probably better to take multiple short exposures and stack them (with rotation to compensate for field rotation if needed).

Tau is important. The absorption when tau is high means you need longer exposures, which is always harder. I think dust would affect the visibility of diffuse objects like the Milky Way more than stars. Maybe if heating needs aren't too bad, they could do night imaging in the low-tau winter, which they could never do with MER.
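The stack-and-derotate approach suggested above can be sketched in a few lines. This is a toy illustration, not flight software: the constant per-frame rotation rate is a simplifying assumption (real field rotation on an alt-az mount varies with pointing), and read-noise-limited stacking of N frames improves signal-to-noise roughly as sqrt(N).

```python
import numpy as np
from scipy.ndimage import rotate

def stack_frames(frames, rotation_deg_per_frame=0.0):
    """Derotate each short exposure by its accumulated field rotation,
    then average the stack."""
    acc = np.zeros_like(frames[0], dtype=np.float64)
    for i, frame in enumerate(frames):
        derot = rotate(frame.astype(np.float64), -i * rotation_deg_per_frame,
                       reshape=False, order=1, mode="nearest")
        acc += derot
    return acc / len(frames)
```

In practice you would also register the frames on star positions before averaging; this sketch assumes perfect pointing between exposures.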

Posted by: vikingmars Sep 23 2012, 11:50 AM

QUOTE (ronald @ Sep 21 2012, 08:09 AM) *
Thank you for the link. Two more:
http://www.nbi.ku.dk/forskningsgrupper/mars/english/research/missions/phoenix/caltarget/
Too bad it didn't work now on MSL.

Well... Those experiments are too much "combined" on the MSL for my own color calibration tests... The 4 color calibration targets are now "polluted" by the magnet experiments...
See sections from MastCam Sol 13 vs. Sol 44 image. It's a real pity...


And even more than a pity. This was a foreseen event as per page 58 of the "Martian Airborne Dust - Magnetic Properties on Phoenix and Dust on the MSL Calibration Target " thesis by Line Drube in 2011 :
http://www.nbi.ku.dk/english/research/phd_theses/phd_theses_2011/line_drube/Line_Drube_juni2011.pdf/
"It is apparent from Figure 50 that keeping the magnets in the MSL calibration target will quickly ruin much of the blue, green, yellow and red colored calibration areas for their intended purpose. Since less than a year to launch is too late for removing the magnets from the calibration target..."

Posted by: markril Sep 23 2012, 02:02 PM

QUOTE (fredk @ Sep 22 2012, 01:40 PM) *
As far as tracking, I think that's out. Even if they could manage to move the mast at the extremely slow rate you'd need while taking an exposure, it's an altazimuth mount so frame rotation quickly becomes a problem. Probably better to take multiple short exposures and stack them (with rotation to compensate for field rotation if needed).


Curiosity is currently at 4.5 degrees south latitude. If you could cant the rover by that amount in the north-south direction, then the masthead should be able to track objects on the sky's equator by simply adjusting altitude only without any field rotation. Hopefully, the minimum altitude increment of the masthead would shift the image by less than a pixel.

Mark

P.S. I wonder if MAHLI could be used? The arm has so many degrees of freedom that something should be possible for proper tracking. Looking at the arm I think you would still need to cant the rover, but at least you could track objects at different declinations on the sky by adjusting the last axis on the arm. Tracking would be done with the second to last axis. Question again would be if the pointing accuracy is good enough?

Posted by: fredk Sep 23 2012, 04:51 PM

The problem is precision. I'd be surprised if MC or MH could slew slowly enough to track the sky, which moves at about 4 seconds per arcminute. Both MC34 and MH have resolutions of around 1 arcmin/px. So you'd either need to bump something like 1 arcmin every 4 seconds, which is an incredibly fine step, or slew at an extremely slow continuous rate. Either is way outside what they were built to do. And I don't know if you could both move and shoot at the same time to begin with.

But the good news is optical speed. MER pancam is f/20, and still we could image the LMC! MH is f/8.5 and MC34 f/8. So in terms of optics alone, you'd need roughly a sixth of the exposure compared with pancam. (Even MC100 is f/10.) So stacking a series of short exposures might give very impressive results. (But there's still the lower optical bandwidth for mastcam compared with pancam.)
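Both numbers in this post are quick to check on the back of an envelope (the sol length and f-numbers below are just the figures quoted in the thread):

```python
# Sky drift: Mars rotates 360 deg in one ~24.62 h sol, so near the
# celestial equator the stars drift at roughly
sol_seconds = 24.62 * 3600
drift_arcmin_per_s = 360 * 60 / sol_seconds    # ~0.24 arcmin/s
seconds_per_arcmin = 1 / drift_arcmin_per_s    # ~4.1 s per arcminute

# Required exposure scales with the square of the f-number, so relative
# to pancam (f/20), mastcam-34 (f/8) needs roughly
exposure_ratio = (8.0 / 20.0) ** 2             # 0.16, i.e. about a sixth
```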

Posted by: jmknapp Sep 23 2012, 05:58 PM

QUOTE (fredk @ Sep 23 2012, 12:51 PM) *
I'd be surprized if MC or MH could slew slowly enough to track the sky, which is about 4 seconds per arcminute.


For the case of MC, found a reference: https://www-robotics.jpl.nasa.gov/publications/Mark_Maimone/fulltext.pdf

QUOTE
The absolute pointing accuracy of the RSM is approximately 4.6 milliradians (approximately 6 Navcam pixels), and pointing is
repeatable to less than a Navcam pixel.


So that would be about 15 MC pixels--not so great for tracking the sky then.
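The unit conversion behind that estimate, with assumed (approximate, unofficial) plate scales - Navcam at ~45 deg over 1024 px and Mastcam-34 at the ~1 arcmin/px figure quoted earlier in the thread:

```python
import math

# RSM absolute pointing accuracy quoted from the JPL paper: ~4.6 mrad.
acc_arcmin = 4.6e-3 * math.degrees(1) * 60     # ~15.8 arcmin

navcam_arcmin_per_px = 45 * 60 / 1024          # ~2.64 arcmin/px (assumed FOV)
mc34_arcmin_per_px = 1.0                       # fredk's figure above

navcam_px = acc_arcmin / navcam_arcmin_per_px  # ~6 px, matching the paper
mc34_px = acc_arcmin / mc34_arcmin_per_px      # ~16 px, i.e. "about 15" given the rounding
```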


Posted by: djellison Sep 23 2012, 09:08 PM

QUOTE (vikingmars @ Sep 23 2012, 03:50 AM) *
It's a real pity...


What's a pity? The middle of the ring magnet is rather clean. That's the point of them.


Posted by: markril Sep 23 2012, 09:31 PM

QUOTE (djellison @ Sep 23 2012, 02:08 PM) *
What's a pity? The middle of the ring magnet is rather clean. That's the point of them.


I thought this post and referenced link indicated that the magnets were ineffective due to being placed at an incorrect depth below the surface of the color swatches:

http://www.unmannedspaceflight.com/index.php?showtopic=7418&st=120&p=191828&#entry191828

If true, this would seem to be a "pity". Edit: smile.gif

Mark

Posted by: djellison Sep 23 2012, 10:44 PM

Let's see if Mars behaves the same as that wind tunnel test. I'd be surprised if it does - Mars rarely plays by the rules.

Posted by: vikingmars Sep 24 2012, 10:19 AM

QUOTE (markril @ Sep 23 2012, 11:31 PM) *
If true, this would seem to be a "pity". Edit: smile.gif Mark

Agree. Here are my color calibration target sampling test results(*), done last week with Sol 13 vs. Sol 44 samples (taken in the middle of the ring magnets).
There are still hints of color on the targets on Sol 44 (especially the blue target), but globally the colors are much less saturated (especially on the red and yellowish-brown/ochre targets), with results remarkably similar for both MastCams... (confirming, by the way, that both are well calibrated... of course !)

(*) Tests done with images I produced from raw imaging data by equalizing the raw images on grey values. The images shown hereabove are the fully calibrated final products

Posted by: fredk Sep 24 2012, 02:35 PM

The lighting's very different between those sol 13 and 44 shots - that may change the colours, even though the colour targets have a pretty flat/matte finish. Also exposures may be different - or have you tried to compensate?

Have you looked for pairs with more similar lighting?

Posted by: Paul Fjeld Sep 24 2012, 03:09 PM

QUOTE (vikingmars @ Sep 24 2012, 05:19 AM) *
...Here are my color calibration target sampling tests done last week with Sol 13 vs. Sol 44 samples (taken in the middle of the ring magnets).

Very interesting. I agree with fredk, though. I would think the highly specular reflection off the target from the late afternoon sun on SOL 44 would goof up the comparison significantly - there looks to be some specularity from the matte colors as well. I'm really interested in the effect of global illumination from the dust in the atmosphere at different times of the day. I've been using the Kapton tape all around Curiosity to try to get a notion of color under that illumination, in shadow and sunlit, and assume that it would be more amberish than under blue sky here. When I saturate the colors for that Kapton, I get a much more colorful landscape! Relentlessly orangy/browny smile.gif

Posted by: vikingmars Sep 24 2012, 03:46 PM

QUOTE (fredk @ Sep 24 2012, 04:35 PM) *

Thanks for your interesting questions. Here are some answers :

Q/ : The lighting's very different between those sol 13 and 44 shots - that may change the colours, even though the colour targets have a pretty flat/matte finish. Also exposures may be different - or have you tried to compensate ?
A/ : Yes, compensation with brightness only, NOT with contrast and saturation ;

Q/ : Have you looked for pairs with more similar lighting ?
A/ : I found none yet, but anyway my pics are processed to be equalized on the grey MSL cover on which the sundial is positioned, and this gives all pics the same basis for comparing hues, as explained in previous posts :
http://www.unmannedspaceflight.com/index.php?s=&showtopic=7454&view=findpost&p=191206
http://www.unmannedspaceflight.com/index.php?s=&showtopic=7454&view=findpost&p=191226

This technique has always worked well for comparing hues and dates back to the "old" Viking times. The best evidence for the effectiveness of this process is the similar sampling-test results obtained for both cameras (MastCam-34 and MastCam-100), from images taken with different contrast on Sol 44... smile.gif

(=> PS : To answer the note herebelow : I agree with the statement regarding the raw data. But I should have emphasized that the images shown hereabove in my sampling tests are the fully calibrated ones I produced from this raw data, by equalizing the raw images on grey values as explained before. In that case, "JPG's on a rapid release raw website" work very well as a source of images and, besides, therein lies the beauty of this technique I learnt at JPL)

Posted by: djellison Sep 24 2012, 04:52 PM

Note - you're not using calibrated data - you're using JPG's on a rapid release raw website. You're not comparing apples to apples at this point.

Calibration targets get dirty. It happened on Viking, MPF, MER, PHX...and it's happening on MSL.

Posted by: jmknapp Sep 24 2012, 05:31 PM

The date on that magnet study was June 2011, which raises the question of why the situation wasn't fixed - was there not enough time, or did perhaps not everyone believe the wind tunnel simulation? The answer is in the conclusion of the linked http://www.nbi.ku.dk/english/research/phd_theses/phd_theses_2011/line_drube/Line_Drube_juni2011.pdf/:

QUOTE
The MSL magnet had almost no lee-side dust-poor area and the center area of the magnet was dustier than the reference area. This was a surprise as computer simulations predicted that the center would be kept clean. It is believed that some of the simplifications incorporated into the simulation were probably responsible. However, this experimental result indicates that in less than a month a large part of the blue, green, yellow and red areas just above the embedded magnets on the MSL calibration targets will be almost completely obscured by dust. Removal of the magnets hasn’t been possible at this late stage, so the surface imager calibration team will just have make do with what remains of the areas of color chips and with the different gray color areas, and write the calibration programs primarily using only them. This is not as bad as it sounds luckily, as the gray colors are the most crucial for calibration. There is also still the hope that the very uncertain estimate of the depth of the magnets will be in the very shallow end or some other differences between the experiment and real life will end up in favor of a [sweep] effect stronger than predicted and hence of better calibration measurements.

Posted by: mcaplinger Sep 24 2012, 05:59 PM

QUOTE (jmknapp @ Sep 24 2012, 10:31 AM) *
why wasn't the situation fixed...

In my opinion, the instruments were well-calibrated on the ground and the calibration target is not really needed. Given the number of images being taken of the cal target, this seems to be a minority opinion. rolleyes.gif

With all due respect to the people supplying the magnets, I'm not sure they understood the needs of imaging or the dynamics of the landing dust environment.

Posted by: djellison Sep 24 2012, 06:11 PM

Note that the thesis concludes that " in less than a month a large part of the blue, green, yellow and red areas just above the embedded magnets on the MSL calibration targets will be almost completely obscured by dust" - 48 sols in and I don't think we could say that they're 'almost completely obscured' so we already have reason to question its conclusions.


Posted by: vikingmars Sep 24 2012, 10:32 PM

QUOTE (djellison @ Sep 24 2012, 08:11 PM) *
...48 sols in and I don't think we could say that they're 'almost completely obscured'

Agree with you : not completely obscured. A few colors can still be retrieved on the MSL targets, more than a month after landing, in the blue and green channels, which is not too bad after all. And, as said above, "gray colors are the most crucial for calibration"...
Let's wait a few more weeks then... smile.gif

(=> PS : By the way, this interesting thesis reminds me of the difficult calibration of the MPF color images in 1997 using its 5 targets.
Herebelow is a Sol 2 excerpt from MPF's "Insurance Pan"... The targets were not yet dusty. Enjoy !)



Posted by: ronatu Sep 24 2012, 11:53 PM

QUOTE (djellison @ Sep 24 2012, 02:11 PM) *
Note that the thesis concludes that " in less than a month a large part of the blue, green, yellow and red areas just above the embedded magnets on the MSL calibration targets will be almost completely obscured by dust" - 48 sols in and I don't think we could say that they're 'almost completely obscured' so we already have reason to question its conclusions.


1997?

Posted by: iMPREPREX Sep 26 2012, 06:50 AM

There appears to be dust on the right MastCam according to the subframes from sol 48. Do the lenses get brushed at all?

Posted by: Explorer1 Sep 26 2012, 07:31 AM

Do you mean the little fleck on the middle top of the recent daylight Phobos pics? It might be too far for the DRT to reach...

Posted by: xflare Sep 26 2012, 08:49 AM

QUOTE (Explorer1 @ Sep 26 2012, 08:31 AM) *
Do you mean the little fleck on the middle top of the recent daylight Phobos pics? It might be too far for the DRT to reach...



Was there any reason given for why they suddenly chose to dramatically increase the JPG quality of the BW engineering cameras, yet leaving the color mastcams the same? The color mastcams look like they may have actually decreased in quality.

That daylight phobos shot illustrates why a less destructive jpg compression would be greatly appreciated.

Posted by: mhoward Sep 26 2012, 12:03 PM

I haven't noticed any decrease in quality of the Mastcam images. I think the Mastcam images have always been compressed for the web at a more reasonable level than the engineering cameras previously were. If they were compressed a bit less it might help with a few things, but they're still amazing images and we're getting an incredible amount of detail out of them.

Posted by: vikingmars Sep 26 2012, 12:25 PM

100% agree with you ! They are already great, workable images for processing.
Even though you may think otherwise... (see the Post Scriptum)
http://www.unmannedspaceflight.com/index.php?s=&showtopic=7418&view=findpost&p=192007
smile.gif

Posted by: Deimos Sep 26 2012, 01:05 PM

Remember that the Mastcam images are downlinked to Earth at a wide variety of different compressions depending on competing desires for high quality images, getting images quickly, and having bandwidth available for other purposes. Some test images may well be lower-than-usual quality, and the public release isn't going to fix that. Other images will be high quality, but limited by whatever the heck is done for the release.

Posted by: mcaplinger Sep 26 2012, 01:20 PM

QUOTE (Explorer1 @ Sep 26 2012, 12:31 AM) *
Do you mean the little fleck on the middle top of the recent daylight Phobos pics?

This is crud on the focal plane. It's been there since before launch, see http://www.msss.com/images/science/mastcam/m100cwb.jpg just above the red sign.

Posted by: pospa Sep 26 2012, 02:44 PM

Please don't lapidate me ph34r.gif and excuse the maybe very stupid question, but under the impression of the recent Phobos picture from sol 45 I'd like to ask this:
Could any of the MSL cameras (Mastcam, Navcam, MAHLI) take meaningful night astrophotos?
For instance, according to https://www-robotics.jpl.nasa.gov/publications/Mark_Maimone/fulltext.pdf the Navcams have exposure times of up to 335.5 seconds, and the other technical parameters could be suitable as well, but I really don't know, being no astronomer/astrophotographer at all.

If it's possible, does anybody know if there are already plans for some "night sky campaign"?
Thx

Posted by: fredk Sep 26 2012, 02:54 PM

We talked about nighttime observing with MSL just a few days ago in this same thread, starting with http://www.unmannedspaceflight.com/index.php?s=&showtopic=7418&view=findpost&p=191940

Posted by: pospa Sep 26 2012, 04:07 PM

Thanks, fredk. I'm at UMSF almost daily but this discussion somehow passed without my noticing.
So OK, the fine camera pointing is the bottleneck, more than the optical systems themselves.

Posted by: iMPREPREX Sep 26 2012, 08:09 PM

QUOTE (mcaplinger @ Sep 26 2012, 08:20 AM) *
This is crud on the focal plane. It's been there since before launch, see http://www.msss.com/images/science/mastcam/m100cwb.jpg just above the red sign.


That's it. Thanks. smile.gif

Posted by: Greenish Oct 10 2012, 05:12 PM

I don't post much but am a daily reader of this forum, so I'm curious - I haven't seen many (any?) posts of false-color images generated from the Mastcam filters as opposed to the Bayer RGB images. I can think of a few reasons but am curious which if any of them apply:

1. because not many filter images have been taken yet?
2. because the image filenames don't seem to indicate which filter was used? (or do they? seem to be basically sequence numbers to me plus thumbnail/subframe etc. codes)
3. because the filter images released so far are all very small, and therefore not useful for compositing even with larger images for luminance
4. because the filter images released so far are all very small, and therefore JPG artifacts are so severe they make a mess of any composites
5. no-one really is interested since for the targets imaged so far they wouldn't tell us much anyway?
6. no-one really is interested since the Bayer color images are so good and there are so many there's just not time to do both?
7. some combination of the above?

I mostly ask because it seems like the false-color images are used a lot on the MER side, and because I'm attempting to teach myself how to make composite images, and want to know if these would be worth my while.

Thanks in advance for any info.

Posted by: mcaplinger Oct 10 2012, 05:55 PM

1 doesn't apply, there have been lots of narrowband filter images taken. Any image that shows up gray in full-res form is probably a narrowband image.

2 is definitely an impediment. For the visible narrowband filters you can figure out the bandpass from the tinting of the thumbnail, for the IR filters you have to guess, from the image order or possibly from the number of hot pixels since the farther into the red, the longer the exposure time in general.

3 and 4 don't apply, lots of the images are normal-sized.

5 and 6 are the reasons I haven't done any false-color stuff.

Posted by: ronald Oct 10 2012, 07:05 PM

I did some tests, but the false-color images (using only the red, green and blue filters, no IR) just look like the color images with auto-whitebalance or auto-color-correction (left). By balancing the individual channels I come pretty close to the color images (greys slightly too blue - right). I haven't seen anything fancy in the infrared ones yet.

http://mars.jpl.nasa.gov/msl-raw-images/msss/00044/mcam/0044ML0204000000E1_DXXX.jpg



http://www.redorbit.com/images/pic/63392/curiosity-rovers-mastcam/ gives some hints. For "true" color I get better results with the "nearIR-Red" (left). With the "650-nm-red" the blue edge on the "sundial" turns red/pink like on the MER pancam (right) - no idea why it is this way.



Sometimes the filtered shots are not 100% aligned.
Often the first shot is green, the second blue, the third 650 nm, the fourth 750 nm, then the IRs - but double-check with the thumbnails to be safe. I use these numbers for R-G-B: R 240/30/0 - G 55/255/0 - B 255/0/0.

So #5 for me smile.gif

Posted by: Greenish Oct 10 2012, 10:17 PM

Thanks for the replies. I had overlooked the grayscale-looking full-res images - it makes sense that they are IR (though unknown which band) - I had guessed instead that they were perhaps a way to reduce bandwidth vs. bringing back all images in full RGB.

I agree that it does look like there's a typical order in which the filtered images are taken, though sometimes I think they skip some filters (or maybe they haven't been transmitted yet). I'll have to scrutinize the filenames more carefully to see if there are obviously missing images or skipped filters.

I guess we'll wait & see if they're used more extensively or if certain targets would benefit from the narrowband/multispectral treatment. I sure won't hold my breath for one more letter or number to be added to the filenames - the reasons for not providing metadata have already been hashed out.

I thought of another complication (#7 I guess), though perhaps this has been mentioned elsewhere - for the bands where the RGB Bayer filters overlap the filter band, would the image need to be de-Bayered and the different color pixels adjusted for different responses? If so, this would seem to preclude clean processing of those released narrowband images due to JPG artifacts. Pretty sure I read that the pixel filter responses are all similar in the IR though.




Posted by: mcaplinger Oct 11 2012, 03:46 AM

QUOTE (Greenish @ Oct 10 2012, 03:17 PM) *
I agree that it does look like there's a typical order the filtered images are taken...
for the bands where the RGB Bayer filters overlap the filter band, would the image need to be de-Bayered...

http://www.lpi.usra.edu/meetings/lpsc2012/pdf/2541.pdf has a table that shows the filter numbering for each Mastcam. If you see six grayscale images of the same scene, they are most likely ordered this way. If there are fewer than six, then you will have to do some guessing for the IR bands.

For the narrowband filters that overlap the Bayer pattern, all needed processing is done in the camera unless the image is commanded raw (which may have happened a few times, I haven't looked.)

The abstract referenced above also has a good discussion of what the filters are useful for.

Posted by: Greenish Oct 11 2012, 02:42 PM

Thanks again, mcaplinger, for the clarification and the reference to the as-built filters. Pretty cool that the needed Bayer adjustments are done in the camera.

Would be great if someone could add the above document to the MSL FAQS & USEFUL DOCS thread, MASTCAM is not covered in the listed documents there.

Also the page http://msl-scicorner.jpl.nasa.gov/Instruments/Mastcam/ has the sun-viewing ND440 and ND880 filters switched (L<->R) compared to the above document.

Posted by: mcaplinger Oct 14 2012, 03:25 PM

I've seen quite a bit of difference in how people are doing color balance of the Mastcam and MAHLI images, both here and in press releases.

I don't know how anyone is doing this specifically. Some results seem too gray or red/pink to me. Without something known to be neutral in the image it's hard to do white balance on any direct basis. Even in fairly dusty conditions, I'm skeptical that there's enough tinting by sky scattering to really affect the color of the scene much, especially near mid-day.

The top of the white pyro box where the Mastcam cal target is mounted seems pretty clean in the sol 3 images. To normalize that to neutral, I multiplied the red channel by 1.06 and the blue channel by 1.13 and the results seemed OK. This is a fairly subtle change, but it gets rid of the green cast in the raws; see example below (raw on the left, processed on the right.)
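Expressed as code, that adjustment is just two per-channel gains. A minimal numpy sketch (the 1.06/1.13 values are the ones quoted above; the function name is mine, and applying the gains directly to 8-bit JPEG values rather than linearized data is a simplification):

```python
import numpy as np

def balance_sol3(img):
    """Apply the quoted per-channel gains to an RGB image.

    img: uint8 array of shape (H, W, 3). Multiplying the companded
    8-bit values directly is an approximation; strictly, gains like
    these belong on linearized data.
    """
    out = img.astype(np.float32)
    out[..., 0] *= 1.06  # red gain (normalizes the pyro-box top)
    out[..., 2] *= 1.13  # blue gain
    return np.clip(np.round(out), 0, 255).astype(np.uint8)
```

A neutral gray patch at (100, 100, 100) would come out as (106, 100, 113) - the subtle shift toward red/blue that removes the green cast.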



Posted by: Zeke4ther Oct 14 2012, 05:25 PM

That definitely looks better. The black strut looks 'black' now as well.

Posted by: ronald Oct 14 2012, 06:09 PM

QUOTE (mcaplinger @ Oct 14 2012, 05:25 PM) *
Even in fairly dusty conditions, I'm skeptical that there's enough tinting by sky scattering to really affect the color of the scene much, especially near mid-day.


It would be nice to have a paper/study on this subject. I believe the light on Mars is not a neutral white, and with this in mind I wouldn't calibrate a white surface on Mars to neutral white, at least not to get "true color" results.

Another thing I'm thinking about is exposure time - even at mid-day it would not be as bright on Mars as on Earth, right? So the images are all too bright as well (?).

original - white balanced - wb and darker


Posted by: ugordan Oct 14 2012, 06:53 PM

QUOTE (ronald @ Oct 14 2012, 08:09 PM) *
So the images are all to bright too (?).

No, they're not. If anything, they are all too dark on computer screens. The eye is also amazingly adaptive and can compensate for vast brightness differences without you actually noticing them. I would guess that you wouldn't even notice that the Sun is around 50% dimmer there on average.

Posted by: mcaplinger Oct 14 2012, 07:15 PM

QUOTE (ronald @ Oct 14 2012, 11:09 AM) *
I would believe that the light on mars is not a neutral white...

Why? Do images taken on Earth look blue because the sky is blue? AFAIK, it takes unusual conditions (dust storms or smoke) on Earth to change the color balance of daylight near mid-day. And of course sunlight isn't "white" anyway, so to zeroth order one balances an image so that something that looks white to the eye looks white when the eye looks at the image (more complicated than it sounds.)

This has been analyzed a lot on previous missions, inconclusively IMHO. MSL is the first time we've had broadband filters that approximate the human eye response, so there's some reason to believe that with minor adjustments we can get "natural color". I'm not certain how automatic white-balance algorithms work; see http://en.wikipedia.org/wiki/Color_balance for a start, but ironically, someone has used an MSL image on that page as an example of white balance, and I suspect that image was just tweaked in Photoshop.

Posted by: ronald Oct 14 2012, 07:56 PM

QUOTE (mcaplinger @ Oct 14 2012, 09:15 PM) *
Why? Do images taken on Earth look blue because the sky is blue?

No, of course not. But with a "wrong" white balance they do smile.gif

I thought there was more dust in the Martian atmosphere than under normal Earth conditions? So you would say the light conditions on Mars are the same (or close) as here on Earth (white or grey would be more or less neutral)?

Edit: I'm not questioning your arguments - I would just use a slight off-white instead of a neutral white. And I certainly don't want to split hairs here.

Posted by: mcaplinger Oct 14 2012, 08:32 PM

QUOTE (ronald @ Oct 14 2012, 12:56 PM) *
So you would say the light conditions on Mars are the same (or close) as here on Earth (white or grey would be more or less neutral)?

Short answer: I would guess so under average dust loading (that's what we assumed to compute typical exposure times, etc), but I am not exactly an expert. The idea of "what it would look like" is so slippery as to start to lose engineering meaning.

http://marswatch.astro.cornell.edu/Bell_etal_SkyColor_06.pdf is a nice try from MER, but I'm not convinced it's conclusive, or even sure what question it's trying to answer -- the sky color at the horizon, I think.

Posted by: DDAVIS Oct 14 2012, 09:05 PM


Unfortunately this article is behind a paywall, but the abstract tells the essence of the tale:

http://adsabs.harvard.edu/abs/1999JGR...104.8795T

The ambience of the Martian sky is a factor in the color of the scenes, shaded regions in particular. The effect of the ambient lighting on the changing apparent color of the 'Yogi' rock is mentioned in the article.

I wish there were a stepped grayscale within the field of view like the Vikings had. An abbreviated and/or reduced variation of the Macbeth color charts, such as was used in the camera tests, would be even better. That said, the color balance looks pretty good, and the brightness levels seem justifiably 'brightened' somewhat to optimize the detail recorded.

Posted by: Eyesonmars Oct 14 2012, 09:23 PM

QUOTE (ugordan @ Oct 14 2012, 07:53 PM) *
No, they're not. If anything, they are all too dark on computer screens. The eye is also amazingly adaptive and can compensate for vast brightness differences without you actually noticing them. I would guess that you wouldn't even notice that the Sun is around 50% dimmer there on average.

The hypothetical question "How would our perception of color change on worlds with dimmer or stronger sunshine?" has always interested me.
I've been lucky enough to witness two total solar eclipses in my adult life, both observed under crystal-clear skies and from an elevated location with an unobstructed view to the horizon. The first was observed on Feb 26, 1979 from atop a small mesa near Wolf Point, Montana, with the sun about 25 degrees high. With this question in mind I made note of what the world looked like under Martian (50%), asteroid belt (10%), and Jovian (4%) illumination. As ugordan states, I expected not to notice much until most of the sun was covered. But at 50% illumination the change in color and contrast was significant. It was as though I had a pair of weak sunglasses on. However, when I asked my 3 companions (who had not been paying attention to this) if they noticed it, I got a yes, a no, and a maybe.

Posted by: Deimos Oct 14 2012, 10:40 PM

I tend to be more on the side of expecting the sky to color the terrain, and wanting to see what the scene looks like with a context-appropriate white balance (i.e., using the cal target). Under current dust loads, diffuse sky light exceeds direct sun light throughout the sol. That's never true under a blue sky on Earth, rarely true for other sky colors, and frequently true for gray skies. And, on Earth, the blue sky results in a yellowed Sun. On Mars, the dust slightly reddens (oranges?) the Sun as well as being colored by mineral absorption. That said, I doubt astronauts will come back talking about how much the sky changed their color perception--it's well within adaptability (but that's another debate).

Another small effect that can change the appearance of the same image as interpreted by different people is just that, while the broad filters sorta match the eye's response, the processing (e.g., companding to 8 bits vs. gamma=2.4 vs. perhaps expanding the bits without applying a gamma [?], no accounting for eye vs. monitor color) does not match the sRGB standard, and will not show up as the right color on typical monitors. These differences are small, I think--but can result in slight variations of color balance, even between white balanced images.

Posted by: ngunn Oct 14 2012, 11:00 PM

QUOTE (Deimos @ Oct 14 2012, 11:40 PM) *
On Mars, the dust slightly reddens (oranges?) the Sun


Really? I thought reddish dust would blue the sun. It certainly does at sunset. I think the one highly noticeable difference in the quality of light on Mars would be the colour of the shadows. They'd be brownish, not bluish, because that's the colour of the scattered light there.

Posted by: vikingmars Oct 14 2012, 11:11 PM

QUOTE (DDAVIS @ Oct 14 2012, 11:05 PM) *
- The ambience of the Martian sky is a factor in the color of the scenes shaded regions in particular. .../...
- I wish there was within the field of view a stepped grayscale like the Vikings had.

I agree 100% with Don, who has always been, for us "Martians", the great "Master" of Mars colors since the old Viking times... (and also much respect to Deimos...)

At JPL we did accurate color processing using the 3 grayscale charts positioned on the VL1 deck, with images taken during the Monitor & Long-Term missions. Here are the comparative results: we discovered that the color of the Martian sky changes a lot with the seasons. In fact, NO one can say exactly which color the Martian sky and ground are at a specific time. The saying at JPL was that the Martian soil is a "yellowish-brown" (refer to the JGR issues of 1977) and the sky a "moderate salmon pink"... globally, but NOT on a specific day. Below are accurate comparative processings showing the changes in the color of the Martian sky over the seasons...



Regarding the equalizing of raw images on grey values, please refer to posts #
http://www.unmannedspaceflight.com/index.php?s=&showtopic=7418&view=findpost&p=191996
http://www.unmannedspaceflight.com/index.php?s=&showtopic=7418&view=findpost&p=192007
Enjoy ! smile.gif

PS : Sol 1742 image was taken during a dust storm. See the inverted brightness due to higher concentration of dust in the plane of sight towards the horizon...

Posted by: AndyG Oct 14 2012, 11:24 PM

QUOTE (Eyesonmars @ Oct 14 2012, 10:23 PM) *
...at 50% illumination the change in color and contrast was significant. It was as though I had a pair of weak sunglasses on. However, when I asked my 3 companions (who had not been paying attention to this) if they noticed it, I got a yes, a no, and a maybe.


Interesting. As an ex-BBC camera operator with a vested (professional) interest in correct white balance and lighting in general, my experience of the near-total eclipse of August '99 in Scotland was different: what struck me at near totality was the sharpening of shadows, not, particularly, an overall change in the amount of light. As (I think it was?) Heinlein once said, "a bit of sun is still a lot of light".

For Mars? A smaller sun will mean subtly sharper shadows - I doubt we would notice much in the way of a drop in insolation. And as for white balance, the eye copes with amazing variations all the time: we'd white-balance on any available clues. My gut reaction is that Ronald's central frame in his post above is about right.

Andy

Posted by: stevesliva Oct 14 2012, 11:39 PM

QUOTE (Deimos @ Oct 14 2012, 06:40 PM) *
That said, I doubt astronauts will come back talking about how much the sky changed their color perception--it's well within adaptability (but that's another debate).


It's always fun to get accustomed to ski goggles or tinted sunglasses and then remove them. Pretty easy to become unaware of the weird lighting.

Posted by: mcaplinger Oct 15 2012, 04:45 AM

QUOTE (Deimos @ Oct 14 2012, 03:40 PM) *
Under current dust loads, diffuse sky light exceeds direct sun light throughout the sol.

Mark, do you have a reference for this? I find it very surprising.

It's true that I've ignored the effects of the companding.

Posted by: Deimos Oct 15 2012, 08:32 AM

Smith & Lemmon 1999 (MPF special issue) talks about tau=0.5 implying ~40:60 sky:Sun. I should note that getting to 40:60 or 50:50 accounts for all the sky light, and the relatively bluer light near the Sun can offset the rest of the sky.

With respect to Sun color: it is a subtle (few %) effect, but the optical depth increases with wavelength in the visible in the absence of ice (same ref). The color of the sky near the Sun at sunset is not the same as that of the Sun. On Earth, the coloring of Sun & sky is due to removal of blue light. On Mars, the coloring is all about distribution: diffraction of the blue light keeps it closer, in angle, to the Sun compared to red light. The daytime sky is reddened due to absorption of blue, not the preferential removal of red light.

Posted by: ronald Oct 15 2012, 08:56 AM

QUOTE (AndyG @ Oct 15 2012, 01:24 AM) *
My gut reaction is Ronald's central frame in his post above is about right.

That is mcaplinger's suggested white balance applied, and I agree that this is the right direction to go color-wise.

Posted by: mcaplinger Oct 15 2012, 04:55 PM

QUOTE (Deimos @ Oct 15 2012, 01:32 AM) *
Smith & Lemmon 1999 (MPF special issue) talks about tau=0.5 implying ~40:60 sky:Sun. I should note that getting to 40:60 or 50:50 accounts for all the sky light, and the relatively bluer light near the Sun can offset the rest of the sky.

That would be "Opacity of the Martian atmosphere measured by the Imager for Mars Pathfinder", Smith, Peter H.; Lemmon, Mark, Journal of Geophysical Research, Volume 104, Issue E4, p. 8975-8986.

Doesn't that imply that shadowed regions would be something like half the brightness of directly-illuminated ones, something which is demonstrably not true for, say, MAHLI images with small amounts of shadowing from the arm? For the one image I looked at, the ratio (linearized) was more like 3.2:1. Of course, this is a tricky geometric radiosity problem and I do admit that the shadows are brighter than I expected.
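As a back-of-the-envelope check on that ratio (illustrative arithmetic only, using the 40:60 diffuse:direct split quoted above, and ignoring the geometric radiosity problem):

```python
# Fraction of total illumination that is diffuse sky light vs direct sun,
# per the tau ~ 0.5 figure from Smith & Lemmon 1999 quoted above.
diffuse, direct = 0.4, 0.6

sunlit = diffuse + direct   # a sunlit patch gets sky light plus direct sun
shadowed = diffuse          # an idealized shadow gets the full sky dome only
ratio = sunlit / shadowed
print(ratio)                # 2.5, i.e. shadows at ~40% of sunlit brightness

# To reproduce the ~3.2:1 actually measured in a MAHLI image, the shadowed
# patch would need to see only part of the sky dome:
measured = 3.2
f_sky = sunlit / (measured * diffuse)
print(round(f_sky, 2))      # ~0.78 of the sky visible from within the shadow
```

So an idealized full-sky shadow gives 2.5:1, and a shadow from which the arm blocks roughly a fifth of the sky lands near the measured 3.2:1 - consistent with shadows being brighter than intuition suggests, while still dimmer than the naive sky:Sun split alone implies.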

Posted by: EdTruthan Oct 15 2012, 07:26 PM

Sorry for the long post but if this works as a fairly accurate white balancing trick I wanted to get everybody's take on it and share the technique...

The white balance issue (thanks for bringing it to the forefront, mcaplinger) certainly is a pesky one for imagery geeks like myself. Where there is a calibration target in the image it's a pretty simple task to equalize RGB values based on a gray target. Mcaplinger's general adjustment values are certainly a move in the right direction. But with so many differing landscapes and lighting values based on time of day, sun angle, and atmospheric dust content, and absolutely nothing to accurately base mean gray values on, I've been struggling to find a method of getting an accurate white balance for any given image. This morning I stumbled across a technique that, after some extensive tests, seems to show promise...

I remembered that if one takes a color image (any image, Earth-based or not), copies it into another layer, inverts the color, and reduces the opacity of that inverted upper layer to 50%, the transparent inverted colors cancel out the colors of the original image below, leaving a blank neutral grayscale image. Using this concept and a few tools in Photoshop (requires any CS version) I created custom photo filters (which vary slightly on a per-image basis) that appear to white-balance the Martian atmospheric tinge on calibration targets near perfectly - and so, by extension, one could argue the landscape as well.

Here's the technique I used for the following examples (I suppose one could loosely refer to it as "IBF" or Invert > Blur > Filter):

1. Open an MSL image in Photoshop.
2. Duplicate the "Background" layer. You now have a "Background copy" layer above the original.
3. With the "Background copy" layer selected, choose "Image > Adjustments > Invert", then "Filter > Blur > Average". You should now have a bluish single colored blank layer.
4. Use the eye dropper tool to select this color as your foreground color on the tools palette. Now turn this layer off so you can see the original image.
5. Now select your MSL image layer again ("Background") and choose "Image > Adjustments > Photo Filter..."
6. In the dialog that opens choose the "Color" radio button and click on the default color swatch and assign it with the bluish foreground color you saved on the tools palette.
7. Move the slider to 95% (chosen because at 95% the calibration target gray RGB values come out closest).

Now, if the image you're using is a landscape shot, you're going to notice the color has washed out quite a bit. This is the filter at work. Just do the following:

8. Choose "Image > Adjustments > Hue/Saturation..." and increase the Saturation Slider to about 55-60.

This last adjustment is about where most of the tests I ran on landscapes seemed to restore the color to about the intensity of the original, though admittedly it's arbitrary. In fact, the calibration target samples below only seemed to require a 25-30 increase in saturation, as any higher seemed to oversaturate them. I take this to perhaps be an indication that the farther away the target, the more saturation "recovery" must be applied (due to the extra desaturation effect of the atmosphere?).

Using this technique on an image-by-image basis (i.e. the precise color of the filter varying per the inverted, blur-averaged color of the original) I was able to achieve the following results. As the technique seemed to almost perfectly balance the white, gray, and black levels of the calibration targets, could we then assume that the landscape color values must be similarly accurate? Hmmmm.


http://www.edtruthan.com/mars/Invert-Blur-Filter-Test-Calibration-Target.jpg

http://www.edtruthan.com/mars/Invert-Blur-Filter-Test-MAHLI-Target.jpg

http://www.edtruthan.com/mars/Invert-Blur-Filter-Test-Sol-51.jpg

http://www.edtruthan.com/mars/Invert-Blur-Filter-Test-Hottah.jpg

Posted by: mcaplinger Oct 15 2012, 08:34 PM

QUOTE (EdTruthan @ Oct 15 2012, 12:26 PM) *
Sorry for the long post but if this works as a fairly accurate white balancing trick I wanted to get everybody's take on it and share the technique...

How is this different than any other auto white-balance algorithm? It looks to me like a variant of the standard "gray world" algorithm. http://therefractedlight.blogspot.com/2011/09/white-balance-part-2-gray-world.html

If you want to make the average color of the scene neutral, it works fine, but that may not be what you really want to do.
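For readers unfamiliar with it, the gray-world algorithm linked above reduces to a few lines of numpy. This is a generic sketch (the function name and the [0, 1] float convention are mine), not EdTruthan's exact Photoshop recipe:

```python
import numpy as np

def gray_world(img):
    """Scale each channel so the scene-average color becomes neutral gray.

    img: float RGB array in [0, 1], shape (H, W, 3).
    """
    means = img.reshape(-1, 3).mean(axis=0)   # average color of the scene
    gray = means.mean()                       # target neutral level
    gains = gray / means                      # per-channel corrections
    return np.clip(img * gains, 0.0, 1.0)
```

Forcing the scene average to neutral is exactly the caveat raised here: applied to a mostly ochre landscape, it will overcorrect the whole frame toward blue whether or not the scene is really neutral on average.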

Posted by: ngunn Oct 15 2012, 08:46 PM

It looks (on my monitor, to my eyes, adjusted to current ambient lighting from a tungsten filament lamp) too blue to be true. I'll check again in the morning.

Posted by: ronald Oct 15 2012, 08:49 PM

With Deimos' arguments http://www.unmannedspaceflight.com/index.php?s=&showtopic=7418&view=findpost&p=193320 (40:60 - sky:sun) we are back to a slight color tint from the sky?
I would appreciate this wink.gif (Of course the human eye would counterbalance this if you stay there).

Regarding Ed's approach - all greyer parts in the image (clean parts of the rocks, for example) would come out too blue because of the averaging done in step 3, assuming that most of the image is somewhat yellowish.

Posted by: atomoid Oct 16 2012, 12:08 AM

The calibration target is looking pretty dusty already - is there any means to clean it?
Sorry to pose such a noob question, but how else can the calibration be kept from skewing towards Martian dust tones going forward?

UPDATE: if anyone has similar questions, http://www.unmannedspaceflight.com/index.php?showtopic=7418&view=findpost&p=192015 and the posts surrounding it cleared up a lot of this, since I missed that whole discussion.
Nevertheless, since it will dust up within a couple of months, I guess the color target was only included for a brief post-landing calibration sanity check?
I was envisioning the arm lurching up to brush the dust off whilst doing its best not to clobber critical components...

Posted by: mcaplinger Oct 16 2012, 12:22 AM

QUOTE (atomoid @ Oct 15 2012, 05:08 PM) *
the calibration target is looking pretty dusty already, is there any means to clean it?

In a word, no.

I think people may be confused about the difference between calibration and white balance. Calibration is removing instrument signature. Once it's done, it doesn't need to be done again as long as the instrument stays stable (and there is not much reason for it not to be). White balance is making white things look white in a particular image regardless of whether they would "really look white in reality" (whatever that means). What I was attempting to do was more the former than the latter.

Posted by: EdTruthan Oct 16 2012, 02:38 AM

QUOTE (mcaplinger @ Oct 15 2012, 01:34 PM) *
How is this different than any other auto white-balance algorithm? It looks to me like a variant of the standard "gray world" algorithm. http://therefractedlight.blogspot.com/2011/09/white-balance-part-2-gray-world.html

If you want to make the average color of the scene neutral, it works fine, but that may not be what you really want to do.


Well, you're right, it's a variant of sorts, but with a different mechanism. What I like about the filter approach is that traditional white-balance adjustments - the "Gray World" curves balancing, levels tweaks, and most of the "Auto White Balance" algorithms I've experimented with - directly alter the separate RGB input levels to achieve an assumed neutral gray, but in doing so often alter the white and black intensity levels and contrast. It can get really tricky. The filtering approach doesn't attempt to foundationally alter the existing RGB relationships or drastically change the white or black intensity levels; it just corrects the yellowish cast from the atmospheric light by using its directly inverted counterpart to filter it back toward neutral. That said, though professional graphics is part of my business, I'm certainly no scholar of color science, and the knowledge of many members here is clearly beyond mine. What I do know is that, of the many differing and sometimes subjectively random results from the variety of white-balance routines I've played around with, this seems to be a pretty quick and painless technique, and if the post-filtering calibration target grays and whites are any evidence, it offers a reasonably acceptable quantum of accuracy too. As you said, though... if "neutral" is the goal, that is.

Posted by: vikingmars Oct 16 2012, 08:26 AM

QUOTE (Deimos @ Oct 15 2012, 12:40 AM) *
I tend to be more on the side of expecting the sky to color the terrain

I totally agree with Deimos.
When you look carefully at the VL1 images shown here (I know it's a kind of "old" visual science for some bloggers here, BUT let's go "back to the basics"!),
http://www.unmannedspaceflight.com/index.php?s=&showtopic=7418&view=findpost&p=193322
you will easily notice that on Sols 1520 and 1557, with maximum sky opacity, the terrain is much brighter than on Sols 1298 and 2001 (of course, all images were taken with the same gain and offset).
Besides, like Don, I think that there is still good science to be retrieved from VL images...
So, in my opinion, Deimos is absolutely right in his sayings.

Posted by: ronald Oct 18 2012, 10:42 AM

I'm still somewhat puzzled by the brightness of the images. There is a nice http://mars.jpl.nasa.gov/msl-raw-images/proj/msl/redops/ods/surface/sol/00071/opgs/edr/ncam/NRA_403797280EDR_F0050104NCAM00526M_.JPG from sol 71 with the rover and surface both in one image. As you can see below, I compared this to an http://mars.jpl.nasa.gov/msl-raw-images/msss/00061/mcam/0061ML0279001000D1_DXXX.jpg (green filter). In the middle is a brightness-corrected version matched to the average surface brightness of the sol 61 image (and most other images).



As you can see, the rover would now be too bright - so, in reverse...



This is mcaplinger's color correction set to about 50%, with somewhat reduced brightness (not linear, as the highlights are still in, and not by as much as in the upper comparison).


Posted by: Ant103 Oct 18 2012, 11:07 AM

Ronald, you just can't make deductions based on STRETCHED Navcam pictures. The big difference between Navcam and Mastcam is that the Mastcam pics are not stretched.

But I can't seriously see what the debate is about. This is just a question of white balance. Even with a camera (compact, bridge or reflex, whatever) you can adjust it (sunlight, shadow, cloudy, tungsten, flash, etc.), and this can lead to a reddish or a bluish picture. So I guess the Mastcams are tuned to a sunlight white balance (maybe around a temperature of 5200 K).

For me, as a photographer, they are just correctly white balanced, and it's normal that there is some lack of contrast, especially with Mastcam 100, because it's a telelens and there is a lot of glass between the sensor and the subject. And most of the time we have scenery imaged with a very high sun; this leads to a lack of shadowing and hence of contrast.

Posted by: ronald Oct 18 2012, 01:05 PM

I can smile.gif



Even in the stretched image above, the surface is much brighter. It is not my ambition to get something 100% correct; I'm just after a rule of thumb for how to interpret the raw images.

Posted by: ugordan Oct 18 2012, 02:49 PM

Mastcam images should be your reference point, not navcams. The former use a square-root encoding that matches the sRGB gamma pretty well, while navcam images seem to me to be returned in linear A/D-converted form, which on computer screens makes the contrast look enhanced. This is in addition to the raw stretch that makes the darkest areas black and the brightest areas white.

Posted by: mcaplinger Oct 18 2012, 04:12 PM

QUOTE (Ant103 @ Oct 18 2012, 04:07 AM) *
So, I guess that the Mastcams are tunned on sunlight white balance (maybe around a temperature of 5200 K).

There is no color balancing done in the camera at all. The CCD signal, with the IR cut and Bayer pattern filter throughput and the detector quantum efficiency, is directly converted, run through the square-root encoder, interpolated, compressed, and sent to the ground. No attempt was made to balance this, although because of the way it works out, it is pretty well-balanced for sunlight. For a terrestrial sunlit scene, the raw image is just a little bit greenish, which is more or less what my analysis above says.

As for brightness, perceived brightness is an even slipperier concept than perceived color. Most of these images are auto-exposed to mostly fill the 11-bit histogram prior to companding. What that means relative to how they "should look" is as much a matter of taste as anything else.
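The square-root companding mentioned above can be sketched generically. The exact in-camera lookup table isn't given in this thread, so the 11-bit to 8-bit mapping below is only the textbook form of the idea (as ugordan notes, its shape is close to the sRGB gamma):

```python
import numpy as np

def sqrt_encode(dn11):
    """Compand a linear 11-bit DN (0..2047) down to 8 bits (0..255)."""
    dn11 = np.asarray(dn11, dtype=np.float64)
    return np.round(255.0 * np.sqrt(dn11 / 2047.0)).astype(np.uint8)

def sqrt_decode(dn8):
    """Approximately invert the companding back to a linear 11-bit DN."""
    x = np.asarray(dn8, dtype=np.float64) / 255.0
    return np.round(2047.0 * x * x).astype(np.int64)
```

Running pixel values through `sqrt_decode` before doing arithmetic (per-channel gains, shadow ratios) is what "linearized" means in the posts above; doing such arithmetic on the companded 8-bit values is only an approximation.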

Posted by: ronald Oct 18 2012, 05:21 PM

Absolutely right regarding perceived brightness and colors! Although I was thinking more from a photographer's point of view.
Thanks for the hint about the auto-exposure; this explains why the surface is darker when the bright rover top is in view, giving the camera something near white to work with and hence a "good" auto-exposure. When no bright parts are in view, the auto-exposure does a very good job for scientific purposes, but the result is maybe a bit too bright overall (for my taste it is smile.gif).

Edit: Just to illustrate my thinking - compare these two images (same sol, same time): http://mars.jpl.nasa.gov/msl-raw-images/msss/00050/mcam/0050ML0230007000E1_DXXX.jpg http://mars.jpl.nasa.gov/msl-raw-images/msss/00050/mcam/0050ML0230013000E1_DXXX.jpg
And please don't get me wrong - I'm just curious and do enjoy all the fine images we got so far!

Posted by: fredk Oct 18 2012, 05:56 PM

QUOTE (mcaplinger @ Oct 18 2012, 04:12 PM) *
it is pretty well-balanced for sunlight. For a terrestrial sunlit scene, the raw image is just a little bit greenish
This is the crucial point. To get a proper sense of how accurate the mastcam colours are, I'd like to see some of those terrestrial sunlit scenes, treated in the same way as the Mars images. Then we could see for ourselves just how "greenish" the Mars images are. What's "a little bit greenish" to some might be substantially shifted to others.

The outdoor terrestrial views on http://www.msss.com/science/msl-mastcam-pre-launch-images.php are "colour balanced", so unfortunately we don't know what the originals were like. I don't know of any public, unbalanced, outdoor, terrestrial mastcam images.

QUOTE (mcaplinger @ Oct 18 2012, 04:12 PM) *
perceived brightness is an even slipperier concept than perceived color... What that means relative to how they "should look" is as much a matter of taste as anything else.
True enough. But it is still an interesting question to ask "what would some Martian soil sprinkled on a sheet of white paper look like". My sense from MSL and years of MER images, where white rover parts are visible in the same frame as Martian ground, is that the answer is "brown" or "cinnamony brown", rather than orange or bright orange.

Of course, when we present an image showing only the Martian ground, displaying it as brown would mean we lose detail in the darker regions, so it makes sense to brighten it so it looks like some kind of orange. What we'd actually perceive standing on Mars looking at the ground is a whole other question, and would probably depend on what else is in the field of view, how long we've been there, etc...


Edit:
QUOTE (ronald @ Oct 18 2012, 05:21 PM) *
Just to illustrate my thinking - compare these two images (same sol, same time)
Another good comparison is this pair of images, taken from and looking at the same location, but at different times of day:
http://mars.jpl.nasa.gov/msl-raw-images/msss/00060/mcam/0060MR0272002000E1_DXXX.jpg
http://mars.jpl.nasa.gov/msl-raw-images/msss/00066/mcam/0066MR0293002000E1_DXXX.jpg
The degree of colour saturation seems to depend on the angle of sunlight.

Posted by: atomoid Oct 19 2012, 12:24 AM

Ronald's links reveal from the shadows that the sun is likely shining somewhat into the lens in http://mars.jpl.nasa.gov/msl-raw-images/msss/00050/mcam/0050ML0230007000E1_DXXX.jpg.
Aesthetically speaking, the auto-exposure seems to have darkened it too much (my eyes get similarly brightness-averse when I look towards the sun), but the sky itself actually seems to be about the same brightness, and it has been noticeably green-shifted compared to http://mars.jpl.nasa.gov/msl-raw-images/msss/00050/mcam/0050ML0230013000E1_DXXX.jpg where the sun is at left.
Perhaps the exposure algorithm did what was expected; might IR wavelengths have a more pronounced effect on the MSL exposure system at this sun angle?

Posted by: mcaplinger Oct 19 2012, 05:57 AM

QUOTE (fredk @ Oct 18 2012, 10:56 AM) *
What's "a little bit greenish" to some might be substantially shifted to others.

When I said a little bit greenish, I meant that the Neutral 5 square on the Macbeth chart was this color (linearized, averaged, and rescaled to 8 bits):



#657761. Of course, the Macbeth neutral values are only neutral under CIE Illuminant C, and these images were taken in bright sunlight (more like Illuminant B), so there might be some small departure from neutral.
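For anyone wanting to quantify that cast, a bit of arithmetic on the quoted patch value is enough -- this is a rough illustration of what "a little bit greenish" means numerically, not part of any calibration pipeline:

```python
# Rough arithmetic on the Neutral 5 patch colour quoted above (#657761).
# A truly neutral grey would have R == G == B; the excess of G over the
# mean of R and B is one crude measure of the green cast.
r, g, b = 0x65, 0x77, 0x61          # 101, 119, 97
green_excess = g / ((r + b) / 2)    # green is roughly 20% above the R/B mean
print(f"R={r} G={g} B={b}, green excess ~ {green_excess:.2f}x")
```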

Posted by: vikingmars Oct 19 2012, 09:33 AM

Martian sky color palette (10° above horizon). Enjoy smile.gif


Posted by: Zeke4ther Oct 22 2012, 08:30 PM

Found an interesting article on Slashdot about how NASA/JPL is able to deliver all of the cool images and data we use here on UMSF.

http://opensource.com/life/12/10/NASA-achieves-data-goals-Mars-rover-open-source-software

Since it deals with software and systems, I thought I would share it here.
Admins, if you think of a better place, please let me know. smile.gif

Posted by: jmknapp Oct 23 2012, 10:22 AM

Curious thing about this MAHLI image:

http://mars.jpl.nasa.gov/msl/multimedia/raw/?rawid=0058MH0032000020R0_DXXX&s=58

The time that image was taken is given as 2012-10-04 22:38:31 UTC (sol 58). According to my calculations, that was about 46 minutes after sunset on sol 58 (21:52). I cross-checked that with Eyes on the Solar System. Yet the scene seems fairly well lit with directional light--so is that the normal nature of the sky glow at dusk on Mars? Or maybe there's an error?

Posted by: ronald Oct 23 2012, 11:24 AM

I would think this is the auto-exposure doing its job (or an exposure chosen to get as much tonal information into the image as possible). In the lower half you can see some grazing light from the low sun angle (?), so the time given seems to be right. If you turn down the brightness it looks like this:


Still, it is interesting what the lighting is like at such low sun angles.

Posted by: vikingmars Oct 23 2012, 12:45 PM

Dust on the color calibration target on Sol 72...
http://mars.jpl.nasa.gov/msl-raw-images/msss/00072/mcam/0072ML0556000000E1_DXXX.jpg
Enjoy (if I may say...)



(And also refer to post # http://www.unmannedspaceflight.com/index.php?s=&showtopic=7418&view=findpost&p=191996 )

Posted by: mcaplinger Oct 23 2012, 01:11 PM

QUOTE (jmknapp @ Oct 23 2012, 03:22 AM) *
Yet the scene seems fairly well lit with directional light?

Maybe the LEDs were being used?

Posted by: Greenish Oct 23 2012, 01:47 PM

Looks like a bunch of full-res narrowband filtered image sequences were just released from sol 72. Now I understand more fully what mcaplinger said back in http://www.unmannedspaceflight.com/index.php?s=&showtopic=7418&view=findpost&p=193069 about greyscale full res images being filtered ones.

The thumbnails appear colored (i.e. not post-processed onboard to account for the filters, so for example the 525 nm thumbnails look green) but the full res versions appear greyscale (i.e. bayer-interpolated and compensated for the different pixel responses to the narrowband filter, though the JPG files are still color files). Pretty neat, and makes it easy to identify the sequences when sorting by time taken or filename.


Posted by: Deimos Oct 23 2012, 02:19 PM

The MAHLI image doesn't really look LED-illuminated. 46 minutes after sunset, the western sky is still relatively bright, while the eastern sky is much darker. I'd guess that is all that is going on (along with the much longer than daytime-normal exposure time).

Posted by: Paul Fjeld Oct 23 2012, 02:47 PM

QUOTE (jmknapp @ Oct 23 2012, 06:22 AM) *
Curious thing about this MAHLI image:...
....so is that the normal nature of the sky glow at dusk on Mars? Or maybe there's an error?

I vote error. The more global (diffuse) illumination shouldn't cause such a sharp distinction between light and dark.

Posted by: fredk Oct 23 2012, 03:03 PM

A time error of around an hour would be a serious error.

Can someone identify the context of that image - it looks like a trench? If the trench is aligned roughly east-west, that might concentrate the bright, western sky into a smallish region as viewed from the trench floor, and so sharpen the shadows somewhat.

Otherwise, as Deimos says, the sky really will be quite a bit brighter in the west than in other directions 45 minutes after sunset, and maybe this lighting is what we expect. It would be good to see other images taken at similar local times - can anyone find some?

Posted by: jmknapp Oct 23 2012, 03:25 PM

QUOTE (Deimos @ Oct 23 2012, 09:19 AM) *
the western sky is still relatively bright, while the eastern sky is much darker


Maybe that effect is especially pronounced on Mars, given its very thin atmosphere, so the light might be diffused less, or something like that.

For example, this was taken a few minutes before sunrise:

http://mars.jpl.nasa.gov/msl/multimedia/raw/?rawid=NLA_401372570EDR_F0042002NCAM00517M_&s=44

Posted by: ronald Oct 23 2012, 05:55 PM

Maybe the http://www.unmannedspaceflight.com/index.php?s=&showtopic=7418&view=findpost&p=193323 at Mars makes a difference too (sharper light/shadows)?

Impressive what is hiding in the dark ...



Navcam is always good for some gems: http://mars.jpl.nasa.gov/msl-raw-images/proj/msl/redops/ods/surface/sol/00071/opgs/edr/ncam/NLA_403799249EDR_F0050104NCAM00528M_.JPG.

Posted by: fredk Oct 23 2012, 06:11 PM

QUOTE (ronald @ Oct 23 2012, 05:55 PM) *
Maybe the http://www.unmannedspaceflight.com/index.php?s=&showtopic=7418&view=findpost&p=193323 at Mars makes a difference too (sharper light/shadows)?
That won't matter for Joe's trench shot, since according to Joe the sun had set 46 minutes before the picture was taken.

Posted by: Deimos Oct 24 2012, 02:03 AM

If you back up to see all of the imaging done that sol, it becomes apparent what happened. The image was created after sunset. That does not mean an image was taken then. The image subframe is paired with a range map--meaning it is a z-stack. The raw images & thumbnails that look like it have time stamps ~2 hours earlier. Still late, but not dramatically so.

Posted by: jmknapp Oct 24 2012, 02:14 PM

Thanks Deimos--that helps to interpret the time stamp.

Posted by: elakdawalla Oct 24 2012, 03:50 PM

Ah -- it never occurred to me that the Z-stacks would show time stamps different from the original images', but if they're being made hours after the original images were taken, that makes perfect sense. Thanks for the explanation!

Posted by: Bjorn Jonsson Oct 24 2012, 04:42 PM

QUOTE (fredk @ Oct 23 2012, 03:03 PM) *
Otherwise, as Deimos says, the sky really will be quite a bit brighter in the west than in other directions 45 minutes after sunset, and maybe this lighting is what we expect. It would be good to see other images taken at similar local times - can anyone find some?

Another thing to keep in mind is that unlike the sky on Earth, the brightness of the Martian sky comes mainly from Mie scattering (dust) and not Rayleigh scattering (air). Mie scattering results in a less uniform brightness; in particular, there is a bigger difference between the brightness of the sky towards the sun and the brightness away from the sun. So the brightness of the sky 'behaves' differently from the Earth's sky as the sun moves across the sky and then sets.

A fisheye view of the Martian sky would be interesting in my opinion...

Posted by: Ant103 Oct 24 2012, 05:41 PM

QUOTE (Bjorn Jonsson @ Oct 24 2012, 06:42 PM) *
A fisheye view of the Martian sky would be interesting in my opinion...


Agreed, with the force of 9001 suns!

Why not have a fisheye camera pointed directly toward the zenith? Simple to make.

Posted by: vikingmars Oct 24 2012, 07:27 PM

QUOTE (Ant103 @ Oct 24 2012, 07:41 PM) *
Agreed with the force of 9001 suns! Why not have a fisheye camera pointed directly toward the zenith? Simple to make.

Yes, you are right, Ant103: this is a VERY GOOD IDEA.
In fact, it was envisioned for the MERs as the "SunCam" camera experiment: a fisheye lens pointed towards the zenith. In the end they decided to drop the experiment and use the slot for the descent camera instead, and the surveying of the Sun was to be done through a special filter by the PanCam... Enjoy smile.gif

Posted by: mcaplinger Oct 24 2012, 07:40 PM

QUOTE (vikingmars @ Oct 24 2012, 12:27 PM) *
Yes : you are right : it was the "SunCam" camera experiment that was envisioned for the MERs (fisheye lens pointed towards the zenith).

The as-built Suncams would have had ND5 filters and been basically useless for sky imaging. Optically they were the same as the Navcams and were to be mounted on the HGA. I'm not sure if earlier concepts used fisheye lenses or not.

At any rate, we don't have fisheye zenith-facing cameras so if you want global sky color it has to be done with Mastcam. A zenith to horizon sweep at some azimuthal sampling would be an interesting place to start. Not sure if there is much interest among the science team in this sort of thing and they don't let me push the button.

Posted by: vikingmars Oct 24 2012, 07:55 PM

QUOTE (mcaplinger @ Oct 24 2012, 09:40 PM) *

Dear mcaplinger, here are some more detailed info :
The as-built Suncams would have had ND5 filters (==> yes) and been basically useless for sky imaging (==> yes).
Optically they were the same as the Navcams (==> only for the electronics that were re-used for the DesCam) and were to be mounted on the HGA (==> no, on the deck and only 1 per rover). I'm not sure if earlier concepts used fisheye lenses or not (==> yes : a real fisheye lens)

Posted by: mcaplinger Oct 24 2012, 08:02 PM

QUOTE (vikingmars @ Oct 24 2012, 12:55 PM) *
(==> yes : a real fisheye lens)

All I know is what I read on the Internet: http://www.google.com/url?sa=t&rct=j&q=mer%20suncam&source=web&cd=12&ved=0CCUQFjABOAo&url=http%3A%2F%2Fdocuments.clubexpress.com%2Fdocuments.ashx%3Fkey%3D6eYwvCwIEPnOmuduk8LtXXRirZw21hgRvjdwtl%252FE168%253D&ei=FkKIULq2NqaoywG8loCoDg&usg=AFQjCNHCfh-RDqXTxtBdY2espMGqK4Hu-A&cad=rja

QUOTE
Originally, the direction of the Sun was to be measured by a single special
camera on each rover called the SunCam. It was to be mounted next to the
antenna dish. It was to have the same type of lens as the NavCam, except an extre-
mely strong neutral-density filter would be placed in front. However, as so often
happens when building spacecraft, the rovers were exceeding their weight
budget. Although the SunCams had already been built, to reduce weight they
were deleted. Their function was taken over by the PanCams.


Posted by: vikingmars Oct 24 2012, 08:17 PM

QUOTE (mcaplinger @ Oct 24 2012, 10:02 PM) *
All I know is what I read on the Internet: http://www.google.com/url?sa=t&rct=j&q=mer%20suncam&source=web&cd=12&ved=0CCUQFjABOAo&url=http%3A%2F%2Fdocuments.clubexpress.com%2Fdocuments.ashx%3Fkey%3D6eYwvCwIEPnOmuduk8LtXXRirZw21hgRvjdwtl%252FE168%253D&ei=FkKIULq2NqaoywG8loCoDg&usg=AFQjCNHCfh-RDqXTxtBdY2espMGqK4Hu-A&cad=rja

What I understood then is that a 150° FOV was envisioned vs. a 120° FOV for the Hazcams (as per the diagram above), and that there were additional costs to bear versus scientific benefits that could be fulfilled by the PanCam instead...

Posted by: elakdawalla Oct 24 2012, 09:49 PM

QUOTE (mcaplinger @ Oct 24 2012, 12:40 PM) *
...if you want global sky color it has to be done with Mastcam. A zenith to horizon sweep at some azimuthal sampling would be an interesting place to start. Not sure if there is much interest among the science team in this sort of thing and they don't let me push the button.

I'm quite sure that our UMSF friend Deimos would have much interest in this sort of thing. smile.gif It's just up to him to convince enough people on the rest of the team that it's worth doing. I suspect the ground-staring geologists find his interest in the sky a bit odd, but the stuff that comes out of atmospheric or astronomical observations is often among the images best loved by the public.

Posted by: Explorer1 Oct 24 2012, 09:56 PM

Oh yes, that reminds me; we still haven't seen Earth from Gale yet! It should be a good view around sunrise, given the current geometry...

Posted by: Deimos Oct 25 2012, 01:55 AM

On sky images: almost there. A few approvals are needed to do it right; the sequences proposed are not too unlike what mcaplinger outlines.

On Earth: it is possibly too close. MER images of Earth were all taken at a much bigger separation. With moderately high tau, Earth low, and the sun not far below the horizon, the sky may be too bright. Later, and Earth is lower and dimmer; earlier, and the sky is brighter. Then there's the absence of a panchromatic filter: the IR filters have low response, the Bayer filters are not optimal for a point source, and Navcam is not just lower resolution but also much lower response. Anyway, that's just setting expectations--it's not to say I don't want to do it (and soon).

Posted by: fredk Oct 25 2012, 02:50 AM

On the plus side, mastcam100 has a 10 mm aperture (at f/10), vs 2 mm for pancam (at f/20), if I've got my numbers right. So MR should do considerably better imaging a point source (Earth) in a brightish sky, ignoring all the other negative differences Deimos mentioned.
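A quick back-of-envelope check of that comparison: for a point source, collected flux scales with clear-aperture area (the f-numbers matter for the sky background per pixel, not for the point-source total). The aperture figures below are simply the numbers quoted in the post, treated as assumptions:

```python
# Point-source light-gathering ratio, Mastcam-100 vs Pancam, using the
# aperture diameters quoted in the post above (assumed, not official specs).
import math

def aperture_area(diameter_mm):
    """Clear-aperture area in mm^2."""
    return math.pi * (diameter_mm / 2) ** 2

mastcam100 = aperture_area(10.0)   # 10 mm aperture (at f/10)
pancam = aperture_area(2.0)        # 2 mm aperture (at f/20)
ratio = mastcam100 / pancam        # (10/2)^2 = 25x for a point source
print(f"Mastcam-100 collects ~{ratio:.0f}x the point-source light of Pancam")
```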

Posted by: jmknapp Oct 31 2012, 11:46 PM

In the recent telecon they showed a raw mastcam image showing a scene "as it would look on Mars" side-by-side with a color-corrected version of how it would look on Earth:



I tried to reverse-engineer that color correction in Adobe Lightroom and got this on another test image:



Maybe someone could do a more exact job of it, or come up with a Photoshop filter. The telecon image looks like it might be a Lightroom crop itself. Anyway, I attached a zip file with a Lightroom preset to get fairly close at least with the example image. In LR if you right click on presets you can import a preset file.

 mars_lightroom3.zip ( 988bytes ) : 310

Posted by: mcaplinger Nov 1 2012, 12:06 AM

QUOTE (jmknapp @ Oct 31 2012, 04:46 PM) *
In the recent telecon they showed a raw mastcam image showing a scene "as it would look on Mars" side-by-side with a color-corrected version of how it would look on Earth.

Who's "they"? There are people on the project I would trust to do this right, and people I, well, wouldn't.

Posted by: maschnitz Nov 1 2012, 12:14 AM

Whoever produced http://www.nasa.gov/mission_pages/msl/multimedia/pia16174.html.

It was listed under http://www.nasa.gov/mission_pages/msl/telecon/20121030.html (as Vaniman 1), but someone else might've produced it.

Posted by: mcaplinger Nov 1 2012, 12:38 AM

QUOTE (maschnitz @ Oct 31 2012, 05:14 PM) *
Whoever produced http://www.nasa.gov/mission_pages/msl/multimedia/pia16174.html.

Unless it's explicitly described otherwise, this looks like it was thrown into Photoshop and tweaked until it "looked good", which is a long way from a real analysis IMHO.

Posted by: jmknapp Nov 1 2012, 02:10 AM

This is what David Vaniman said at the telecon: "The image on the left is as it looks illuminated in the Mars atmosphere. The image on the right is color-corrected to show you how it would look in your backyard here on Earth."

Posted by: mcaplinger Nov 1 2012, 02:49 AM

QUOTE (jmknapp @ Oct 31 2012, 07:10 PM) *
This is what David Vaniman said at the telecon: "The image on the left is as it looks illuminated in the Mars atmosphere. The image on the right is color-corrected to show you how it would look in your backyard here on Earth."

Maybe I should just shut up, but I'm a purist and can't let it go. When somebody gives me evidence that they know how the camera responds on the basis of ground cal data (which usually involves talking to me) and they show knowledge of the details of color science, then I'll believe they've accurately white-balanced an image. Otherwise they're just approximating or worse.

Posted by: jmknapp Nov 1 2012, 03:13 AM

OK, put it this way--do you think there could in principle be a generic image filter to convert a mastcam image from Mars light to backyard light (more or less) or would each lighting situation be too different to even attempt this?

Posted by: mcaplinger Nov 1 2012, 03:55 AM

QUOTE (jmknapp @ Oct 31 2012, 08:13 PM) *
do you think there could in principle be a generic image filter to convert a mastcam image from Mars light to backyard light (more or less)

"More or less"? To zeroth order sunlight on Mars is just the same color as sunlight on Earth. I'm not convinced we really know what the global sky color and its effect on scene color is, and it is certainly variable as a function of tau and time of day. And people seem unaware that the raw camera images aren't precisely color-balanced for any particular illumination anyway.

If someone wants to say that they white-balance an image in Photoshop because they like the way it looks, that's fine, but they really shouldn't represent that as "what the scene would look like on Earth" without a good deal more work that I see no evidence of anyone having done. (Personally I think a more interesting question is what the scene "really looks like", not what it would look like under some hypothetical light source that it's never going to be under and which is always going to involve some assumptions.)

And that's my last word on the subject, lest I start to sound like a broken record.

It is not enough to discover how things seem to seem.
We must discover how things really seem.
-- Niels Bohr

Posted by: ronald Nov 1 2012, 10:27 AM

My first thought when I heard that "like in your backyard" statement at the telecon was: there is an expert talking about a subject he's not an expert in. Most auto-correction algorithms will give you a similar result (below is auto white balance in GIMP, but PS would do roughly the same).


I'm with mcaplinger on this rolleyes.gif

Edit: Added the http://mars.jpl.nasa.gov/msl-raw-images/msss/00055/mcam/0055ML0254000000E1_DXXX.jpg image pair too.
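For anyone curious what such an auto correction does under the hood, here is a minimal grey-world white balance sketch in Python/numpy -- a generic stand-in for the idea, not GIMP's or Photoshop's actual algorithm:

```python
# Grey-world auto white balance: assume the scene averages to grey, and
# scale each channel so its mean matches the overall mean.
import numpy as np

def gray_world(img):
    """img: float array (H, W, 3) in [0, 1]. Returns the balanced image."""
    means = img.reshape(-1, 3).mean(axis=0)   # per-channel means
    gain = means.mean() / means               # per-channel correction gains
    return np.clip(img * gain, 0.0, 1.0)

# Synthetic reddish-tinted flat field: after correction the channels agree.
tinted = np.ones((4, 4, 3)) * np.array([0.8, 0.5, 0.4])
balanced = gray_world(tinted)
```

On a strongly coloured scene like Mars, of course, the grey-world assumption itself is wrong -- which is rather the point of the discussion above.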

Posted by: jmknapp Nov 1 2012, 10:39 AM

Auto-color is not really that similar to what was presented. Here's Vaniman-1 next to a PS auto-color treatment--too blue!



Seems like GIMP auto-color is comparable.

In an early press conference Mike Malin said that beige is a predominant color.

Posted by: ronald Nov 1 2012, 11:11 AM

Reminds me of a job where we were supposed to color the backgrounds of images http://www.google.com/search?q=taupe&hl=de&safe=off&prmd=imvns&tbm=isch&tbo=u&source=univ&sa=X&ei=EViSUO7nF8HWtAaF5oDgDg&ved=0CDcQsAQ&biw=1920&bih=1003 (in the end it was a dark blue the client wanted ...) - so it goes with http://www.google.com/search?q=beige&hl=de&safe=off&prmd=imvns&tbm=isch&tbo=u&source=univ&sa=X&ei=EViSUO7nF8HWtAaF5oDgDg&ved=0CDcQsAQ&biw=1920&bih=1003 smile.gif

Posted by: fredk Nov 3 2012, 03:25 PM

There's a series of daytime sky images from mahli on sol86. My first guess was flatfield images, but I don't see why they'd take so many. Some seem to have the lens cover on - notice the large out-of-focus dark smudge in the lower left corner of this frame:
http://mars.jpl.nasa.gov/msl-raw-images/msss/00086/mhli/0086MH0092000001C0_DXXX.jpg

Posted by: EdTruthan Nov 3 2012, 07:06 PM

QUOTE (elakdawalla @ Nov 2 2012, 11:19 AM) *
One general comment, not just to you but to everyone doing image processing work: I suggest processing out the "schmutz" on each camera. .... They're especially distracting in stereo images. If you're going to spend hours and hours building and blending seamless mosaics, dust 'em off a bit first!

That's a really good point Emily, especially for anaglyph work, and looking closely there's an awful lot of "schmutz" on the lens. I created a Photoshop automation for myself in CS3 to do the task and thought it might be of use to others, so here it is. It must be run on totally raw 1648x1200 px MAHLIs, as the automation relies on that pixel size to identify the cloning positions. In brief, it encircles all the "schmutz" with a lasso set at a 1-pixel feather, moves the lasso selections down 7 pixels (the diameter of the larger spots), copies the selected areas to a second layer, moves that layer back up 7 pixels to mask the dirt, then flattens the image, crops out the black borders and a few rows of digital artifacting at the edges, and saves the file out as an uncompressed, ready-to-mosaic 1596x1192 px JPG. Each frame takes about 32 seconds on my computer. The only caveat is that on pictures of the rover there's always the potential that it will move a light (or dark) area into a dark (or light) area, actually adding some "schmutz" -- but that will only occur in the rare instance where a light/dark border falls within the 7-pixel clone offset, which after analyzing a batch of processed frames seems a very rare occurrence. Compared to the broad benefit of frame-wide "de-schmutzing", that seems well worth the occasional instance. Instructions on how to install and use the "MAHLI-Dust-Off.atn" file are in the zip. You'll need a CS version of PS; I've only tested it on CS3, but I assume it will work on other CS versions.

Here's a http://www.edtruthan.com/mars/Sol84-Schmutzed.jpg... and here's a PS automated http://www.edtruthan.com/mars/Sol84-De-Schmutzed.jpg.
Download http://www.edtruthan.com/mars/MAHLI-Dust-Off.zip

(P.S. And remember, as per the included instructions, before running the automation, create a folder on your DESKTOP named "(-)" (without the quotes, i.e. left parenthesis, dash, right parenthesis only) as the automation expects it to be there to receive the processed frames, otherwise you may get an error.)
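For those without Photoshop, the same clone-from-7-pixels-below trick can be sketched in a few lines of Python/numpy. The blemish coordinates here are made up for illustration; a real spot list would be measured from the MAHLI frames, as Ed did:

```python
# Clone-shift "de-schmutz": replace each blemish box with the pixels a
# fixed offset below it, mimicking the Photoshop action described above.
import numpy as np

SHIFT = 7  # clone-source offset in pixels (diameter of the larger spots)

def patch_spots(img, spots):
    """img: (H, W) or (H, W, 3) array. spots: list of (row, col, h, w) boxes.
    Each box must sit at least SHIFT + h rows above the bottom edge."""
    out = img.copy()
    for r, c, h, w in spots:
        out[r:r + h, c:c + w] = img[r + SHIFT:r + SHIFT + h, c:c + w]
    return out

# Toy example: a dark 2x2 speck on a flat grey frame gets cloned over.
frame = np.full((20, 20), 128, dtype=np.uint8)
frame[5:7, 5:7] = 0
clean = patch_spots(frame, [(5, 5, 2, 2)])
```

It shares the Photoshop action's caveat: if a light/dark boundary falls within the clone offset, the patch itself becomes visible.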

Posted by: walfy Nov 3 2012, 08:59 PM

QUOTE (EdTruthan @ Nov 3 2012, 11:06 AM) *
...I created a Photoshop automation for myself...


ohmy.gif Oh, my, that is brilliant! Of course, the specs are in the exact same place! Easy to create an Action to remove them. Thank you for providing this.

Edit: Just tested it in CS5 -- works perfectly.

Posted by: Astro0 Nov 4 2012, 06:34 AM

Ed, that's great. Took about 3-4 seconds on my computer for each file. Nice smile.gif

Had to make one small change to the 'save' section of the Action, as it was coming up with an error while trying to save to the folder on the desktop. Simple edit.

I'm placing your tool in the MSL FAQs.


Posted by: Zelenyikot Nov 6 2012, 10:48 PM

I noticed that the "schmutz" on the Mast cameras appeared after "the contamination test".
http://www.msss.com/news/index.php?id=14

Is my guess true? And I wonder: what was "the contamination test"?

before
http://www.keepme.ru/upload/images/2012/11/07/00a004d6fe72d98aaf8905cc5f47245b.jpg

after
http://www.keepme.ru/upload/images/2012/11/07/bc47a872e6d5f3d804ba27b2c2b457cf.jpg

Posted by: mcaplinger Nov 6 2012, 11:23 PM

QUOTE (Zelenyikot @ Nov 6 2012, 03:48 PM) *
Is my guess true? And I wonder: what was "the contamination test"?

Is what true? The contamination test referred to was a very sensitive survey for outgassing products, having nothing to do with the crud on the focal plane. I've talked about the crud on the focal plane several times already.

Posted by: ronald Nov 9 2012, 09:04 PM

Puzzled by Greenishs http://www.unmannedspaceflight.com/index.php?s=&showtopic=7495&view=findpost&p=194299 I did compare MSL with MER filter images:


MER 257 vs. MSL 412

The MER images used above are the common Exploratorium ones (compressed histogram). Compressing the MSL ones before combining gives you more familiar-looking false-color images (right).



Posted by: Greenish Nov 9 2012, 09:38 PM

ronald, I did initially make them as straight composites of the JPGs. But they seemed very flat compared to the MER ones, and therefore I think less immediately informative. So since they are uncalibrated, variously exposed [I assume], lossy compressed images (and since I don't know enough to perform any particularly accurate processing) I figured I would aim for aesthetics more than a high level of precision. I tried not to lose much off of the ends of the histograms but did adjust the channels individually, but about the same amount each.

I sent a PM or two but if anyone has suggestions on a somewhat consistent way to make these filter composites I'd be interested. If it means lower contrast results, so be it. I looked back at the MER discussions but there's a whole lot there and it seems to assume a certain base level understanding that I may not have since this isn't the kind of engineering I do for work. I'm using ImageJ with the RGB Composer plugin, which seems almost too easy.
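One simple, repeatable recipe (a sketch of the idea, not a calibrated workflow): stack the three filter frames as R/G/B and give every channel the same percentile stretch, e.g. in Python/numpy:

```python
# Consistent false-colour composites: identical percentile stretch applied
# to each of three single-filter frames before stacking them as R/G/B.
import numpy as np

def stretch(chan, lo_pct=1, hi_pct=99):
    """Linearly stretch one 2-D float channel between its percentiles."""
    lo, hi = np.percentile(chan, [lo_pct, hi_pct])
    return np.clip((chan - lo) / (hi - lo + 1e-9), 0.0, 1.0)

def false_color(r, g, b):
    """r, g, b: 2-D float arrays from three filter frames, same shape."""
    return np.dstack([stretch(r), stretch(g), stretch(b)])
```

Using the same percentiles for every channel keeps the relative channel balance, unlike adjusting each channel by eye.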

Posted by: Zelenyikot Nov 17 2012, 03:05 AM

Can somebody explain to me why the photography is done through three filters -- green, dark blue, and orange? It is not clear to me why orange instead of red. unsure.gif
Thanks


Posted by: mcaplinger Nov 17 2012, 06:24 AM

QUOTE (Zelenyikot @ Nov 16 2012, 08:05 PM) *
To me it is not clear why orange instead of red.

http://www.lpi.usra.edu/meetings/lpsc2012/pdf/2541.pdf

I think L4 thumbnails end up looking orange because of some leakage from the green and blue Bayer positions.

Posted by: Zelenyikot Nov 18 2012, 12:32 AM

It seems to me this orange one is L3; L4 is red.

Unfortunately the filter number can't be seen in the image filenames, so I can only determine it from the shooting sequence: the orange one comes third, and the red one fourth. Thanks for the answer and the URL.

Posted by: mcaplinger Nov 18 2012, 01:14 AM

QUOTE (Zelenyikot @ Nov 17 2012, 05:32 PM) *
It seems to me this orange it is L3...

Right, yes, L3. If you look at the QE of the Mastcam CCD http://www.truesenseimaging.com/all/download/file?fid=8.31 you can see that at the L3 bandpass of 750 nm the Bayer filters haven't become completely transparent but the green is getting leaky, so you get this orangish color.

Posted by: Zelenyikot Nov 18 2012, 01:43 AM

Thanks for an explanation.

Posted by: Airbag Nov 26 2012, 08:02 PM

Argh, so many beautiful panoramas yet still I see that MR "schmutz" (Emily's words) on just about every one of them. For those who can't run the Photoshop automation mentioned above, there is an alternate automated way of doing so using Linux/Unix/Solaris. It requires jpegpixi-1.1.1 (which in turn requires jpeglib). See "http://www.openprinting.org/download/digitalimage/Commandlinephoto-Mini-HOWTO.txt..." for more info.

For Mastcam right 1200x1200, the deadpixels.txt contents is:

CODE
597,315,3,5
509,362,2,4


I created this by desaturating one of the images that had obvious defects, then inverting and thresholding it to leave the "bad" pixels as white against a black background, so that jpeghotp could create deadpixels.txt. Some hand-tweaking of the exact size of the now white spots was necessary for the best corrective effect. It then becomes a simple matter of looping through all your MR 1200x1200 image source files and running jpegpixi on them, e.g.:
CODE
for i in *.jpg
do
  jpegpixi -f deadpixels.txt "$i" fixed.jpg
  mv fixed.jpg "$i"
done


Fast, and lossless (other than the corrected pixels, obviously!). Adjust contents of deadpixels.txt as necessary for other image sizes etc.
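The desaturate/invert/threshold step for building deadpixels.txt can also be scripted; here is a rough Python/numpy equivalent (the threshold and toy data are illustrative, not Mastcam-specific):

```python
# Find candidate "schmutz" pixels by thresholding a greyscale frame:
# the numpy analogue of desaturating, inverting, and thresholding in an editor.
import numpy as np

def find_dark_blemishes(gray, threshold=50):
    """gray: (H, W) uint8 luminance image. Returns (row, col) coordinates
    of pixels darker than the threshold."""
    rows, cols = np.where(gray < threshold)
    return list(zip(rows.tolist(), cols.tolist()))

# Toy frame: two dark specks on a bright background.
frame = np.full((10, 10), 200, dtype=np.uint8)
frame[3, 4] = 10
frame[7, 2] = 20
print(find_dark_blemishes(frame))  # [(3, 4), (7, 2)]
```

In practice you'd still hand-tweak the resulting spot sizes, as described above, before feeding them to jpegpixi.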

Airbag

Posted by: ronald Nov 26 2012, 09:20 PM

Great, Airbag - thank you! It's always nice if you can do things quickly on the command line smile.gif

Posted by: RoverDriver Nov 26 2012, 10:20 PM

QUOTE (Airbag @ Nov 26 2012, 12:02 PM) *
Argh, so many beautiful panoramas yet still I see that MR "schmutz" (Emily's words) on just about every one of them. For those who can't run the Photoshop automation mentioned above, there is an alternate automated way of doing so using Linux/Unix/Solaris. It requires jpegpixi-1.1.1 (which in turn requires jpeglib). See "http://www.openprinting.org/download/digitalimage/Commandlinephoto-Mini-HOWTO.txt..." for more info.
....


Airbag, thanks for this pointer. Just wanted to let people know that this package is also available on Mac OS X via MacPorts.

Paolo

Posted by: ronald Nov 29 2012, 11:56 AM

How do http://mars.jpl.nasa.gov/msl-raw-images/msss/00090/mcam/0090MR0632001000E1_DXXX.jpg and http://mars.jpl.nasa.gov/msl-raw-images/msss/00090/mcam/0090MR0632000000E1_DXXX.jpg (or http://mars.jpl.nasa.gov/msl-raw-images/msss/00089/mcam/0089ML0617018000E1_DXXX.jpg) fit together?

This still puzzles me - below you see

The Bright - The Aligned - The Dark.



rolleyes.gif

Posted by: Ant103 Nov 29 2012, 12:46 PM

Come on! This is nothing more than a difference of exposure wink.gif I don't see why this still puzzles you… smile.gif

Posted by: ronald Nov 29 2012, 01:56 PM

smile.gif - I'm simply still not sure how bright the surface should be when I work on the images ... also contrast is an issue.

For illustration I hijacked one of your nice images and "worked" on it:


Posted by: mcaplinger Nov 29 2012, 03:33 PM

QUOTE (ronald @ Nov 29 2012, 04:56 AM) *
This still puzzles me...

The autoexposure is just trying to get the histogram to fill the dynamic range. When there's a bright piece of rover structure in the image, the exposure is shorter so as not to saturate the rover, so the ground is inevitably darker.

To make an accurate mosaic, one would decompand the image to 12 bits linear and then scale all images by their relative exposure times, and then convert all of the images to 8 bits using one's favorite gamma scheme for final display. Usually one doesn't go to that much trouble and just scales in 8-bit space in an ad hoc manner.

As to how bright it is, I could figure that out in radiometric terms, but I'm not sure that would mean much from a practical standpoint.
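For the curious, the decompand-and-scale recipe above might look something like this sketch. The actual Mastcam companding table isn't reproduced here, so a hypothetical square-root companding curve (8-bit stored value proportional to the square root of the 12-bit linear DN) stands in for it purely for illustration:

```python
# Sketch of the mosaic-normalisation recipe: decompand each 8-bit frame to
# 12-bit linear DN, scale by relative exposure time, then re-gamma to 8 bits.
# The sqrt companding assumed here is NOT the real Mastcam LUT.
import numpy as np

def decompand(img8):
    """Invert the assumed sqrt companding: 8-bit -> 12-bit linear DN."""
    return (img8.astype(np.float64) / 255.0) ** 2 * 4095.0

def normalize(img8, exposure_ms, ref_exposure_ms):
    """Decompand, scale to a common reference exposure, re-gamma to 8 bits."""
    linear = decompand(img8) * (ref_exposure_ms / exposure_ms)
    gamma = np.clip(linear / 4095.0, 0.0, 1.0) ** (1 / 2.2)
    return (gamma * 255.0).round().astype(np.uint8)

# A frame normalised at its own exposure time is the reference case.
scene = np.full((4, 4), 100, dtype=np.uint8)
a = normalize(scene, exposure_ms=10.0, ref_exposure_ms=10.0)
```

Two frames of the same scene taken at different exposure times should land on nearly the same output values after this normalisation, which is exactly what the ad-hoc 8-bit scaling only approximates.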

Posted by: ronald Nov 29 2012, 10:01 PM

Doing it the scientific way is far beyond my technical and mathematical horizon ... just from the images (with some years of experience doing manual exposures in photography), I would tend to the darker variant, though it is a little bit too dark. The outer right part of Cargo Cult's pan above is pretty much it. There is still a big difference between official MER and MSL images: the dark and gloomy on one side and the shiny desert on the other rolleyes.gif

Posted by: djellison Nov 29 2012, 10:10 PM

QUOTE (ronald @ Nov 29 2012, 02:01 PM) *
There is still a big difference between official MER and MSL images: the dark and gloomy on one side and the shiny desert on the other rolleyes.gif


They're different parts of the planet. Look at them from orbit - one is dark (Sinus Meridiani is a renowned dark feature from astronomy) and one is much brighter with dark dunes. The Washington Scablands don't look the same as the Gobi Desert.

Compare Gusev and Gale and they're really not that different.
http://photojournal.jpl.nasa.gov/jpegMod/PIA16101_modest.jpg
http://photojournal.jpl.nasa.gov/jpegMod/PIA16441_modest.jpg


Posted by: mcaplinger Nov 29 2012, 10:19 PM

QUOTE (ronald @ Nov 29 2012, 03:01 PM) *
I would tend to the darker variant though it is a little bit too dark.

What does "a little bit too dark" mean? Feel free to tweak the color however you wish, just don't claim you're doing something other than making the image fit your subjective taste, because you're not.

Radiometrically, I think the surface is more brown than pink. I suspect the MSL mosaic that Doug shows has been punched up and looks garishly colored to me. Photoshop white balance isn't radiometrically accurate.

I'm sorry to beat this dead horse again, and I'll quit if people just stop asking about it smile.gif

Posted by: djellison Nov 29 2012, 10:25 PM

QUOTE (mcaplinger @ Nov 29 2012, 02:19 PM) *
Radiometrically, I think the surface is more brown than pink. I suspect the MSL mosaic that Doug shows has been punched up and looks garishly colored to me.


Oh - I agree - I was just showing that MastCam data can be made to look just like PanCam data if one so chooses.

I'm also totally in agreement on the color. The best words I think to describe Mars are ochre and butterscotch.

Posted by: vikingmars Nov 29 2012, 10:58 PM

QUOTE (djellison @ Nov 29 2012, 11:25 PM) *
The best words I think to describe Mars are ochre and butterscotch.

Yes, quite. The term used in JGR reports during the Viking mission to describe the color of the Martian surface was "yellowish brown" smile.gif

Posted by: djellison Nov 29 2012, 11:40 PM

Yeah - the same words were used by Maki et.al. in their 1999 Pathfinder paper
"Although Mars has long been called the "red" planet, quantitative measurements of the
surface color from telescopic and surface observations indicate a light to moderate
yellowish brown color. The Pathfinder camera measurements presented here support the
claim that the red planet is not red but indeed yellowish brown. "

Posted by: ronald Nov 30 2012, 10:53 AM

QUOTE (mcaplinger @ Nov 29 2012, 11:19 PM) *
What does "a little bit too dark" mean?


As a photographer I would make the exposure a little bit longer - like this:


As for Pancam data - I can see no consistency either in http://marswatch.astro.cornell.edu/pancam_instrument/panoramas.html.

But yeah - lets leave this dead horse alone for some time now ... rolleyes.gif

Posted by: iMPREPREX Jan 5 2013, 11:30 PM

Can someone please explain to me how to decipher the file names for the MastCams? I've looked everywhere and I can't find anything.

For example - with 0138MR0819050000E1_DXXX.jpg

I understand "0138" is the Sol, "MR" is "MastCam Right", and "50000" is the 50th (or sometimes 51st) image in the sequence for a mosaic or a panorama.

What does E1, E2, E3 mean? Also, if someone could explain the part after "MR" where is says "08190" and perhaps the "DXXX"?

Thank you in advance. smile.gif

Posted by: jamescanvin Jan 7 2013, 08:44 AM

QUOTE (iMPREPREX @ Jan 5 2013, 11:30 PM) *
What does E1, E2, E3 mean?


E = Product type. E = 'normal', I = Thumbnail, + a whole load of others for uncompressed, focus stacks, etc.

Number is a version number, i.e. if more data arrived for a product a new version is made.

QUOTE (iMPREPREX @ Jan 5 2013, 11:30 PM) *
the part after "MR" where is says "08190"


That is the sequence ID, all images taken together as part of one sequence share the same number.

Posted by: Astro0 Jan 7 2013, 11:39 AM

ADMIN NOTE: Apologies. I was moving two members' panoramas from this topic over into the Glenelg thread, where they should have been.
Somehow, though, the transfer didn't happen as it should and the files/posts disappeared. If you know who you are, can you please repost these pans to the Glenelg topic.


Sorry for the inconvenience.


Posted by: wildespace Jan 15 2013, 05:29 PM

Why aren't Mastcam images white-balanced like the MAHLI ones? Is it to do with science or technical issues? I love MAHLI images for how natural and pleasing to the eye they look. It would be great if Mastcam images came like this, instead of being tweaked on the ground with not always the best results.

Posted by: mcaplinger Jan 15 2013, 06:46 PM

QUOTE (wildespace @ Jan 15 2013, 10:29 AM) *
Why aren't Mastcam images white-balanced like the MAHLI ones?

I don't know what you are reacting to. None of the cameras do any kind of white balance processing internally. There may be some minor color differences because of the different glasses in the optics and the slightly different bandpass filter between Mastcam and MAHLI.

Posted by: iMPREPREX Jan 15 2013, 07:34 PM

QUOTE (wildespace @ Jan 15 2013, 12:29 PM) *
Why aren't Mastcam images white-balanced like the MAHLI ones? Is it to do with science or techincal issues? I love MAHLI images for how natural and pleasing to the eye they look. Would be great if Mastcam images came like this, instead of being tweaked on the ground with not always the best results.


Who the heck would want the images white-balanced by default anyways? Besides, a raw image isn't called a "raw image" for nothing. wink.gif


Posted by: wildespace Jan 15 2013, 07:37 PM

QUOTE (mcaplinger @ Jan 15 2013, 06:46 PM) *
I don't know what you are reacting to. None of the cameras do any kind of white balance processing internally. There may be some minor color differences because of the different glasses in the optics and the slightly different bandpass filter between Mastcam and MAHLI.

From what I've read, MAHLI's images are white-balanced internally. It uses a calibration target mounted on the rover for this.

http://msl-scicorner.jpl.nasa.gov/Instruments/MAHLI/

"MSL carries the MAHLI Flight Calibration Target for color/white balance, resolution and focus checks, and verification of UV LED functionality. The target will be mounted in a vertical position on the rover (i.e., vertical when the rover is on a surface with a slope of 0°) to help prevent dust accumulation."

Looking at the raw images, it's also obvious that MAHLI images come already white-balanced, in contrast to Mastcam images. A good example is the white surface of the rover; on Mastcam images it has an ever-present yellow/orange cast, while in MAHLI images it's white.

Examples:
Mastcam http://mars.jpl.nasa.gov/msl/multimedia/raw/?rawid=0106ML0681049000E1_DXXX&s=106
MAHLI: http://mars.jpl.nasa.gov/msl/multimedia/raw/?rawid=0085MH0113000019E1_DXXX&s=85

Posted by: mcaplinger Jan 15 2013, 07:46 PM

QUOTE (wildespace @ Jan 15 2013, 12:37 PM) *
From what I've read, MAHLI's images are white-balanced internally.

I'm on the team that built MAHLI, and I can assure you there is no internal white-balancing. The cal target can be used to do white balance, or it could be if it wasn't covered with dust from the landing. smile.gif

If there's a perceptible difference in white balance between the cameras (I just looked at a few images, and it's hard to say for sure since the imaging geometries aren't very similar), it probably has to do with the slightly different cut-on wavelength of the bandpass filter.

Posted by: jmknapp Feb 6 2013, 12:42 PM

For the multi-spectral sequences, does anyone know the probable order of filters used?

This document lists the filters in a certain order:

http://www.lpi.usra.edu/meetings/lpsc2012/pdf/2541.pdf

...which might be taken to be the preferred order, but how about the sequences where only 4 or 6 filters are used? For example, this ML sequence of four from sol 173:

http://curiosityrover.com/?q=+ml173&thumbSize=large&tzsel=UTC&orderby=ettaken&sortdir=asc

It's interesting how the mast shadow essentially disappears for the two later frames--which would be logical if those filters were toward the infrared?

Posted by: mcaplinger Feb 6 2013, 02:15 PM

QUOTE (jmknapp @ Feb 6 2013, 05:42 AM) *
For the multi-spectral sequences, does anyone know the probable order of filters used?

You could possibly figure it out from the time stamps; it takes twice as long to go from filter N to N+2 as from filter N to N+1.

As I've noted previously, the thumbnail tint can be used for any filter with different visible throughput, and the infrareds can maybe be told apart from timing or the amount of dark current, as the exposure times go up as you get farther into the IR.

Then perhaps patterns would emerge as to how they're using the filters; as you observe, there seems to have been some evolution in which ones are used.
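The timing trick described above can be sketched as follows; the per-position move time here is an assumed constant for illustration, not a published Mastcam figure:

```python
def infer_filter_steps(timestamps_s, exposures_s, unit_move_s):
    """Estimate how many filter-wheel positions were stepped between
    consecutive frames, given that moving N positions takes N times as
    long as moving one (constant-rate wheel)."""
    steps = []
    for i in range(1, len(timestamps_s)):
        # Time left over after the previous exposure is wheel motion.
        gap = timestamps_s[i] - timestamps_s[i - 1] - exposures_s[i - 1]
        steps.append(round(gap / unit_move_s))
    return steps

# Example: gaps of 2 s and 4 s with a 2 s per-position move time
# imply moves of 1 and then 2 filter positions.
```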

Posted by: elakdawalla Feb 6 2013, 05:00 PM

We're very close to the date of the first expected PDS release. If patterns aren't obvious from the raws, comparisons between raws and PDS-released images might provide clues that you could generalize from, going forward.

Posted by: CosmicRocker Feb 7 2013, 03:07 AM

That's good to hear.

Can we expect to find the release at the http://anserver1.eprsl.wustl.edu/?

Posted by: Greenish Feb 7 2013, 02:43 PM

Per http://pds-geosciences.wustl.edu/missions/msl/index.htm - "MSL Analyst's Notebook - Coming with Release 1. Provides search, display, and download tools for MSL data sets."

Subsequent releases will be every 90 days after Feb 27th initial release of sols 0-90.

The imaging node release dates are here: http://pds-imaging.jpl.nasa.gov/schedules/msl_release.html - Release 1 EDRs on same date as above, RDRs on Mar 20th.

Posted by: EdTruthan Mar 3 2013, 10:29 PM

Answered from a question in http://www.unmannedspaceflight.com/index.php?s=&showtopic=7591&view=findpost&p=198716...

QUOTE (Airbag @ Mar 3 2013, 01:24 PM) *
Ed, I don't know how you do it, but those anaglyph panoramas are once again simply fantastic - they are so sharp and the depth perception is easy to see across the entire panorama. In 3D the terrain becomes so much easier to explore, and your images add tremendous value.

Do you generate the MR/ML panos independently and then merge them, or do you create them as multiple layers in one pano, sharing control points for some fixed depth, and then separate them for the anaglyph? Or something else, such as magic? smile.gif

Airbag

In general, I usually do the following for the big anaglyphs....

~ First, I run raw MC100's and MC34's through the appropriate (manually recorded preset for image size in question) Photoshop automation to carefully remove all the lens "schmutz" from each set (MC100 has 2 big ones and MC34 has now developed 3 smaller ones) so there aren't any distracting "floaters" in the final anaglyph. They're very distracting and break the immersion.

~ I then mosaic each separately with as close to the same projection as possible (varies depending on the pano).

~ Aligning them is done manually in Photoshop by dragging the left channel over the right, then resizing and aligning it by eye with transparency set at 50%. Before splitting the channel colors I usually tweak the levels for a bit brighter illumination without saturating the whites. Some anaglyphs tend to come out rather dark otherwise.

~ The crucial (and tricky) part is getting all the matching L & R objects in the matching channels to align along the same horizontal plane for the least eye strain. Even with a near identical initial projection for each, it usually requires a bit of careful warping in several areas. That's the most time (& CPU) consuming part.

~ And lastly, by aligning the horizontal offset so the focal point (where the channels have little to no offset) is more biased to the immediate foreground rather than the distance, it keeps eye strain in check because the eye seems to naturally prefer that the left channel falls to the left of the right channel. Below that focal point the channels reverse, which is harder on the eye. That said, the reason not to have the focal point at the bottom of the frame is that in doing so the top offsets often become far too wide to pull together at high zooms so a compromise focal point must be determined. This will vary depending on the field depth between top and bottom of course.

If all works right... it comes out looking good!
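The final channel split in the workflow above (red from the left frame, green and blue from the right, with a horizontal bias to place the zero-parallax plane) can be sketched roughly like this, assuming both frames are already projected, warped, and the same size:

```python
import numpy as np

def make_anaglyph(left_rgb, right_rgb, offset_px=0):
    """Red/cyan anaglyph: red channel from the left eye, green and blue
    from the right.  offset_px shifts the left frame horizontally to
    bias the zero-parallax (focal) plane, as described above."""
    shifted = np.roll(left_rgb, offset_px, axis=1)
    anaglyph = right_rgb.copy()
    anaglyph[..., 0] = shifted[..., 0]  # red from the (shifted) left frame
    return anaglyph
```

np.roll wraps pixels around the frame edge; a real pipeline would crop or pad instead, but it keeps the sketch short.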

Posted by: mcaplinger Mar 4 2013, 01:08 AM

QUOTE (EdTruthan @ Mar 3 2013, 03:29 PM) *
~ First, I run raw MC100's and MC34's through the appropriate (manually recorded preset for image size in question) Photoshop automation to carefully remove all the lens "schmutz" from each set (MC100 has 2 big ones and MC34 has now developed 3 smaller ones)

Do you really have evidence that the 34mm has defects that weren't there before? I don't but I haven't been looking very hard for several weeks.

Posted by: EdTruthan Mar 4 2013, 05:22 AM

Mike, I first noticed the three 34mm spots a few weeks ago when proofing a large anaglyph. They're hard to see in 2D against the surface because of the jumbled noise in the terrain but pop out subtly as 'floaters" when viewed in stereo if not cleaned before anaglyphing. The four frame animated GIF below (from the raw Sol 184 set) points these three culprits against the relative smoothness of an out of focus robotic arm...

http://www.edtruthan.com/mars/MC34-Schmutz-3-Spots.gif

Your post made me curious as to how long they've actually been there, and to my surprise it looks like they've been present since the first good 34mm shots came down on Sol 3. You can see them in the same positions (as in the GIF above) at the center of this full frame series from Sol 3: http://mars.jpl.nasa.gov/msl-raw-images/msss/00003/mcam/0003ML0000035000E1_DXXX.jpg. Now knowing where to look for these - they're especially easy to spot if one flips rapidly through any large 34mm set in an image viewer - and yup, they're there.

Posted by: mcaplinger Mar 4 2013, 06:39 AM

QUOTE (EdTruthan @ Mar 3 2013, 10:22 PM) *
Your post made me curious as to how long they've actually been there, and to my surprise it looks like they've been present since the first good 34mm shots came down on Sol 3.

This fits my expectations. These are not on the lens but on the focal plane and have been there since we buttoned up the optics during final assembly.

Posted by: fredk Mar 4 2013, 03:55 PM

QUOTE (EdTruthan @ Mar 3 2013, 11:29 PM) *
MC34 has now developed 3 smaller ones

There are many more than three visible on MC34 - in a quick count I found at least ten on the sol 24 images. They're easiest to see in the sky shots, but their visibility depends strongly on the filter. For example, compare these two filtered sky frames:
http://mars.jpl.nasa.gov/msl-raw-images/msss/00024/mcam/0024ML0118002000D1_DXXX.jpg
http://mars.jpl.nasa.gov/msl-raw-images/msss/00024/mcam/0024ML0118001000D1_DXXX.jpg
I think that makes sense, since the visibility in a filtered image will depend on the position of the speck with respect to the Bayer pattern. So their visibility in a filtered sky image could be greater than in a full-colour sky image, which is some weighted average over the four subpixels.
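The dilution argument above can be put in numbers: a speck that shadows a single Bayer subpixel loses most of its contrast once the four subpixels of a cell are averaged into one colour sample. A toy sketch with made-up DN values:

```python
import numpy as np

def bayer_cell_average(img):
    """Average each 2x2 Bayer cell -- a crude stand-in for the weighted
    average a demosaiced full-colour product takes over the four subpixels."""
    h, w = img.shape
    return img[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A speck darkening one subpixel by 40 DN on a flat 100 DN sky:
frame = np.full((4, 4), 100.0)
frame[1, 1] -= 40
cells = bayer_cell_average(frame)
# After averaging, the 40 DN defect is diluted to a 10 DN dip.
```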

Posted by: Lightning May 10 2013, 09:50 PM

Do you know the schedule for the MARDI images release? I mean the raw images, not the lossy-compressed ones.
And could you tell me whether the EDL sensor suite raw data (IMU, radar, etc...) that were stored onboard have been downlinked and, if so, whether they are available to the public?

Thanks. smile.gif

Posted by: arko Jun 3 2013, 06:24 PM

Just a thought.. Saw this picture today http://mars.jpl.nasa.gov/msl/multimedia/raw/?rawid=0293MH0285000010T1_DXXX&s=293

MAHLI took that at night with its LED's if I'm not mistaken... which brought up the thought.. How cool would it be for Curiosity to take a self portrait at night! (like this at night http://www.nasa.gov/mission_pages/msl/multimedia/pia16239.html)

Is this possible?

Posted by: mcaplinger Jun 3 2013, 06:37 PM

QUOTE (arko @ Jun 3 2013, 11:24 AM) *
How cool would it be for Curiosity to take a self portrait at night!...
Is this possible?

The LEDs become very dim at distances of more than a few centimeters (inverse-square law). You could potentially take an image of something like the RSM with the LEDs and a long-enough exposure, but I question if it would be that interesting.
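To put rough numbers on that falloff (with `ref_cm` as an assumed close-up working distance, not a MAHLI specification):

```python
def relative_led_irradiance(d_cm, ref_cm=2.0):
    """Relative irradiance from a point-like LED source: inverse-square
    law, normalized to an assumed reference working distance."""
    return (ref_cm / d_cm) ** 2

# At 4x the reference distance only 1/16 of the light arrives, so the
# exposure would need to be roughly 16x longer for the same signal.
```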

Posted by: djellison Jun 3 2013, 08:17 PM

Yeah - it would take some very long exposures ( and thus pretty noisy images probably ) - and also there's the energy penalty of heating the actuators at night to do it as well.


Posted by: Astro0 Jun 3 2013, 11:00 PM

It also might get a little creepy unsure.gif



laugh.gif

Posted by: wildespace Jun 4 2013, 09:58 AM

A comparison of the Mastcam 100 and MAHLI images of the newly-drilled hole at 'Cumberland' site, taken on the same day (Sol 279).



Further to my posts earlier in this thread, I'd like to draw attention to the difference in colour balance between these two images. The Mastcam image has a noticeable yellow/orange cast to it, while the MAHLI image seems to be closer to the true (white-balanced) colours. People have been insisting that there is no difference between the two cameras, yet the difference in the resulting images is apparent to me.

Original images: http://mars.jpl.nasa.gov/msl-raw-images/msss/00279/mcam/0279MR1174000000E1_DXXX.jpg
http://mars.jpl.nasa.gov/msl-raw-images/msss/00279/mhli/0279MH0268001000E1_DXXX.jpg

Posted by: mcaplinger Jun 4 2013, 11:47 AM

QUOTE (wildespace @ Jun 4 2013, 02:58 AM) *
People have been insisting that there is no difference between the two cameras...

Who's insisting that? Not me, see post #270 in this thread. There are (IMHO, minor) differences in color and I have explained why.

Posted by: fredk Jun 4 2013, 05:40 PM

Those images are probably not the best comparison, since the turret will be blocking part of the sky in the MH images. That should shift those images towards the blue (sky is redder than sun). Still, I'd say this likely won't account for all of the difference.

A better comparison would be views of the distant landscape with MH and mastcam, from the same location and close to the same time of day, if you could find such a pair.

Posted by: wildespace Jun 4 2013, 09:11 PM

QUOTE (fredk @ Jun 4 2013, 06:40 PM) *
Those images are probably not the best comparison, since the turret will be blocking part of the sky in the MH images. That should shift those images towards the blue (sky is redder than sun). Still, I'd say this likely won't account for all of the difference.

A better comparison would be views of the distant landscape with MH and mastcam, from the same location and close to the same time of day, if you could find such a pair.


Curiosity's self-portraits (taken with MAHLI) provide a good view of the surrounding terrain. For example, see the raw images from s85. MAHLI: http://mars.jpl.nasa.gov/msl/multimedia/raw/?s=85&camera=MAHLI and Mastcam: http://mars.jpl.nasa.gov/msl/multimedia/raw/?s=85&camera=MAST_
MAHLI shows a gentle light-brown terrain, and the white surfaces of the rover as white.

Comparison of the view of the distant Mt Sharp, from Mastcam (s271) and MAHLI (s85):

Posted by: djellison Jun 4 2013, 09:41 PM

Almost 200 sols apart - 1/3rd of a martian year - we know there to be dramatic differences in the Tau - those images SHOULDN'T look the same.

Posted by: mcaplinger Jun 4 2013, 09:45 PM

QUOTE (djellison @ Jun 4 2013, 02:41 PM) *
those images SHOULDN'T look the same.

Quite. Also, we acknowledge that the uncorrected color between MAHLI and Mastcam is slightly different. wildespace, you can stop making this point. The PDS delivery will have color-corrected versions of all images, and then you can argue about those if you wish. sad.gif

Posted by: Ant103 Jun 4 2013, 10:19 PM

But in the end, what are we arguing about? Some little variations of white balance? Saturation? What else?

This is exactly like bringing a group of people to visit the Grand Canyon, having them take pictures, and then looking at those pictures on the Internet when they share them with each other. Some will look a little too blueish or too yellowish. But in the end it's the same landscape being pictured, and that's all that counts.

Posted by: Gerald Jun 5 2013, 11:49 AM

According to http://www.lpi.usra.edu/meetings/lpsc2012/pdf/2541.pdf, Mastcam is more sensitive to red near 676 nm than MAHLI is (per http://msl-scicorner.jpl.nasa.gov/Instruments/MAHLI/).
If you compare the two cameras' red sensitivity spectra with the http://en.wikipedia.org/wiki/Color_vision (human eye red sensors), you may see that the eye's response is closer to MAHLI's sensitivity for red (no peak near 676 nm) - hence the closer similarity of MAHLI raw images to natural colors.
This is offered as a hopefully plausible explanation for the color shift, besides the different observation conditions.

To my eyes, any benefit from differencing the two images to simulate a far-red filter could be obtained more easily by just applying the L4 filter.

Posted by: Phil Stooke Jun 5 2013, 12:43 PM

This is why I never do anything with color!

Phil


Posted by: fredk Jun 5 2013, 02:32 PM

QUOTE (Gerald @ Jun 5 2013, 11:49 AM) *
Therefore the closer similarity of MAHLI raw images to natural colors.

What makes you say MH is closer to natural? What does that even mean? Is your monitor carefully calibrated? Do we know what to expect, given the red sky illumination - i.e., shouldn't white parts of the rover appear yellow/reddish? Etc, etc.

All we can do is look at the calibrated MER colour imagery, and wait for corrected MSL images...

Posted by: mcaplinger Jun 5 2013, 02:49 PM

QUOTE (Gerald @ Jun 5 2013, 04:49 AM) *
According to http://www.lpi.usra.edu/meetings/lpsc2012/pdf/2541.pdf Mastcam is more sensitive to red near 676nm than MAHLI according to http://msl-scicorner.jpl.nasa.gov/Instruments/MAHLI/.

I'm a coauthor on the LPSC abstract and you can't really get that level of detail out of those figures IMHO.

There's no debate that there are slight color differences between the cameras. They are not intentional. We didn't do anything to explicitly balance the colors; the raw images are just what's coming out of the camera. The differences are caused by the spectral transmissions of the lenses and the slightly different bandpasses of the IR cut filters. I spent quite a bit of time on the color-corrected archive products, which have been delivered to PDS, but I don't know when they will be released. And there will still be some uncertainty about what "color-corrected" means; I tried to be explicit in the documentation about exactly what the processing entailed, but radiometry is complicated and these instruments are cameras, not colorimeters or spectrometers.

Posted by: wildespace Jun 5 2013, 05:07 PM

Sorry if I upset or aggravated anyone here. My main interest in this discussion is what Mars would look like to the human eye. I know that our eyes adjust to any prevalent colouration, doing a sort of "white-balancing" themselves. If you take a RAW image in incandescent lighting using a DSLR, the image will come out unnaturally red because, unlike our eyes, the camera didn't adjust itself to the scene. Likewise, the raw Mastcam images come out with a prevalent yellow/orange cast (also affecting the white surfaces of the rover and the grey exposed rock) because of the dusty atmosphere. Which is why I was pleasantly surprised to see that MAHLI images show the (more or less) white rover, grey exposed rock, and much more subdued hues in the landscape and the sky. Call my conclusion unfounded, but like Gerald I think that MAHLI images are closer to what the human eye would see on Mars.

Anyhoo, looking forward to the PDS. smile.gif

Powered by Invision Power Board (http://www.invisionboard.com)
© Invision Power Services (http://www.invisionpower.com)