MSL Images & Cameras, technical discussions of images, image processing and cameras |
Oct 15 2012, 08:32 AM
Post
#181
|
|
Martian Photographer Group: Members Posts: 352 Joined: 3-March 05 Member No.: 183 |
Smith & Lemmon 1999 (MPF special issue) talks about tau=0.5 implying ~40:60 sky:Sun. I should note that getting to 40:60 or 50:50 accounts for all the sky light, and the relatively bluer light near the Sun can offset the rest of the sky.
With respect to Sun color: it is a subtle (few %) effect, but the optical depth increases with wavelength in the visible in the absence of ice (same ref). The color of the sky near the Sun at sunset is not the same as that of the Sun. On Earth, the coloring of Sun & sky is due to removal of blue light. On Mars, the coloring is all about distribution: diffraction of the blue light keeps it closer, in angle, to the Sun compared to red light. The daytime sky is reddened due to absorption of blue, not the preferential removal of red light. |
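A back-of-the-envelope check of that 40:60 figure: if the direct beam is attenuated as exp(-tau/cos(solar zenith angle)) and essentially all of the scattered light eventually reaches the ground, then tau = 0.5 with the Sun high gives almost exactly that split. The sketch below uses only those two assumptions; it is not the radiative-transfer treatment in Smith & Lemmon, and `single_scatter_albedo` is an illustrative knob:

```python
import math

def direct_diffuse_split(tau, sun_elev_deg=90.0, single_scatter_albedo=1.0):
    """Rough estimate of the diffuse:direct split of surface irradiance.

    Assumes Beer-Lambert attenuation of the direct beam and that all
    scattered light reaches the ground (no absorption) -- a crude upper
    bound, not a full radiative-transfer calculation.
    """
    mu = math.sin(math.radians(sun_elev_deg))  # cosine of the zenith angle
    direct = math.exp(-tau / mu)               # attenuated direct beam
    diffuse = (1.0 - direct) * single_scatter_albedo
    total = direct + diffuse
    return diffuse / total, direct / total

sky, sun = direct_diffuse_split(0.5)
print(f"sky:Sun = {sky:.0%}:{sun:.0%}")  # roughly 39%:61% for tau = 0.5
```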
|
|
Oct 15 2012, 08:56 AM
Post
#182
|
|
Member Group: Members Posts: 154 Joined: 19-September 12 Member No.: 6658 |
|
|
|
Oct 15 2012, 04:55 PM
Post
#183
|
|
Senior Member Group: Members Posts: 2511 Joined: 13-September 05 Member No.: 497 |
Smith & Lemmon 1999 (MPF special issue) talks about tau=0.5 implying ~40:60 sky:Sun. I should note that getting to 40:60 or 50:50 accounts for all the sky light, and the relatively bluer light near the Sun can offset the rest of the sky. That would be "Opacity of the Martian atmosphere measured by the Imager for Mars Pathfinder", Smith, Peter H.; Lemmon, Mark, Journal of Geophysical Research, Volume 104, Issue E4, p. 8975-8986. Doesn't that imply that shadowed regions would be something like half the brightness of directly-illuminated ones, something which is demonstrably not true for, say, MAHLI images with small amounts of shadowing from the arm? For the one image I looked at, the ratio (linearized) was more like 3.2:1. Of course, this is a tricky geometric radiosity problem and I do admit that the shadows are brighter than I expected. -------------------- Disclaimer: This post is based on public information only. Any opinions are my own.
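The naive arithmetic behind that expectation is easy to write down: a lit patch sees Sun plus sky, while a shadowed patch sees only the fraction of the sky dome the arm leaves visible; light reflected from nearby terrain and the rover itself is ignored. The `sky_visible_in_shadow` parameter below is a made-up illustrative knob, not a measured quantity:

```python
def lit_to_shadow_ratio(sky_fraction, sky_visible_in_shadow=1.0):
    """Naive lit:shadow brightness ratio.

    sky_fraction: fraction of total surface irradiance that is diffuse sky light
    sky_visible_in_shadow: fraction of the sky dome still visible from the
        shadowed patch (the arm blocks some of it).
    Ignores radiosity (light bounced off the terrain and the rover).
    """
    lit = 1.0                                     # direct + full sky, normalized
    shadow = sky_fraction * sky_visible_in_shadow  # sky light only
    return lit / shadow

print(lit_to_shadow_ratio(0.4))        # 2.5:1 if the whole sky dome is visible
print(lit_to_shadow_ratio(0.4, 0.78))  # ~3.2:1 if the arm blocks ~22% of the sky
```

So a measured 3.2:1 is compatible with the 40:60 split once the arm's obstruction of the sky (and the neglected bounce light) is allowed for.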
|
|
|
Oct 15 2012, 07:26 PM
Post
#184
|
|
Member Group: Members Posts: 222 Joined: 7-August 12 From: Garberville, CA Member No.: 6500 |
Sorry for the long post but if this works as a fairly accurate white balancing trick I wanted to get everybody's take on it and share the technique...
The white balance issue (thanks for bringing it to the forefront mcaplinger) certainly is a pesky one for imagery geeks like myself. Where there is a calibration target in the image its a pretty simple task to equalize RGB values based on a gray target. Mcapliger's general adjustment values are certainly a move in the right direction. But with so many differing landscapes and lighting values based on time of day, sun angle, and atmosphic dust content, and absolutely nothing to accurately base mean gray values on, I've been struggling with how to find a method of determining just how one would go about getting an accurate white balance with any given image. This morning I stumbled across a technique that after a some extensive tests seems to show promise... I was remembering how if one takes a color image (any image that is, Earth based or not), and copies it into another layer, inverts the color, and reduces the opacity of that inverted upper layer to 50%, the transparent inverted colors cancel out the colors of the original image below, leaving a blank neutral grayscale image. Using this concept and and a few tools in Photoshop (requires any CS version) I created custom photo filters (that vary slightly on a per image basis) that appear to white balance the Martian atmospheric tinge on calibration targets near perfectly, and so by extension one could argue the landscape as well. Here's the technique I used for the following examples (I suppose one could loosely refer to it as "IBF" or Invert > Blur > Filter): 1. Open an MSL image in Photoshop. 2. Duplicate the "Background" layer. You now have a "Background copy" layer above the original. 3. With the "Background copy" layer selected, choose "Image > Adjustments > Invert", then "Filter > Blur > Average". You should now have a bluish single colored blank layer. 4. Use the eye dropper tool to select this color as your foreground color on the tools palette. Now turn this layer off so you can see the original image. 5. 
Now select your MSL image layer again ("Background") and choose "Image > Adjustments > Photo Filter...".
6. In the dialog that opens, choose the "Color" radio button, click on the default color swatch, and assign it the bluish foreground color you saved on the tools palette.
7. Move the slider to 95% (chosen because at 95% the calibration target gray RGB values come out closest). Now, if the image you're using is a landscape shot, you're going to notice the color has washed out quite a bit. This is the filter at work. Just do the following.
8. Choose "Image > Adjustments > Hue/Saturation..." and increase the Saturation slider to about 55-60.
This last adjustment is about where most of the tests I ran on landscapes seemed to restore the color to about the intensity of the original, though admittedly it's arbitrary. In fact, the calibration target samples below only seemed to require a 25-30 increase in saturation, as any higher seemed to oversaturate them. I take this to perhaps be an indication that the farther away the target, the more saturation "recovery" must be applied (due to the extra desaturating effect of the atmosphere?). Using this technique on an image-by-image basis (i.e. with the precise color of the filter varying as per the inverted, blur-averaged color of the original), I was able to achieve the following results. As the technique seemed to almost perfectly balance the white, gray, and black levels of the calibration targets, could we then assume that the landscape color values must be similarly accurate? Hmmmm. -------------------- "We shall not cease from exploration, and the end of all our exploring will be to arrive where we started and know the place for the first time." -T.S. Eliot
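For readers without Photoshop, steps 2-7 can be approximated numerically. Photoshop's Photo Filter math is not publicly documented, so the multiplicative filter below is only one plausible reading of the recipe; the normalization step also makes the kinship with the classic gray-world algorithm explicit:

```python
import numpy as np

def ibf_white_balance(img, strength=0.95):
    """Numeric sketch of the 'Invert > Blur > Filter' recipe.

    img: float RGB array with values in [0, 1].
    The 'Filter > Blur > Average' of the inverted layer is just
    (1 - mean color); applying it as a brightness-preserving
    multiplicative filter is an approximation of Photoshop's
    Photo Filter, which is not publicly specified.
    """
    mean = img.reshape(-1, 3).mean(axis=0)     # Filter > Blur > Average of...
    filter_color = 1.0 - mean                  # ...the inverted layer
    gain = filter_color / filter_color.mean()  # normalize: keep overall brightness
    gain = 1.0 + strength * (gain - 1.0)       # the 95% density slider
    return np.clip(img * gain, 0.0, 1.0)
```

On a uniformly yellowish test image this pulls the channel means most of the way toward neutral, which is why it behaves much like a gray-world correction in practice.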
|
|
|
Oct 15 2012, 08:34 PM
Post
#185
|
|
Senior Member Group: Members Posts: 2511 Joined: 13-September 05 Member No.: 497 |
Sorry for the long post but if this works as a fairly accurate white balancing trick I wanted to get everybody's take on it and share the technique... How is this different than any other auto white-balance algorithm? It looks to me like a variant of the standard "gray world" algorithm. http://therefractedlight.blogspot.com/2011...gray-world.html If you want to make the average color of the scene neutral, it works fine, but that may not be what you really want to do. -------------------- Disclaimer: This post is based on public information only. Any opinions are my own.
|
|
|
Oct 15 2012, 08:46 PM
Post
#186
|
|
Senior Member Group: Members Posts: 3516 Joined: 4-November 05 From: North Wales Member No.: 542 |
It looks (on my monitor, to my eyes, adjusted to current ambient lighting from a tungsten filament lamp) too blue to be true. I'll check again in the morning.
|
|
|
Oct 15 2012, 08:49 PM
Post
#187
|
|
Member Group: Members Posts: 154 Joined: 19-September 12 Member No.: 6658 |
With Deimos's arguments above (40:60 sky:Sun), are we back to a slight color tint from the sky?
I would appreciate that (of course, the human eye would compensate for it if you were standing there). Regarding Ed's approach: all grayer parts in the image (clean parts of the rocks, for example) would come out too blue because of the averaging done in step 3, assuming that most of the image is somewhat yellowish. |
|
|
Oct 16 2012, 12:08 AM
Post
#188
|
|
Member Group: Members Posts: 866 Joined: 15-March 05 From: Santa Cruz, CA Member No.: 196 |
The calibration target is looking pretty dusty already; is there any means to clean it?
Sorry to pose such a noob question, but how else do we prevent the calibration from skewing toward Martian dust tones going forward? UPDATE: if anyone has similar questions, Joe's post and others surrounding it cleared up a lot of this, since I missed that whole discussion. Nevertheless, since it would dust up within a couple of months, I guess the color target was only included for a brief post-landing calibration sanity check? I was envisioning the arm lurching up to brush the dust off while doing its best not to clobber critical components... |
|
|
Oct 16 2012, 12:22 AM
Post
#189
|
|
Senior Member Group: Members Posts: 2511 Joined: 13-September 05 Member No.: 497 |
the calibration target is looking pretty dusty already, is there any means to clean it? In a word, no. I think people may be confused about the difference between calibration and white balance. Calibration is removing instrument signature. Once it's done, it doesn't need to be done again as long as the instrument stays stable (and there is not much reason for it not to be). White balance is making white things look white in a particular image regardless of whether they would "really look white in reality" (whatever that means). What I was attempting to do was more the former than the latter. -------------------- Disclaimer: This post is based on public information only. Any opinions are my own.
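The distinction can be made concrete in a few lines. The sketch below shows the textbook version of each operation; the frame names and the pipeline are generic illustrations, not MSL's actual processing:

```python
import numpy as np

def remove_instrument_signature(raw, dark, flat):
    """Calibration: remove what the instrument imposed on the scene.

    Textbook radiometric pipeline -- subtract a dark frame, divide by a
    normalized flat field.  Once characterized, these corrections stay
    valid as long as the instrument is stable.
    """
    corrected = raw.astype(float) - dark
    flat = np.where(flat > 0, flat, 1.0)  # guard against dead flat-field pixels
    return corrected / flat

def white_balance(img, neutral_patch_rgb):
    """White balance: a per-image choice, scaling channels so a patch
    assumed to be neutral comes out gray."""
    patch = np.asarray(neutral_patch_rgb, dtype=float)
    gains = patch.mean() / patch
    return img * gains
```

The first function depends only on the instrument; the second depends on the scene, the illumination, and on what you decide "should" look white.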
|
|
|
Oct 16 2012, 02:38 AM
Post
#190
|
|
Member Group: Members Posts: 222 Joined: 7-August 12 From: Garberville, CA Member No.: 6500 |
How is this different than any other auto white-balance algorithm? It looks to me like a variant of the standard "gray world" algorithm. http://therefractedlight.blogspot.com/2011...gray-world.html If you want to make the average color of the scene neutral, it works fine, but that may not be what you really want to do. Well, you're right, it is kind of a variant, but with a different mechanism. What I like about the filter approach is this: traditional white-balancing adjustments like "Gray World" curves balancing, levels tweaks, and most of the "Auto White Balance" algorithms I've experimented with involve directly altering the separate RGB input levels to hopefully achieve a balanced assumption of neutral gray, but in doing so they often alter the white and black intensity levels and the contrast. It can get really tricky. The filtering approach doesn't attempt to fundamentally alter the existing RGB relationships or drastically shift the white or black intensity levels; it just corrects the yellowish cast from the atmospheric light by using its directly inverted counterpart to filter it back toward neutral. That said, though professional graphics is part of my business, I'm certainly not a scholar of color science, and the knowledge of many members here is clearly beyond mine. What I do know is that, compared with the many differing and sometimes subjectively random results from the variety of white balance routines I've played around with, this seems to be a pretty quick and painless technique, and if the post-filtering calibration target grays and whites are any evidence, it offers a reasonably acceptable measure of accuracy too - if, as you said, "neutral" is the goal. -------------------- "We shall not cease from exploration, and the end of all our exploring will be to arrive where we started and know the place for the first time." -T.S. Eliot
|
|
|
Oct 16 2012, 08:26 AM
Post
#191
|
|
Senior Member Group: Members Posts: 1084 Joined: 19-February 05 From: Close to Meudon Observatory in France Member No.: 172 |
I tend to be more on the side of expecting the sky to color the terrain I totally agree with Deimos. When you look carefully at the VL1 images shown here (I know it's a kind of "old" visual science for some bloggers here, BUT let's go "back to the basics"!), http://www.unmannedspaceflight.com/index.p...st&p=193322 you will easily notice that on Sols 1520 and 1557, with maximum sky opacity, the terrain is much brighter than on Sols 1298 and 2001 (of course, all images were taken with the same gain and offset). Besides, like Don, I think there is still good science to be retrieved from the VL images... So, in my opinion, Deimos is absolutely right. |
|
|
Oct 18 2012, 10:42 AM
Post
#192
|
|
Member Group: Members Posts: 154 Joined: 19-September 12 Member No.: 6658 |
I'm still somewhat puzzled by the brightness of the images. There is a nice image from sol 71 with the rover and the surface both in one frame. As you can see below, I compared this to an image from sol 61 (green filter). In the middle is a brightness-corrected version, matched to the average surface brightness of the sol 61 image (and most other images).
As you can see, the rover would now be too bright - so, in reverse... This would be mcaplinger's color correction set to about 50% with somewhat reduced brightness (not linear, as the highlights are still in, and not as much as in the upper comparison). |
|
|
|
Oct 18 2012, 11:07 AM
Post
#193
|
|
Senior Member Group: Members Posts: 1619 Joined: 12-February 06 From: Bergerac - FR Member No.: 678 |
Ronald, you just can't make deductions starting from STRETCHED Navcam pictures. The big difference between Navcam and Mastcam is that Mastcam pics are not stretched.
But I honestly can't see what the debate is about. This is just a question of white balance. Even with an ordinary camera (compact, bridge or reflex, whatever), you can adjust it (sunlight, shadow, cloudy, tungsten, flash, etc.), and this can lead to a reddish or a bluish picture. So I guess the Mastcams are tuned to a sunlight white balance (maybe around a temperature of 5200 K). To me, as a photographer, they are just correctly white balanced, and it's normal that there is some lack of contrast, especially with the Mastcam-100, because it's a telephoto lens and there is a lot of glass between the sensor and the subject. And most of the time we have scenery imaged with a very high Sun, which leads to a lack of shadowing, and hence of contrast. -------------------- |
|
|
Oct 18 2012, 01:05 PM
Post
#194
|
|
Member Group: Members Posts: 154 Joined: 19-September 12 Member No.: 6658 |
|
|
|
Oct 18 2012, 02:49 PM
Post
#195
|
|
Senior Member Group: Members Posts: 3648 Joined: 1-October 05 From: Croatia Member No.: 523 |
Mastcam images should be your reference point, not Navcams. The former use a square-root encoding that matches the sRGB gamma pretty well, while Navcam images seem to be returned in linear A/D-converted form, which makes their contrast look enhanced on computer screens. That is in addition to the raw stretch that makes the darkest areas black and the brightest areas white.
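How close square-root companding comes to the sRGB transfer curve is easy to verify numerically. The comparison below uses the standard IEC 61966-2-1 sRGB encoding; the square-root encoder stands in for the kind of companding used to pack wider A/D values into 8 bits:

```python
import numpy as np

def srgb_encode(x):
    """Standard sRGB transfer function (IEC 61966-2-1)."""
    x = np.asarray(x, dtype=float)
    return np.where(x <= 0.0031308, 12.92 * x, 1.055 * x ** (1 / 2.4) - 0.055)

def sqrt_encode(x):
    """Square-root companding of linear values in [0, 1]."""
    return np.sqrt(np.asarray(x, dtype=float))

lin = np.linspace(0.0, 1.0, 256)
max_diff = np.abs(srgb_encode(lin) - sqrt_encode(lin)).max()
print(f"max difference over [0, 1]: {max_diff:.3f}")  # about 0.04 of full scale
```

So the two curves agree to within a few percent of full scale, which is why sRGB-displayed Mastcam data looks roughly right while linear Navcam data does not.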
-------------------- |
|
|