Processing VIMS cubes, An attempt at "true" color
slinted
post Sep 20 2006, 11:00 PM
Post #46


Member
***

Group: Admin
Posts: 468
Joined: 11-February 04
From: USA
Member No.: 21



UGordon, first off, this is an amazing project you have undertaken! I'm really impressed with your results so far. It is a perfect use of the VIMS dataset, especially given the large number of vastly different targets.

As to the VIMS cube to color conversion, I think you may be better off starting fresh rather than using Andrew Young's conversion code. Some of the hue problems you described might be explained by his 'weighting' of the color matching functions with Illuminant C. It is a poor approximation of direct sunlight, and more importantly, I'm fairly sure his use of it as a weighting for the color matching functions (and your scaling of it to D65) actually changes the hues arbitrarily.

From my understanding of spectrum to XYZ conversion, the best way to handle the calculation is to integrate (Newton-Cotes would be a good method) your spectrum directly through the CIE Color Matching Functions ( http://www.cvrl.org/cmfs.htm , CIE 1931 2-deg modified by Judd (1951) and Vos (1978)). That will give you XYZ coordinates, appropriate to whatever you deem the source lighting to be. Then, in a separate step, you convert from source XYZ to destination XYZ (D65). XYZ scaling isn't the best way to handle this conversion since it changes the hues inappropriately and doesn't change the luminance at all. You might want to look into either Bradford or Von Kries as an alternate method.

What you choose as a source white point is another matter entirely. It depends on what you want your true color to represent. Here are two scenarios:

1) An astronaut is sitting inside a ship, looking out the window and sees the scene. In this case, the viewer would be adapted to the internal lighting of the ship and the lighting of the scene would have little to no effect on the way it is perceived. If this is the goal, no chromatic adaptation is necessary. You can simply convert the original XYZ values directly to D65 according to the matrices listed earlier in this thread.

2) An astronaut is floating in space, with only the VIMS scene visible to them (or, in a similar scenario, a person is seeing the VIMS scene as if it were through the eyepiece of a very powerful telescope).

If you are aiming for #2, then you can use the Bradford or Von Kries method for adapting into D65 from whatever you deem the source white to be.
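For what it's worth, here is a minimal sketch of the Bradford adaptation step in Python. This is not anyone's actual pipeline code, just an illustration: the matrix is the standard Bradford cone-response matrix, and the white points are assumed to be given as XYZ triples scaled so Y = 1.

```python
import numpy as np

# Standard Bradford cone-response matrix (XYZ -> "sharpened" cone responses)
M_BRADFORD = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

def bradford_adapt(xyz, white_src, white_dst):
    """Adapt an XYZ color from a source white point to a destination
    white point using the Bradford transform: convert both whites to
    cone responses, scale by their ratio, and convert back."""
    rho_s = M_BRADFORD @ np.asarray(white_src, float)
    rho_d = M_BRADFORD @ np.asarray(white_dst, float)
    adapt = np.linalg.inv(M_BRADFORD) @ np.diag(rho_d / rho_s) @ M_BRADFORD
    return adapt @ np.asarray(xyz, float)
```

By construction, the source white maps exactly onto the destination white; colors near the source white land near the destination white, with hues shifted more plausibly than a plain XYZ scaling would manage.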
ugordan
post Sep 21 2006, 07:46 AM
Post #47


Senior Member
****

Group: Members
Posts: 3648
Joined: 1-October 05
From: Croatia
Member No.: 523



Thanks, Slinted! I'll certainly have a look at the Bradford method and compare the results, even though its conversion matrix seems very similar to the XYZ scaling one. The planetary targets are mostly dull, so I figure the different methods won't produce radically different results. I already tried with the normal XYZ CMFs, but got weird results whether or not I removed the solar spectrum from the data. My sort of ground-truth check for color is Enceladus, as well as Mimas and Dione. Enceladus is basically all white, and if I don't get 1.0/1.0/1.0 as RGB intensities I assume the result is bogus. Currently, it fits neatly, with all three coming out nice and neutral, and Dione's dark-stained side ever-so-slightly greenish, which as I understand it is correct.
What I don't understand, though, is how the heck neutral Enceladus turns white with my current code while the Moon doesn't. If they both have a flat spectrum, they should both turn out the same color. Mimas did: it's gray, and though it's substantially darker than Enceladus, that made no difference to the code.

The Moon result reminds me of shots like this, Apollo 15 and Apollo 15 #2 for example (last two ripped off from www.apolloarchive.com). I haven't looked further, but there are other similar ones as well.

Is it even remotely possible that the Moon is pale brownish, and that it's because of the great brightness difference between it and the dark sky that it appears gray and washed out to us on Earth? Don Mitchell's palette would certainly suggest so.

Another image, which is more a demonstration of color than a real image. For all practical purposes VIMS couldn't resolve the Galilean satellites; they were merely pixels when viewed in normal resolution mode. The spectra are there, however. This composite of Io, Europa and Ganymede is very, very enlarged and is only meant to show the color. The cubes were taken at about 10 million km.
Attached Image


Note how yellowish Io turned out, compared to Europa and Ganymede. This is in good agreement with the "true" color of Io, unlike those enhanced pizza-shots Galileo and the Voyagers took. Europa and Ganymede turned out colorless, Ganymede perhaps in part due to the brightness scaling applied here. The image doesn't show the relative brightnesses of the moons; I scaled each one to maximum. Note that the VIMS visual channel exhibits a kind of chromatic aberration where the spectral channels don't all project onto exactly the same pixel. This can cause some color shifting at point-like sources like these (in fact it's noticeable), so the end result is just illustrative. Nevertheless, it's nice to see Io standing out even at this distance.


Guest_DonPMitchell_*
post Sep 21 2006, 12:55 PM
Post #48





Guests






It is true that Judd and Vos made improvements to the color matching functions. But I don't believe those changes are part of the XYZ color values used in the television and sRGB standards. So I think you want to integrate against the CIE 1931 matching functions, not the Judd or Vos versions.

Why is there a white-point correction? Are you calculating an image of the reflectance function, or an image of what would be seen by an observer in space?
ugordan
post Sep 21 2006, 02:25 PM
Post #49


Senior Member
****

Group: Members
Posts: 3648
Joined: 1-October 05
From: Croatia
Member No.: 523



QUOTE (DonPMitchell @ Sep 21 2006, 01:55 PM) *
Why is there a white-point correction? Are you calculating an image of the reflectance function, or an image of what would be seen by an observer in space?

As I was saying, using Young's matching code, uniform reflectance gives me converted RGB values of, let's say, 0.9, 0.8, 1.0 - somewhat bluish. I want that to be 1.0, 1.0, 1.0 so the surface turns out white (this, of course, gamma-corrected and scaled to 255). Hence the XYZ scaling. I'm trying to calculate what would be seen by an observer in space. What's the difference from the other approach anyway?

The caveat with Young's code, I think, is that it takes the reflectance spectrum of a material as illuminated by Illuminant C. Hence, I have to divide the observed VIMS brightness spectra by the solar spectrum. The code then gives me the appearance of the object under that illumination. As I say, Illuminant C is an approximation of daylight on Earth, so an all-white material (such as Enceladus, effectively) will turn out bluish. That's where the Illuminant C -> D65 scaling comes in. It's not a magic fudge factor I made up, but one found in the table slinted also referenced. This correction gives me 1.0, 1.0, 1.0 for white materials.
This worked well for the time being. I've been more focused lately on trying to remove those pesky lines than on doing the color matching from scratch. In any case, a more "neat" matching code is on my todo list.


Guest_DonPMitchell_*
post Sep 21 2006, 02:43 PM
Post #50





Guests






Reflectance is independent of the illuminant. It should just be the ratio of reflected to incident spectral density.

sRGB is defined with a white point of D65. That means that sunlight should give you very close to RGB = 1.0, 1.0, 1.0. A flat spectrum would be illuminant E, but you don't want to make that your white point.

I don't think you should be doing any of that division by spectra and whitepoint transformation. The VIMS camera is giving you the information to calculate absolute spectral density values. Just project those onto the matching functions to get XYZ, and transform that to sRGB, and you're done. That will give you what a human observer would see in space.
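As a minimal sketch of what "project onto the matching functions, then transform to sRGB" means (hypothetical helper names; `spectrum` and `cmf` are assumed sampled on the same wavelength grid, and the full CIE 1931 table is not reproduced here), using the standard sRGB matrix and transfer curve from IEC 61966-2-1:

```python
import numpy as np

# Standard sRGB matrix: XYZ (D65 white) -> linear RGB.
XYZ_TO_RGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def spectrum_to_xyz(spectrum, cmf, dl):
    """Project a spectral power distribution onto the CIE 1931
    matching functions.  `spectrum` is (N,), `cmf` is (N, 3),
    `dl` is the wavelength step in nm."""
    return spectrum @ cmf * dl

def xyz_to_srgb(xyz):
    """XYZ -> gamma-encoded sRGB, clipped to [0, 1]."""
    rgb = np.clip(XYZ_TO_RGB @ np.asarray(xyz, float), 0.0, 1.0)
    return np.where(rgb <= 0.0031308,
                    12.92 * rgb,
                    1.055 * rgb ** (1 / 2.4) - 0.055)
```

Feeding the D65 white point (X, Y, Z roughly 0.9505, 1.0, 1.0888) into `xyz_to_srgb` comes out very close to 1.0, 1.0, 1.0, which is exactly the "D65 looks white in sRGB" behavior described above.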

Just for a sanity check, I ran some tests through my colorimetry code, to print out XYZ and RGB and chromaticity for some standard spectra:

CODE
Illuminant A:
XYZ = 24.9524822235107, 22.7151050567626, 22.7151050567626
RGB = 41.9157028198242, 18.7636756896972, 5.29991245269775
xy = 0.447566837072372, 0.407435506582260

Illuminant D65:
XYZ = 21.1444530487060, 22.2462940216064, 22.2462940216064
RGB = 22.2496452331542, 22.2456779479980, 22.2425174713134
xy = 0.312734514474868, 0.329031169414520

Solar Spectrum:
XYZ = 405.107116699218, 417.494018554687, 417.494018554687
RGB = 456.046386718749, 408.477874755859, 393.248840332031
xy = 0.323091745376586, 0.332970887422561

Illuminant E:
XYZ = 0.224980384111404, 0.224962189793586, 0.224962189793586
RGB = 0.271081715822219, 0.213312312960624, 0.204518526792526
xy = 0.333313554525375, 0.333286613225936


These seem to check, at least the chromaticity values are right. Illuminant E and pure sunlight (in space) both come out just slightly reddish. D65 is a better approximation of sunlight at sea level.

I would still claim that "true color" means you don't get to fiddle with anything to make the image look subjectively "better". You can't change the white point or gamma, or try to make Saturn look like it is being lit by a giant 6500 K tungsten lamp instead of the Sun. You have to just process the spectrum into XYZ and then into sRGB and show that to people. And it's up to them to make sure their monitor is sRGB compatible; that's not our problem.
JRehling
post Sep 21 2006, 06:06 PM
Post #51


Senior Member
****

Group: Members
Posts: 2530
Joined: 20-April 05
Member No.: 321



QUOTE (ugordan @ Sep 21 2006, 12:46 AM) *
Is it even remotely possible the Moon is pale brownish and it's because of great brightness difference between it and the dark sky that it appears gray and washed out to us on Earth?


Bingo.

Time for a lot of hedges, though. Color is not an objective property the way mass is. Color is something we perceive, and the perceived color of an object most definitely depends, as you suggest, on the level of illumination. It all comes down to the response of the three types of cones plus the one type of rod, and those are nonlinear with respect to level of illumination.

In dim light, the eye perceives no color because the rods respond but no cones do. At moderate-to-bright levels of lighting, the cones will respond according to the rough R/G/B reflectance of a surface. But at brighter levels, saturation can take place. A somewhat greenish object will look increasingly pale-to-white as illumination increases: the "green" cones reach full capacity while the red and blue responses start to catch up. Now to retract some of that -- it's not just the cones, but the contextually influenced color processing a couple of layers deeper than the retina. So the saturation effect can occur at moderate levels of illumination if the eye is otherwise dark-adapted.

If you get the opportunity to see the Moon in a daylit sky with low humidity (you may not live in an area where that ever really happens) and there happens to be a cumulus cloud close enough to compare it with the Moon, you'll see: The Moon is kind of brownish. To your eye, in those conditions. Having your eye be light-adapted makes all the difference.

The thing is, it doesn't make sense to say the Moon really "is" the way it appears when your eye is dark adapted or the way it appears when your eye is light adapted. That's your eye's baggage, and it has nothing to do with the Moon. There really isn't any such thing as true color, when it comes right down to it. It's necessarily subjective and contextual.
Guest_DonPMitchell_*
post Sep 21 2006, 07:03 PM
Post #52





Guests






[attachment=7578:attachment]

There's a brown Moon for you, taken by Galileo. I doubt if the image was created with any fancy colorimetric processing, but it is amusing.

Color perception is affected by viewing conditions. Among other things, your brain does a sort of automatic white-point compensation -- adaptation. Everyone has noticed how bad color photographs look when taken under indoor lighting, even though to our eye the lighting never seemed that bad.

For looking at dyes and fabrics or doing graphic arts, people buy expensive precision lighting, typically D50 lamps that meet ISO standards. Monitors generate their own light, so it's not as critical to have D50 lamps, except for the secondary effect of your visual system's color adaptation. Here is a nice article I found by an artist, about how he set up his workspace: Kevin Mills Photography.

However, sRGB is well defined. If you are synthesizing an image or if you have spectral sensor data, then there is one unique, standard thing to do: calculate 24-bit sRGB. All the messy stuff happens when people view it, but that is someone else's problem.
ugordan
post Sep 21 2006, 07:16 PM
Post #53


Senior Member
****

Group: Members
Posts: 3648
Joined: 1-October 05
From: Croatia
Member No.: 523



JRehling, those are all valid, good points. I said from the beginning true color is subjective. Heck, I even put "true" in quotes to emphasize that. One can still try, though...

Right, so here are a couple of checks using the regular CIE XYZ 1931 2-deg color matching functions. If I pass them a unity spectrum, the result is xy = 0.333, 0.333. The sRGB color comes out slightly reddish, as you suggested. That checks out with your result.
However, if I pass the solar spectrum (derived from the Cassini ISS calibration volume) I get xy = 0.3496, 0.3526. The sRGB comes out fairly reddish. This obviously differs from your xy = 0.32309, 0.33297; I'm still trying to figure out why. Even if it did match your result, you say it would still turn out reddish. Obviously, if I use this function on Enceladus, it too turns reddish. How accurate is that, especially taking into consideration that the eye will adapt to this reddish color of the sun's illumination and perceive Enceladus as being white? The image on the screen will remain reddish, though.
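For reference, the xy chromaticity figures being compared here are just the tristimulus values normalized by their sum, which a tiny helper makes explicit:

```python
def chromaticity(X, Y, Z):
    """Project XYZ tristimulus values to xy chromaticity coordinates."""
    s = X + Y + Z
    return X / s, Y / s
```

An equal-energy (unity) spectrum gives X = Y = Z and therefore xy = (1/3, 1/3), matching the check above; any excess of X over the other components pushes x above 1/3, i.e. toward red.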

Are you saying I should drop 255,255,255 as being white and settle for, say, 255,253,248, and hope the calibrated monitor setting will turn that into visually apparent white?


slinted
post Sep 21 2006, 08:52 PM
Post #54


Member
***

Group: Admin
Posts: 468
Joined: 11-February 04
From: USA
Member No.: 21



Good points made by all,

Part of my confusion on this might be coming from a misunderstanding about what the dataset is providing. Are the VIMS cubes in spectral power, or reflectance? (I apologize if I'm using these terms incorrectly.) Has the illuminant spectrum already been removed?

If the VIMS data is in spectral power, then I would say that chromatic adaptation is still necessary since, as Don said, D65 isn't solar spectrum. It is what we see here on Earth.

If it is a reflectance dataset, then wouldn't that spectrum have to be multiplied by the spectrum of the D65 illuminant before being passed to the CIE color matching functions?
ugordan
post Sep 21 2006, 09:14 PM
Post #55


Senior Member
****

Group: Members
Posts: 3648
Joined: 1-October 05
From: Croatia
Member No.: 523



VIMS cubes, in their raw form, capture the brightness of the source. During calibration, it's possible (in the official code) to choose to divide by the solar spectrum so you get the reflectance (I don't know if I'm using the term correctly; I'd say you get the "intrinsic color" of the target). As Don said, if you leave the solar spectrum in and simply integrate the CMFs, you get a reddish white point, i.e. not 255,255,255 in RGB.

Young's code worked on this intrinsic color and assumed Illuminant C (not the solar spectrum) on it. He was, I think, interested in what color the materials would be here on Earth, so that's where C came from. That's why I had to convert to the D65 white point to get 255,255,255.

I've now dropped his code. What I do now is: I still divide by the solar spectrum, multiply by the D65 spectrum (you can call this chromatic adaptation, but it's not a simple scaling as before; the full spectrum is applied, so it should be more accurate) and integrate the CIE functions. This eliminates the need for the XYZ scaling afterwards. The results are virtually indistinguishable from the previous method; nevertheless, I'm more comfortable now as I understand what I've actually done:



Left side: solar spectrum divide -> Young's algorithm -> XYZ scaling
Right side: solar spectrum divide -> D65 spectrum multiply -> CIE functions
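The right-hand pipeline is short enough to write down as a sketch (hypothetical helper names, not the actual processing code; all arrays are assumed sampled on a common wavelength grid):

```python
import numpy as np

def vims_to_xyz(radiance, solar, d65, cmf, dl):
    """Sketch of the pipeline described above: divide out the solar
    spectrum to get the intrinsic reflectance, re-illuminate with the
    D65 spectrum, then integrate against the CIE matching functions."""
    reflectance = radiance / solar   # remove the solar illumination
    relit = reflectance * d65        # re-light the target with D65
    return relit @ cmf * dl          # (N,) @ (N, 3) -> X, Y, Z
```

A spectrally flat reflector (radiance proportional to the solar spectrum) then lands exactly on the D65 white point, which is why the separate XYZ scaling step is no longer needed.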


JRehling
post Sep 21 2006, 10:32 PM
Post #56


Senior Member
****

Group: Members
Posts: 2530
Joined: 20-April 05
Member No.: 321



QUOTE (slinted @ Sep 21 2006, 01:52 PM) *
If the VIMS data is in spectral power, then I would say that chromatic adaptation is still necessary since, as Don said, D65 isn't solar spectrum. It is what we see here on Earth.

If it is a reflectance dataset, then wouldn't that spectrum have to be multiplied by the spectrum of the D65 illuminant before being passed to the CIE color matching functions?


It's worse than that. ;)

In most of these cases, monitors are inherently dimmer than the sunlit surfaces we're talking about, and the relative sensory-system response to the three primary colors can and does shift as you increase or decrease illumination. So let's say some body out there reflected twice as much green light as blue, in narrow bands only, and reflected no red at all. Merely having a display also emit twice as much green light as blue would not necessarily capture the color of that object; if the illumination is different, then a different ratio must be used to reproduce the ratio of color responses one would have seeing that object in daylight. An extreme example of this can make the apparent relative brightness of two patches depend on the illumination in which you view them -- even if both light sources have the solar spectrum! A related phenomenon is that it is often possible to see sun dogs or dim rainbows while wearing sunglasses, only to have the color disappear when you take the sunglasses off.

I computed once that Uranus (and bodies beyond that, for the most part) has a luminance that a computer monitor can match, but everything closer to the Sun than that is brighter than your monitor could display. So achieving the colors one would perceive from a spaceship window requires some trickiness.
ugordan
post Sep 22 2006, 08:07 AM
Post #57


Senior Member
****

Group: Members
Posts: 3648
Joined: 1-October 05
From: Croatia
Member No.: 523



QUOTE (JRehling @ Sep 21 2006, 11:32 PM) *
I computed once that Uranus (and bodies beyond that, for the most part) has a luminance that a computer monitor can match, but everything closer to the Sun than that is brighter than your monitor could display.

Now that would be something... getting a totally dark room around you with an image of Neptune on the screen that is precisely as bright as the real thing and your eye is accustomed to the low light. I wonder how dim that'd appear. It would be difficult to calibrate, though. I imagine you'd need one of those Light Intensity Measuring Thingies (LIMTs).

There are CRT monitors nowadays that have a sort of "Magic Bright" function that kicks up the brightness substantially, I imagine accurate luminances could be reached even at Saturn, though I'm not so sure about Enceladus and Tethys, them being so high-albedo and all.


ugordan
post Sep 22 2006, 06:41 PM
Post #58


Senior Member
****

Group: Members
Posts: 3648
Joined: 1-October 05
From: Croatia
Member No.: 523



I can understand the reasoning behind not touching the illumination, i.e. not removing the solar spectrum from the cubes. The problem with that is that the solar spectrum turns out reddish and everything has a distinct red hue. Here are a couple of comparisons; the left images show no spectrum manipulation, just simple integration through the CIE XYZ functions.




I suppose if you stared long enough at scenes like this, the eye/brain would automatically adapt to the color and make a new white-point, making Enceladus (uppermost left image) appear white again. The problem is, this is not gonna happen on a computer screen.

You be the judge of what looks more reasonable.


Guest_DonPMitchell_*
post Sep 22 2006, 07:06 PM
Post #59





Guests






QUOTE (ugordan @ Sep 22 2006, 11:41 AM) *
You be the judge of what looks more reasonable.


That doesn't look right to me. I think something is wrong somewhere in the processing. You've got me interested now, so let me drag a VIMS image into my image library and see if I can reproduce this effect.
ugordan
post Sep 22 2006, 07:45 PM
Post #60


Senior Member
****

Group: Members
Posts: 3648
Joined: 1-October 05
From: Croatia
Member No.: 523



Be my guest. For starters, here's a dump of the 64x64 Jupiter cube. The wavelengths are stacked top to bottom; there are 95 of them in 5 nm increments starting at 360 nm. They were linearly interpolated from the VIMS cube, as it has discrete wavelengths spaced on average 7-8 nm apart. The dark background was removed (actually, that was done onboard), a flatfield was applied, and the data were corrected for detector performance. The samples are linear.
Attached Image

This looks weird as a thumbnail, being a 64x6080 image. :D
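The interpolation step described above is essentially the following (hypothetical names; the real VIMS band centers vary and would come from the cube's wavelength backplane):

```python
import numpy as np

def resample_to_grid(wl_vims, spec_vims, n=95, start=360.0, step=5.0):
    """Linearly interpolate a VIMS spectrum (irregular ~7-8 nm band
    spacing) onto the regular grid used for the dump: 95 samples,
    5 nm apart, starting at 360 nm (so spanning 360-830 nm)."""
    grid = start + step * np.arange(n)
    return grid, np.interp(grid, wl_vims, spec_vims)
```

Note that `np.interp` holds the endpoint values flat outside the input range, so any 5 nm samples falling outside the instrument's actual coverage are just copies of the nearest measured band.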


