Printable Version of Topic


Unmanned Spaceflight.com _ Cassini PDS _ Processing VIMS cubes

Posted by: ugordan Sep 10 2006, 07:51 PM

Right, a suggestion I made in http://www.unmannedspaceflight.com/index.php?s=&showtopic=3125&view=findpost&p=66503 made me wonder why not try it myself; a bunch of data was sitting on the PDS, after all. After a hassle figuring out just how the image cubes are organized and how to read them, I was finally able to produce some results. This is all very rough work; consider it a first iteration only, and not particularly accurate.
Basically, I used the cubes to extract the visible spectrum in the 380-780 nanometer range, which was then fed into color-matching code by Andrew T. Young that I found at http://mintaka.sdsu.edu/GF/explain/optics/color/color.html.
The code integrates over 40 10-nm steps to produce CIE XYZ color components. I then converted these to RGB values.
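For anyone wanting to follow along, here's a rough sketch of what that integration boils down to, in C since that's what my code is in. The CMF tables aren't included here -- they'd be filled in from published CIE data -- and the function itself is just illustrative, not a copy of Young's code:

CODE
/* Sketch: integrate a spectrum sampled at 380-780 nm on a 10 nm grid
 * (41 sample points, i.e. the "40 steps") against CIE color matching
 * functions to get XYZ tristimulus values. The cmf_x/cmf_y/cmf_z tables
 * must be filled from published CIE data; they are parameters here. */
#define N_SAMPLES 41        /* 380, 390, ..., 780 nm */

void spectrum_to_xyz(const double spectrum[N_SAMPLES],
                     const double cmf_x[N_SAMPLES],
                     const double cmf_y[N_SAMPLES],
                     const double cmf_z[N_SAMPLES],
                     double *X, double *Y, double *Z)
{
    double x = 0.0, y = 0.0, z = 0.0;
    int i;
    for (i = 0; i < N_SAMPLES; i++) {
        x += spectrum[i] * cmf_x[i];
        y += spectrum[i] * cmf_y[i];
        z += spectrum[i] * cmf_z[i];
    }
    /* 10 nm step width; an overall scale factor is usually chosen
     * afterwards so the reference white lands where you want it. */
    *X = x * 10.0;
    *Y = y * 10.0;
    *Z = z * 10.0;
}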

I'm aware of at least three inaccuracies in my code so far. One: the code mentioned above apparently uses Illuminant C as the light source, not a true solar spectrum, so the color turns out bluish (it has a color temperature of 9300 K instead of 6500 K, AFAIK). For the moment I compensate by changing the final RGB white balance, but this is probably an inaccurate way to go. Two: I don't do bias removal on the cubes, which likely affects the outcome. Three: I don't use the precise wavelengths the code requires, but the closest ones available in the cube; I intend to fix this by interpolating between the nearest wavelengths.

All images are enlarged 4x.


The leftmost image is a 4-cube mosaic. The colors in all four frames turned out identical, which gives me at least some confidence. The image in the middle shows Dione's disc creeping in front of Saturn. Dione's disc appears elongated, probably because it moved considerably in its orbit as the lines were read out. The rightmost image shows Saturn badly overexposed; the part below the ring shadows is blown out. From what I've seen browsing through the PDS, a lot of the cubes are badly overexposed at some wavelengths.

Here are a couple of Jupiter images. I'm not very satisfied with them as they look somewhat greenish, but overall the color looks believable:



Lastly, two Titan composites. They turned out way more reddish than I thought they would.


It'll be interesting to see how much the results change once I get a proper processing pipeline working.

Posted by: Malmer Sep 12 2006, 09:48 PM

It's a really great "reality check" for the ordinary ISS images.

Posted by: dilo Sep 12 2006, 10:21 PM

Very nice work, gordan... now we see the colors!
Only one observation: in the second 4-cube image, you say that dark, elongated feature in front of Saturn is Dione's disc. To me, it seems evident we are seeing Dione's shadow projected onto Saturn's atmosphere: that would easily explain why it is so dark and has an elongated/deformed shape... don't you agree?

Posted by: ugordan Sep 13 2006, 06:56 AM

QUOTE (dilo @ Sep 12 2006, 11:21 PM) *
To me, is evident we are seeing the Dione shadow projected on the Saturn atmosphere: this would easily explain why is so dark and with an elongated/deformed shape... do not you agree?

VIMS cube : v1492290669_1.qub, time : 2005-04-15 20:44:12 thru 2005-04-15 20:49:59

Solar System Simulator view http://space.jpl.nasa.gov/cgi-bin/wspace?tbody=699&vbody=-82&month=4&day=15&year=2005&hour=20&minute=45&fovmul=1&rfov=15&bfov=30&porbs=1&showsc=1.
There is no way any but the closest ring moons could have cast a shadow onto Saturn; it was still far from the autumn equinox at the time.
BTW, all images except the first one are single cubes with a 64x64 pixel spatial resolution.

Posted by: ugordan Sep 13 2006, 08:56 AM

QUOTE (Malmer @ Sep 12 2006, 10:48 PM) *
its a really great "realitycheck" for the ordinary ISS images.

Indeed. I was surprised how close to the official releases the Saturn and Jupiter images turned out. Frankly, I was a bit skeptical about the color balance in some of the CICLOPS releases, but these images (if they turn out more or less correct) prove me wrong. There are images that do look strangely color-balanced, such as http://ciclops.org/view.php?id=1270 and especially the earlier http://ciclops.org/view.php?id=97 releases, but http://ciclops.org/view.php?id=79 is probably the closest to the left Jupiter VIMS image, especially considering that I increased the gamma in my images, which reduced the contrast and washed out the colors a bit while making them a bit more realistic, IMHO.

Saturn also turned out remarkably similar to for example http://ciclops.org/view.php?id=1474 and http://ciclops.org/view.php?id=1947, again only differing in saturation and brightness. Then again, there's http://ciclops.org/view.php?id=1112...

The most notable exception I can see is Titan which turns out much more http://ciclops.org/view.php?id=1407 than red in ISS releases.

Posted by: CAP-Team Sep 13 2006, 02:16 PM

It looks as if the colors match Voyager's a lot more closely. The images of Titan look like the ones Voyager took.

Posted by: Malmer Sep 13 2006, 02:49 PM

QUOTE (ugordan @ Sep 10 2006, 09:51 PM) *
I'm aware of at least three inaccuracies in my code as of yet: one is the above sampled code apparently uses Illuminant C as the light source, not true solar spectra so the color turns out bluish (has a temp. of 9300 K instead of 6500 K, AFAIK). I tried to compensate at the moment by changing the final RGB white balance, but this is probably an inaccurate way to go. Another inaccuracy is I don't do bias removal from the cubes. This likely affects the outcome. Also, I don't use the precise wavelengths the code requires, but use the closest one in the cube. I intend to fix this by interpolating between nearest wavelengths.


Another question you might ask yourself is what reference white to use. Should one use the solar spectrum as seen from Earth through Earth's atmosphere (D65), or an unfiltered solar spectrum (almost a blackbody at 5780 K)? Check out http://casa.colorado.edu/~ajsh/colour/Tspectrum.html

/Mattias

Posted by: Malmer Sep 13 2006, 03:04 PM

QUOTE (ugordan @ Sep 13 2006, 10:56 AM) *
Indeed. I was surprised how close to official releases the Saturn and Jupiter images turned out. Frankly, I was a bit skeptical about color balances in some of CICLOPS releases, but these images (if they turn out more or less correct) prove me wrong. There are images that do look strangely color-balanced such as http://ciclops.org/view.php?id=1270 and especially the earlier http://ciclops.org/view.php?id=97 releases, but http://ciclops.org/view.php?id=79 is probably the closest to the left Jupiter VIMS image, especially when taking into consideration I increased the gamma in my images which reduced the contrast and washed out the colors a bit, but making them a bit more realistic IMHO.

Saturn also turned out remarkably similar to for example http://ciclops.org/view.php?id=1474 and http://ciclops.org/view.php?id=1947, again only differing in saturation and brightness. Then again, there's http://ciclops.org/view.php?id=1112...

The most notable exception I can see is Titan which turns out much more http://ciclops.org/view.php?id=1407 than red in ISS releases.



It would be cool to use VIMS to "teach" compositing software what to do with, for example, 3 ISS channels at different wavelengths. Sort of like what Daniel Crotty is doing for the MER rovers (only with different "lessons" for each body in the Saturnian system).

That way one could get extremely "real" colors from relatively sparse ISS spectral data.

/Mattias

Posted by: dilo Sep 13 2006, 04:17 PM

QUOTE (ugordan @ Sep 13 2006, 06:56 AM) *
There is no way any but the closest ring moons could have cast a shadow onto Saturn, it was still far from autumn equinox at the time.

Thanks ugordan, I didn't consider the inclination of Saturn's equator...
By the way, your gallery is really stunning, great work!
This is a little enhancement of one of these pictures; it is not intended to be realistic and I cannot compete with your colorimetric precision. My only objective was to amplify the faint Jupiter atmospheric details and Europa's surface features. Hope you enjoy...

 

Posted by: DonPMitchell Sep 13 2006, 04:55 PM

How do you convert XYZ to RGB? Are you making sRGB? Here are the matrices for that transform, just in case:

CODE
ML_Matrix3x3 ML_RGBtoXYZ( 0.412410914897918,   0.357584565877914, 0.180453807115554,
                          0.212649390101432,   0.715169131755828, 0.0721815153956413,
                          0.0193317625671625,  0.119194857776165, 0.950390040874481);
ML_Matrix3x3 ML_XYZtoRGB( 3.24081206321716,   -1.53730821609497, -0.498586475849151,
                         -0.969243109226226,   1.87596642971038,  0.0415550582110881,
                          0.0556384064257144, -0.204007476568222, 1.05712962150573);

Posted by: ugordan Sep 13 2006, 08:33 PM

Yes, I'm using that XYZ to RGB matrix. I also had to apply scaling factors to convert Illuminant C to D65 (X*0.969459, Y unchanged, Z*0.922050) prior to the RGB conversion. Only then did I get white for uniform reflectance. The RGB output is linear; I manually adjusted gamma to 1.25 in Photoshop.
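In code form, the conversion step I just described amounts to something like this (a simplified sketch; the matrix values are the ones Don posted, the rest -- clamping, gamma choice, function names -- is purely illustrative):

CODE
#include <math.h>

/* Sketch: scale XYZ from Illuminant C to D65, multiply by the XYZ->sRGB
 * matrix posted above, then gamma-encode for display. */
static double clamp01(double v) { return v < 0.0 ? 0.0 : (v > 1.0 ? 1.0 : v); }

void xyz_to_rgb(double X, double Y, double Z, double gamma,
                double *r, double *g, double *b)
{
    /* Illuminant C -> D65 white point scaling */
    X *= 0.969459;
    Z *= 0.922050;

    /* XYZ -> linear sRGB (matrix values from DonPMitchell's post) */
    double R =  3.24081206321716   * X - 1.53730821609497  * Y - 0.498586475849151  * Z;
    double G = -0.969243109226226  * X + 1.87596642971038  * Y + 0.0415550582110881 * Z;
    double B =  0.0556384064257144 * X - 0.204007476568222 * Y + 1.05712962150573   * Z;

    /* gamma-encode for display */
    *r = pow(clamp01(R), 1.0 / gamma);
    *g = pow(clamp01(G), 1.0 / gamma);
    *b = pow(clamp01(B), 1.0 / gamma);
}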

I think I have bias subtraction working now (I'm still unsure if it's correct), but flatfielding still seems suspicious. There are cubes with excessive vertical banding that the flat fields don't seem to touch. I'm also using a 40-point solar spectrum derived from the ISS calibration code. Somehow, simply dividing by the provided solar spectrum cube gives a yellowish result, when in fact it should be roughly the same. I'm now interpolating the discrete spectral wavelengths to get a nice 380-780 nm range.
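The interpolation itself is nothing fancy -- roughly this kind of thing (a sketch; the array names and the clamping at the ends are just for illustration):

CODE
/* Sketch: linearly interpolate VIMS samples at irregular band-center
 * wavelengths wl[0..n-1] onto an arbitrary target wavelength. */
double interp_at(const double wl[], const double val[], int n, double target)
{
    int i;
    if (target <= wl[0])     return val[0];
    if (target >= wl[n - 1]) return val[n - 1];
    for (i = 0; i < n - 1; i++) {
        if (target >= wl[i] && target <= wl[i + 1]) {
            double t = (target - wl[i]) / (wl[i + 1] - wl[i]);
            return val[i] * (1.0 - t) + val[i + 1] * t;
        }
    }
    return val[n - 1]; /* not reached */
}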

Here are the most recent results, this time enlarged 3x:

Mimas, Dione, Rhea, Hyperion, Iapetus


Jupiter still seems too greenish somehow. The rings definitely appear to have a distinct yellow hue to them, unlike the colorful, bluish ISS releases.

Posted by: DonPMitchell Sep 14 2006, 04:07 AM

Very good. Gamma 2.2 is technically what should be done, to make the image we see on the monitor proportional to actual scene radiance. What is the camera's radiometric response function like? Are the pixel values in the VIMS file logarithmic?

As for dark current, making the background be 0 black should be right. But check the camera response. Subtracting background before or after linearization of the pixel values gives a totally different result if the camera response is nonlinear (and it probably is).

I'm not sure about the Illuminant C business; I'm not sure I understand what you did. The sRGB matrix assumes a white point of D65, which all new monitors are supposed to use. Don't think about correcting the color on the camera end, just concentrate on linearizing its response. Just plug those spectral radiance values into the XYZ integrals, convert to RGB by matrix multiply, and do gamma of 2.2. And you should see what the camera saw then!

Posted by: ugordan Sep 14 2006, 07:09 AM

The pixels are linear as far as I know; a 12-bit A/D converter is used. I think a precommanded bias value might be set to get the values out of the converter's low nonlinearity portion, or it might simply be a matter of subtracting dark current. Either way, I used the background algorithm found in the calibration package and then subtracted it.

As for Illuminant C, as I said, the integration code I used produces a bluish white if I pass it a uniform input reflectance spectrum. When I used the regular CIE XYZ color matching functions, the colors turned weird; that'll need looking into. At the moment, using the supplied 40-step integration tables and the scaling factors found at http://www.brucelindbloom.com/index.html?Eqn_ChromAdapt.html for Illuminant C -> D65 got me an exact white for the test input. This can be double-checked with Mimas, Dione and Enceladus, which are all very gray and sort of provide "ground truth" for the white point.

Regarding gamma, if I set it to 2.2, the colors are all washed out and all contrast is lost. Somehow that doesn't seem right? Even with a 1.25 gamma, Hyperion's reddish tint is pretty much indiscernible.

Posted by: Malmer Sep 14 2006, 09:30 AM

QUOTE (DonPMitchell @ Sep 14 2006, 06:07 AM) *
Very good. Gamma 2.2 is technically what should be done, to get the image we see on the monitor to be proportional to actual scene radiance. What is the camera's radometric response function like? Are the pixel values in the VIMS file logarithmic?



Correct me if I'm wrong, but sRGB does not use a 2.2 gamma. It uses its own custom curve that is reasonably close to gamma 2.2...




The red curve is gamma 2.2,
the green is sRGB's "gamma" curve,
the blue is just linear.

The difference is mostly visible in the dark regions.
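For reference, the two encoding curves being compared are (standard formulas, nothing VIMS-specific; input is linear 0..1):

CODE
#include <math.h>

/* pure power law */
double encode_gamma22(double v) {
    return pow(v, 1.0 / 2.2);
}

/* sRGB transfer curve: linear segment near black, power segment elsewhere */
double encode_srgb(double v) {
    return (v <= 0.0031308) ? 12.92 * v
                            : 1.055 * pow(v, 1.0 / 2.4) - 0.055;
}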

/M

Posted by: Malmer Sep 14 2006, 09:43 AM

The pixels in the ISS images are not saved linearly (the sensor is reasonably linear, but they don't save the data that way). You have to use a custom lookup table to linearize those images. Maybe that's why you get strange results with the VIMS too. It looks to me as if you are treating the raw data as linear, and I don't think it's saved that way; I think they save it almost logarithmically to increase dynamic range. That's why you get an image that looks as though it has already had a gamma correction applied to it. (I might be wrong here...)

Here is the ISS data conversion table:
http://pds-rings.seti.org/cassini/iss/COISS_0011_DOCUMENT/REPORT/iss/navcontent/appf/12-8-12.pdf
I'm not sure if they use something similar on the VIMS data, but my guess is that they do.

ISS calibration report:
http://pds-rings.seti.org/cassini/iss/COISS_0011_DOCUMENT/REPORT/


info and software for calibrating VIMS:
http://pds-rings.seti.org/cassini/vims/

/M

Posted by: ugordan Sep 14 2006, 10:47 AM

I am very well aware of the ISS data specifics. While it is mostly the case that a LUT was used to convert linear (more or less, barring uneven bit weighting) 12-bit DNs to 8-bit, approximately square-root-encoded numbers, it is not always the case as you suggest. Sometimes the full 12 bits were returned. The third case is a 12->8 bit conversion that returns only the lower 8 bits, which is also linear. In any case, my ISS code takes care of all that.

However, I found absolutely no mention anywhere of lookup tables used in the VIMS instrument; in fact, only mentions of lossless 12-bit encoders. The calibration code I checked doesn't use such conversions anywhere. It would be logical to save the data via a square-root LUT if they were downsampling it, but everything points to that not being the case.
The raw data is linear, which can be seen from the calibration steps used in the provided code:

1. subtract dark background
2. divide by flatfield
3. divide by solar spectrum
4. multiply by detector performance as function of wavelength
5. convert into radiometrically correct units (optional)

Steps 1-4 should already yield accurate colors since I'm not interested in exact units.
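In C, steps 1-4 for a single band of a single line look roughly like this (a sketch only; the real code reads the dark, flatfield, solar and detector-response values from the calibration files, and the names here are mine):

CODE
void calibrate_band(double *samples, int n,
                    const double *dark, const double *flat,
                    double solar, double resp)
{
    int i;
    for (i = 0; i < n; i++) {
        double v = samples[i] - dark[i];   /* 1. subtract dark background */
        v /= flat[i];                      /* 2. divide by flatfield      */
        v /= solar;                        /* 3. divide by solar spectrum */
        v *= resp;                         /* 4. detector performance     */
        samples[i] = v;
    }
}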

Posted by: DonPMitchell Sep 14 2006, 03:39 PM

QUOTE (Malmer @ Sep 14 2006, 02:30 AM) *
correct me if im wrong but: sRGB does not use a 2.2 gamma. it uses its own custom crve that is reasonably close to gamma 2.2...




the red curve is gamma 2.2
the green is sRGB:s "gamma" curve
the blue is just linear.

The difference is mostly visible in the dark regions.

/M


I asked Gary Starkweather about this a few years ago, because the documentation talks about gamma of 2.2, and then also describes a piece-wise function. He told me the piecewise function was just introduced because there is some way to calculate it that is much faster than pow(x, 2.2). He claimed that gamma of 2.2 is what is intended. Just now, I found this http://www.srgb.com/srgbgammacalculation.pdf which seems to say the same thing.

Posted by: Malmer Sep 15 2006, 12:33 PM

I think the strange piecewise function has its benefits. It gives slightly better coverage in the black regions... when handling 8-bit images...

Posted by: Malmer Sep 15 2006, 12:50 PM

QUOTE (ugordan @ Sep 14 2006, 12:47 PM) *
I am very well aware of the ISS data specifics. While it is mostly the case a LUT was used to convert linear (more or less, barring uneven bit weighting) 12 bit DNs to 8 bit, approximately square root encoded numbers, it is not always the case as you suggest. Sometimes full 12 bits were returned. The third case being the 12->8 bit conversion by returning only the lower 8 bits. This is also linear. In any case, my ISS code takes care of that.

However, I found absolutely no mention anywhere of lookup tables used in the VIMS instrument, in fact only mentions of lossless 12 bit encoders. The calibration code I checked doesn't use such conversions anywhere. It would be logical to save it via a square root LUT if they were downsampling the data, but all points to that not being the case.
The raw data is linear which can be seen from the calibration steps used in the provided code:

1. subtract dark background
2. divide by flatfield
3. divide by solar spectrum
4. multiply by detector performance as function of wavelength
5. convert into radiometrically correct units (optional)

Steps 1-4 should already yield accurate colors since I'm not interested in exact units.



I stand corrected.

I guess I should have been a bit clearer about the fact that I was speculating...

While I'm at it, here is another speculation that may or may not have scientific grounds:

One reason that could make the images look washed out at gamma 2.2 is that the solar intensity at Saturn is only 1% of the intensity here on Earth. So maybe the images on your screen are actually brighter than the real thing, and since the eye has a logarithmic response, the colors wash out.

So maybe you should either calibrate the images to the actual intensity of your screen or just use an arbitrary gamma that looks good. I would go for the "looks good" option.



Speculatively yours
/Mattias

Posted by: ugordan Sep 15 2006, 06:10 PM

QUOTE (Malmer @ Sep 15 2006, 01:33 PM) *
I think the strange piecewise funktion has its benefits. it gives slightly better coverage in the black regions... when handling 8bit images...

If you're referring to the linear portion close to 0, it's supposedly there to suppress low-DN noise that would otherwise be apparent. As for different curves for different channels, I don't know anything about that. I assume that would totally trash the colors on all but precisely calibrated displays - not many of those, surely.

Posted by: ugordan Sep 15 2006, 06:26 PM

QUOTE (Malmer @ Sep 15 2006, 01:50 PM) *
One reason that could make the images look washed out in gamma 2.2 is that the solar intensity at saturn is only 1% of the intensity here on earth. So maybe the images on your screen are actually brighter than the real thing. And since the eye has a logatitmic response the colors wash out.

This could very well be true. But if we used absolute brightness, it wouldn't be very useful here on Earth, as everything would turn out pretty dim on a computer screen. It would be interesting, though, to know how the apparent brightness would change once the eye got accustomed to the low light levels at Saturn (if you were there, for example).
I guess we can settle on this whole matter being too subjective to quantify in a scientifically meaningful way. My primary goal was to get the "correct" hue of objects, principally Saturn's disc (more importantly the rings, as I was wondering whether they're as colorful as ISS would lead one to believe) and Titan, Iapetus and Hyperion. Pretty much everything else there is gray anyway. Brightness was of lesser importance, as that sort of thing is a hard beast to accurately "tame".

Posted by: Malmer Sep 15 2006, 11:40 PM

It would be fun to paint spheres in the colors you have derived and put them in a dark room with a dim light and just wait and see... smile.gif

/M

Posted by: DonPMitchell Sep 16 2006, 01:56 AM

One concept is to make an image that resembles what would be seen by a human being, there in space, looking through a window. As long as the colors are inside the gamut of your display, it is theoretically possible to achieve this:

1. Convert 12-bit VIMS values to linear radiance.
2. Correct for dark current
3. Apply gain flatfield correction
4. Apply spectral-sensitivity gain correction
5. Integrate against CIE observer functions to get XYZ
6. Convert XYZ to RGB
7. Apply global gain adjustment for desired image contrast.
8. Encode pixel channels, for example, iRed = int(pow(R, 1.0/2.2) * 255 + 0.5)

Gamma has a big effect on the color, so if 2.2 looks drastically wrong, I would double-check the process. You don't want to do everything carefully and then fudge the gamma in Photoshop. I bet there is just a small problem somewhere, and if you find it, the image will come out beautifully.

Posted by: Malmer Sep 16 2006, 09:42 AM

QUOTE (DonPMitchell @ Sep 16 2006, 03:56 AM) *
Gamma has a big effect on the color, so if 2.2 looks drastically wrong, I would double check the process. You don't want to do everything carefully and then fudge gamma in photoshop. I bet there is just a small problem somewhere, and if you find it, the image will come out beautifully.


Sometimes when I do gamma correction I convert to HLS, apply the gamma to the L channel and convert back to RGB. That way the colors don't desaturate as much... maybe it's not the most colorimetrically correct process, but it looks better sometimes...
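Roughly like this, if anyone wants to try it (a sketch using the standard HSL formulas; not claiming it's colorimetrically rigorous, and the function names are mine):

CODE
#include <math.h>

static double hue2rgb(double p, double q, double t)
{
    if (t < 0.0) t += 1.0;
    if (t > 1.0) t -= 1.0;
    if (t < 1.0 / 6.0) return p + (q - p) * 6.0 * t;
    if (t < 1.0 / 2.0) return q;
    if (t < 2.0 / 3.0) return p + (q - p) * (2.0 / 3.0 - t) * 6.0;
    return p;
}

/* Convert RGB (0..1) to HSL, apply gamma only to L, convert back. */
void gamma_on_lightness(double *r, double *g, double *b, double gamma)
{
    double maxc = fmax(*r, fmax(*g, *b)), minc = fmin(*r, fmin(*g, *b));
    double h = 0.0, s = 0.0, l = 0.5 * (maxc + minc);

    if (maxc > minc) {                           /* RGB -> HSL */
        double d = maxc - minc;
        s = (l > 0.5) ? d / (2.0 - maxc - minc) : d / (maxc + minc);
        if (maxc == *r)      h = (*g - *b) / d + (*g < *b ? 6.0 : 0.0);
        else if (maxc == *g) h = (*b - *r) / d + 2.0;
        else                 h = (*r - *g) / d + 4.0;
        h /= 6.0;
    }

    l = pow(l, 1.0 / gamma);                     /* gamma on L only */

    if (s == 0.0) {                              /* HSL -> RGB */
        *r = *g = *b = l;
    } else {
        double q = (l < 0.5) ? l * (1.0 + s) : l + s - l * s;
        double p = 2.0 * l - q;
        *r = hue2rgb(p, q, h + 1.0 / 3.0);
        *g = hue2rgb(p, q, h);
        *b = hue2rgb(p, q, h - 1.0 / 3.0);
    }
}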

/M

Posted by: ugordan Sep 16 2006, 03:39 PM

A mosaic of Saturn's northern latitudes:


Appearance of Jupiter and Saturn at similar phase angles:


Two shots of the lit side of the rings:

The rings in the left image are clipped off; it's not Saturn's shadow. The narrow, white ring is the F ring. The dark area in the middle image is Saturn's shadow.

Several narrow slices of the unlit side mosaicked together:


The unlit side appears less dull-brownish and more bluish, possibly due to ice particles forward-scattering blue light more strongly.

Posted by: john_s Sep 16 2006, 04:54 PM

Just checked out your gallery, ugordan- your processed Cassini images are truly glorious. Great work!

Posted by: ugordan Sep 16 2006, 05:56 PM

QUOTE (john_s @ Sep 16 2006, 05:54 PM) *
Just checked out your gallery, ugordan- your processed Cassini images are truly glorious. Great work!

Thanks! After comparing the results with the VIMS cubes, which ought to be more accurate, I may have to change the way I process Saturn (and Titan) composites a bit. The results I've gotten so far seem consistently a bit greener than the stuff that turns out here. A channel mix here and there should do the trick. wink.gif

Posted by: DonPMitchell Sep 16 2006, 06:47 PM

QUOTE (Malmer @ Sep 16 2006, 02:42 AM) *
sometimes when i do gamma correction i convert to HLS, apply gamma to the L channel and convert back to RGB. that way the colors dond desaturate as much... maybe its not the most colorimetrically correct process but it looks better sometimes...

/M


That's a good idea. I'd use Lab coordinates, which I think separate "color" from "luminance" even better.

Posted by: DonPMitchell Sep 16 2006, 07:01 PM

I also really like what you are doing. I hope you don't feel I am being discouraging. I am poking at the gamma issue, because I think you might have a bug someplace. Track it down, and then you will have rigorous "true color".

Posted by: Malmer Sep 16 2006, 09:02 PM

QUOTE (DonPMitchell @ Sep 16 2006, 08:47 PM) *
That's a good idea. I'd use Lab coordinates, which I think separate "color" from "luminance" even better.



Yes, that's obviously much better, but I'm often a bit too lazy... Guess I have to shape up. In light of what you and ugordan have done with Cassini and Venera, I feel that I really have to start from scratch with the stuff I have done.

I think that these pictures deserve the very best processing that is humanly possible. I believe it is important to make pictures that are true to reality. If it looks "dull", it should stay that way. These pictures are humanity's only way of experiencing these places, and I don't think they should be enhanced or distorted to make them look more exotic.

keep it real!

/M

Posted by: ugordan Sep 16 2006, 09:37 PM

QUOTE (DonPMitchell @ Sep 16 2006, 08:01 PM) *
I hope you don't feel I am being discouraging.

Not at all. There's generally just too much "mystique" about gamma, so I'd rather not mess around with it.
Here's a literal implementation of a 1/2.2 power function in the code:

The above shows nicely why I'm not too fond of gamma manipulations -- the terminator comes out too sharp (you don't get the feeling Jupiter is actually 3D), the colors and contrast are bleached, and it also brings out an ever-present non-dark background.
Hmm... or maybe that means I have to calibrate my monitor...

Here's that Saturn mosaic again:

Posted by: DonPMitchell Sep 17 2006, 04:17 AM

QUOTE (ugordan @ Sep 16 2006, 02:37 PM) *
Here's a literal implementation of a 1/2.2 power function in the code:


The real-life scene is probably not as saturated with color as a lot of photos show. Your gamma 2.2 images don't look like anything is going wrong in the software at all. I thought perhaps you were seeing something way off.

Posted by: ugordan Sep 17 2006, 04:44 PM

A couple of rough Saturn mosaics:


The blob in the middle image is Tethys.

Ring mosaics:



All images magnified 2x and gamma-corrected.

Posted by: Malmer Sep 17 2006, 07:17 PM

I think it looks great. Very subtle beautiful colors.

Keep em coming!

/M

Posted by: ugordan Sep 19 2006, 05:21 PM

A few Jupiter images; I fixed the greenish hue. It was due to my code subtracting the dark background when apparently it had already been subtracted from the Jupiter flyby cubes.


The middle and right images were taken not far apart; the rightmost image is much bigger because it utilized a high-resolution mode developed in flight, which increases the spatial resolution 3x at the cost of lowering the S/N ratio a bit. The leftmost image is also hi-res, but was taken well before closest approach.

Outbound crescent, 2 cube mosaic:


A couple of Europa transits:

It's a shame the transit on the right didn't catch more area to the left of Jupiter's limb; there was a great http://space.jpl.nasa.gov/cgi-bin/wspace?tbody=599&vbody=-82&month=1&day=2&year=2001&hour=10&minute=00&fovmul=1&rfov=2&bfov=30&showsc=1 of Io and Ganymede there. The rightmost image is actually very close in time to an ISS http://www.flickr.com/photos/ugordan/224555055/, taken on January 2nd, 2001.

Again, all images magnified 2x and gamma-corrected.

Posted by: Malmer Sep 19 2006, 07:23 PM

QUOTE (ugordan @ Sep 19 2006, 07:21 PM) *
A few Jupiter images, fixed the greenish hue. It was due to my code subtracting out dark background when apparently it was already subtracted from the Jupiter flyby cubes.


Just for the sake of inconsistency smile.gif



I think it looks great. Fantastic, actually. If I had to pick on something, I would have to mention the almost invisible residual pattern in the images: faint, thin horizontal and vertical lines. Some of them seem to show up in the same place in all the images (sometimes rotated 90 degrees, but I guess that's you rotating the images for display); others seem more random. Is this due to sensor sensitivity variation and/or does it have to do with the scanning process of the instrument?

Do you write your image processing software for some standard platform (IDL or something) or do you write totally standalone code?

/M

Posted by: ugordan Sep 19 2006, 07:37 PM

The vertical stripes (most noticeable is a blue line) in some darker shots... I've checked and double-checked the flatfields and the dark background removal code, and none of them even touches that. There are a load of them in the raw data; they appear to be static, so those are probably hot pixels on the CCD. Since the cubes are read out one line at a time, in push-broom mode with the spectrum of each line being split in the vertical dimension, the same samples in all lines of a given spectral band will suffer from the same problem. Hence the vertical lines. What I don't understand is why adequate flatfields/dark current models haven't been produced to deal with this. This only affects the visible channel; the IR channel seems fine (note the VIS and IR channels are actually two physically different parts). So far, there's nothing I can do about it.

The code is an ad-hoc implementation in C; it's rudimentary and messy, as I progressively modified it from simply reading the cubes to doing more complex stuff like color matching. It doesn't even output a nice PNG color image, just a raw 64x64 pixel dump with 3 interleaved RGB channels biggrin.gif

Posted by: Malmer Sep 19 2006, 08:16 PM

I guess you have to do an additional flatfield removal using a homebrewed flatfield. Try simply taking one row of an image of black space, scaling it to the width of the image, removing it from the data, and seeing what it looks like.



/M

Posted by: tedstryk Sep 19 2006, 08:18 PM

Great work! As for the calibration data, it could be that they are still working on it. A lot of these things weren't worked out at the time of the Jupiter encounter. It might be worth contacting someone to find out.

Posted by: ugordan Sep 20 2006, 07:24 AM

QUOTE (Malmer @ Sep 19 2006, 09:16 PM) *
I guess you have to make an additional flatfield removal using a homebrewn flatfield. Try simply takeing one row of an image of black space, scale to width of image and remove it from the data and see what it looks like.

I don't think it's a flatfield problem. Flatfields only correct for the varying sensitivity of different pixels. This appears to be primarily a dark current problem -- you can see the lines even in dark space, which flatfields can't remove. Remember, you divide or multiply the image by a flatfield; this, on the other hand, will obviously need subtracting out. I don't even know where to begin modelling the dark current. The provided dark current model uses a two-parameter table for each CCD pixel: one parameter is a standalone part and the other a variable part multiplied by exposure.
The model seems fine for normal resolution cubes (though even there it doesn't completely remove everything), but the hi-res one seems buggy. Furthermore, the Jupiter images, where the dark was subtracted by the instrument itself, show no apparent signs of those nasty lines. Then again, this could be because the Jupiter images used exposures 4 times shorter than Saturn targets (thanks to the increased light levels there), so the dark current didn't have as much time to build up.
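For clarity, that two-parameter model boils down to something like this per line (a sketch; the actual per-pixel tables come from the PDS calibration files, and the array names and units here are just for illustration):

CODE
/* dark = constant part + exposure-dependent part, per CCD pixel */
void subtract_dark_model(double *line, int n,
                         const double *dark_const,
                         const double *dark_rate,
                         double exposure)
{
    int i;
    for (i = 0; i < n; i++)
        line[i] -= dark_const[i] + dark_rate[i] * exposure;
}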

A document on the PDS says new models are generated every month or so, but it looks to me like the same tables appear on all 8 VIMS DVDs. I emailed one of the guys responsible for the PDS volumes, so we'll see what they have to say.

Posted by: Malmer Sep 20 2006, 08:48 AM

It was a typo; I actually meant dark current. Mixed them up in my head for some reason. (That's why I suggested you use a bit of black space to serve as a DC remover.)

Posted by: ugordan Sep 20 2006, 09:22 AM

Although the noise stays fixed in position, it changes intensity -- it's likely connected with exposure duration. So it's not likely to be as easy as subtracting a generic dark cube from all cubes. I don't think a small bit of dark space would do the trick, as there's also a fair bit of cosmic-ray noise; you would need to average many dark lines. There are many dark-space cubes that weren't even read out as a full 64-pixel-wide swath, so they're of limited usefulness. Additionally, the physical resolution of the CCD is AFAIK 192x96 pixels, with 192 being one spatial line and 96 being the visible spectral channels. All cubes have a maximum resolution of 64 spatial pixels. The normal mode of operation simply sums three pixels in a line to get 192/3=64 pixels. The high-res mode, however, doesn't sum the individual pixels, and that's where the 3x resolution increase comes from. This is where it gets a bit complicated: which 64 pixels out of the 192 are read out depends on the offset factors and is somewhat clumsy, so there's a fair margin for error if you don't know exactly what you're doing. The PDS document isn't all too clear on certain aspects.
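To make the two readout modes explicit, they amount to roughly this (a sketch only; the real offset bookkeeping is messier than shown and the function names are mine):

CODE
/* Normal mode: sum three physical pixels per output sample. */
void readout_normal(const double phys[192], double out[64])
{
    int x;
    for (x = 0; x < 64; x++)
        out[x] = phys[3 * x] + phys[3 * x + 1] + phys[3 * x + 2];
}

/* Hi-res mode: pass 64 of the 192 physical pixels through unsummed,
 * starting at some commanded offset (assumes offset + 64 <= 192). */
void readout_hires(const double phys[192], double out[64], int offset)
{
    int x;
    for (x = 0; x < 64; x++)
        out[x] = phys[offset + x];
}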

Which brings me to the point -- if this is as simple as we suggest, why didn't the VIMS folks do this in the first place? This noise problem must have been known at least since Saturn orbit insertion. IMHO, it's not up to us to fix the calibration and produce our own files that do basic stuff like dark current removal.

Of course, all this assumes I haven't screwed up in my code somewhere, but I'm more and more certain that's not the case.

Posted by: ugordan Sep 20 2006, 11:34 AM

Not particularly related to my processing, but here are a couple of past releases by the VIMS team of http://vims.artov.rm.cnr.it/data/res-jup.html and http://vims.artov.rm.cnr.it/data/res-tit.html. The Jupiter page shows the same image as my bottom 2nd image from the right. Their image was mirrored left-right, as Cassini never saw Jupiter from that vantage point. The dark spot on Jupiter's disc is not Io's shadow as they suggest; it's http://space.jpl.nasa.gov/cgi-bin/wspace?tbody=599&vbody=-82&month=1&day=2&year=2001&hour=08&minute=30&fovmul=1&rfov=2&bfov=30&showsc=1. They used a simple 3-channel RGB composite, but it's notable that there's some vertical banding present in their data as well. This is not the same noise we were discussing here, though; it's more likely an artifact of uneven scanning-mirror movement, with different lines unevenly exposed, hence the brightness gradient.
I bring up the Titan page because it does show the ugly vertical lines -- note the false-color composite labeled "VIS channel". Seems they were unable to remove them as well.

Posted by: Malmer Sep 20 2006, 03:23 PM

QUOTE (ugordan @ Sep 20 2006, 11:22 AM) *
Which brings me to the point -- if this is as simple as we suggest, why didn't the VIMS folks do this in the first place? This noise problem must have been known at least since Saturn orbit insertion. IMHO, it's not up to us to fix the calibration and produce our own files that do basic stuff like dark current removal.



It is a quick-and-dirty trick. In theory, if you shoot an image of a planet and then of black space (or with the lens cap on) with identical settings, the latter should be a perfect dark-current image. The problem is that it would be noisy, so you would need a few black images to average out the noise. The problem then is that Cassini would have to send back, let's say, 5 black images for each "real" image, leading to an extreme waste of downlink time and other issues like SSR space and stuff.

I guess that's why they make a mathematical model of the dark current instead.

I think that it is up to us to fix some of the calibration issues. The 2 Hz banding in the ISS images can be removed in lots of ways, though it's really hard to remove it completely. One way is to use the edge pixels that have not been exposed to light to create a DC image. Another way is to fit some kind of sine wave to that data and create a DC image from the sine wave. Another way is to use the pixel rows that are black space in an image to create a DC image, or fit a sine wave to that, or combinations of these methods, or some Fourier filter or other magic.

Maybe it's the same with these VIMS lines: they might be really hard to model exactly, and it's up to the user to choose a method that fits.

/M

Posted by: ugordan Sep 20 2006, 07:38 PM

Here are a couple of... hm... questionable(?) results.
First, a wholly unremarkable image -- three narrow strips taken during Cassini's second Venus flyby:

Their relative spacing isn't meaningful; it's just 3 cubes stacked together. Magnified 3x.
I don't know exactly where VIMS was pointed at the time, but my best guess from looking at the PDS data is that it was practically nadir, seeing near-equatorial latitudes. This was near C/A; the distance was around 8800 km to Venus' center (it appeared huge in the FOV, so no cloud details could be seen here - the vertical scale was about 2 degrees of latitude on Venus) and the phase angle was about 90 degrees. That's why I presume the three images show the terminator at right and the dayside at left.
I was under the impression Venus was yellowish-white, but this turned out bluish. Note that VIMS was far from its normal operating temperatures, so maybe the sensitivity was affected and the results could be way off. The exposure used was only 50 ms, as opposed to exposures at Saturn, which are typically 5000 ms and more. Any calibration uncertainties would likely be exaggerated here.
I'd be interested to hear what Don thinks about this, being a Venus expert and all...

EDIT: Again, referring to an official http://vims.artov.rm.cnr.it/data/res-ven.html page, they also got a bluish Venus, though using only simple RGB channels. That's also the reason their image turned out so noisy.

Second, a Moon image, super-resolution view of 4 cubes stacked together:

Again, I'm surprised by the way the image turned out. I'd expect the Moon to turn out gray, but this is more like that Mariner spacecraft composite of Earth and the Moon. Or some of the Apollo shots. The image in the middle is a Solar System Simulator view, blurred to get the same effective resolution. The rightmost image is the same simulated image, unblurred.
The VIMS composite was magnified 4 times. Yes, four! You can really see how VIMS traded off spatial for spectral resolution.

Posted by: slinted Sep 20 2006, 11:00 PM

ugordan, first off, this is an amazing project you have undertaken! I'm really impressed with your results so far. It is a perfect use of the VIMS dataset, especially given the large number of vastly different targets.

As to the VIMS cube to color conversion, I think you may be better off starting fresh rather than using the conversion code by Andrew Young. Some of the hue problems you described might be explained by his 'weighting' of the color matching functions with Illuminant C. It is a bad approximation of direct sunlight, but more importantly, I'm fairly sure his use of it as a weighting for the color matching functions (and your scaling of it to D65) actually changes the hues arbitrarily.

From my understanding of spectrum to XYZ conversion, the best way to handle the calculation is to integrate (http://en.wikipedia.org/wiki/Newton-Cotes_formulas would be a good method) your spectrum directly through the CIE Color Matching Functions (http://www.cvrl.org/cmfs.htm, CIE 1931 2-deg modified by Judd (1951) and Vos (1978)). That will give you XYZ coordinates appropriate to whatever you deem the source lighting to be. Then, in a separate step, you convert from source XYZ to destination XYZ (D65). XYZ scaling is not recommended for this conversion (http://www.brucelindbloom.com/index.html?ChromAdaptEval.html), since it changes the hues inappropriately and doesn't change the luminance at all. You might want to look into either the Bradford or Von Kries transform (http://www.brucelindbloom.com/index.html?Eqn_ChromAdapt.html) as an alternate method.

What you choose as a source white point is another matter entirely. It depends on what you want your true color to represent. Here are two scenarios:

1) An astronaut is sitting inside a ship, looking out the window and sees the scene. In this case, the viewer would be adapted to the internal lighting of the ship and the lighting of the scene would have little to no effect on the way it is perceived. If this is the goal, no chromatic adaptation is necessary. You can simply convert the original XYZ values directly to D65 according to the matrices listed earlier in this thread.

2) An astronaut is floating in space, with only the VIMS scene visible to them (or, in a similar scenario, a person is seeing the VIMS scene as if it were through the eyepiece of a very powerful telescope).

If you are aiming for #2, then you can use the Bradford or Von Kries method for adapting into D65 from whatever you deem the source white to be.
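For what it's worth, a Bradford adaptation is only a few matrix multiplies. Here's a rough sketch in C, using the commonly tabulated linear Bradford matrices (e.g. from Lindbloom's site) -- worth double-checking the values before relying on them:

CODE
/* Adapt one XYZ sample from a source white (src_white) to a destination
 * white (dst_white) with the simplified linear Bradford transform. */
static const double M[3][3] = {
    { 0.8951,  0.2664, -0.1614},
    {-0.7502,  1.7135,  0.0367},
    { 0.0389, -0.0685,  1.0296}
};
static const double Minv[3][3] = {
    { 0.9869929, -0.1470543,  0.1599627},
    { 0.4323053,  0.5183603,  0.0492912},
    {-0.0085287,  0.0400428,  0.9684867}
};

static void mul3(const double A[3][3], const double v[3], double out[3])
{
    int i;
    for (i = 0; i < 3; i++)
        out[i] = A[i][0]*v[0] + A[i][1]*v[1] + A[i][2]*v[2];
}

void bradford_adapt(const double src_white[3], const double dst_white[3],
                    const double xyz_in[3], double xyz_out[3])
{
    double cs[3], cd[3], c[3], scaled[3];
    mul3(M, src_white, cs);      /* source white in "cone" space      */
    mul3(M, dst_white, cd);      /* destination white in "cone" space */
    mul3(M, xyz_in, c);          /* sample in "cone" space            */
    scaled[0] = c[0] * cd[0] / cs[0];
    scaled[1] = c[1] * cd[1] / cs[1];
    scaled[2] = c[2] * cd[2] / cs[2];
    mul3(Minv, scaled, xyz_out); /* back to XYZ */
}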

Posted by: ugordan Sep 21 2006, 07:46 AM

Thanks, Slinted! I'll certainly have a look at the Bradford method and compare the results, even though its conversion matrix seems very similar to the XYZ scaling. The planetary targets are mostly dull, so I figure the different methods won't produce radically different results. I already tried the normal XYZ CMFs, but got weird results whether or not I removed the solar spectrum from the data. My sort of ground-truth check for color is Enceladus, as well as Mimas and Dione. Enceladus is basically all white, and if I don't get 1.0/1.0/1.0 as RGB intensities, I assume the result is bogus. Currently it fits neatly, with all three coming out nice and neutral, and Dione's dark-stained side ever-so-slightly greenish, which as I understand is correct.
What I don't understand, though, is how the heck neutral Enceladus turns white with my current code and the Moon doesn't. If they both have a flat spectrum, they should both turn out the same color. Mimas did; it's gray, and though it's substantially darker than Enceladus, that made no difference to the code.

The Moon result reminds me of shots like http://photojournal.jpl.nasa.gov/catalog/PIA00342, http://161.115.184.211/teague/apollo/AS15-88-12004.jpg and http://161.115.184.211/teague/apollo/AS15-88-12011.jpg for example (last two ripped off from http://www.apolloarchive.com). I haven't looked further, but there are other similar ones as well.

Is it even remotely possible that the Moon is pale brownish, and that it's the great brightness difference between it and the dark sky that makes it appear gray and washed out to us on Earth? Don Mitchell's http://photos1.blogger.com/blogger/1792/1563/1600/Palette.jpg would certainly suggest so.

Another image, which is more of a demonstration of color than a real image. For all practical purposes VIMS couldn't resolve the Galilean satellites; they were merely pixels when viewed in normal resolution mode. The spectra are there, however. This composite of Io, Europa and Ganymede is very, very, very enlarged and is only meant to show the color. The cubes were taken at about 10 million km.



Note how yellowish Io turned out, compared to Europa and Ganymede. This is in good agreement with the "true" color of Io, unlike those enhanced pizza shots Galileo and the Voyagers took. Europa and Ganymede turned out colorless, Ganymede perhaps in part due to the brightness scaling applied here. The image doesn't show the relative brightnesses of the moons; I scaled each one to maximum. Note that the VIMS visual channel exhibits a kind of chromatic aberration where the spectral channels don't all project perfectly onto the same pixel. This can cause some color shifting at point-like sources like these (in fact, it's noticeable), so the end result is just illustrative. Nevertheless, it's nice to see Io standing out even at this distance.

Posted by: DonPMitchell Sep 21 2006, 12:55 PM

It is true that Judd and Vos made improvements to the color matching functions. But I don't believe those changes are part of the XYZ color values talked about in television and sRGB standards. So I think you want to integrate against the CIE 1931 matching functions, not the Judd or Vos versions.

Why is there a white-point correction? Are you calculating an image of the reflectance function, or an image of what would be seen by an observer in space?

Posted by: ugordan Sep 21 2006, 02:25 PM

QUOTE (DonPMitchell @ Sep 21 2006, 01:55 PM) *
Why is there a white-point correction? Are you calculating an image of the reflectance function, or an image of what would be seen by an observer in space?

As I was saying, using Young's matching code, uniform reflectance gives me converted RGB values of, let's say, 0.9, 0.8, 1.0 - somewhat bluish. I want that to be 1.0, 1.0, 1.0 so the surface turns out white (this of course gamma-corrected and scaled to 255). Hence the XYZ scaling. I'm trying to calculate what would be seen by an observer in space. What's the difference from the other approach anyway?

The caveat with Young's code, I think, is that it takes the reflectance spectrum of a material illuminated by Illuminant C. Hence, I have to divide the observed VIMS brightness spectrum by the solar spectrum. The code then gives me the appearance of the object under that illumination. As I say, it's an approximation of daylight on Earth, so an all-white material (such as Enceladus, effectively) will turn out bluish. That's where the Illuminant C -> D65 scaling comes in. It's not a magic fudge factor I made up, but comes from the table slinted also referenced. This correction gives me 1.0, 1.0, 1.0 for white materials.
This has worked well for the time being. I've been more focused lately on trying to remove those pesky lines than on redoing the color matching from scratch. In any case, a neater matching code is on my to-do list.

Posted by: DonPMitchell Sep 21 2006, 02:43 PM

Reflectance is independent of the illuminant. It should just be the ratio of reflected to incident spectral density.

sRGB is defined with a white point of D65. That means that sunlight should give you very close to RGB = 1.0, 1.0, 1.0. A flat spectrum would be illuminant E, but you don't want to make that your white point.

I don't think you should be doing any of that division by spectra and whitepoint transformation. The VIMS camera is giving you the information to calculate absolute spectral density values. Just project those onto the matching functions to get XYZ, and transform that to sRGB, and you're done. That will give you what a human observer would see in space.

Just for a sanity check, I ran some tests through my colorimetry code, to print out XYZ and RGB and chromaticity for some standard spectra:

CODE
Illuminant A:
XYZ = 24.9524822235107, 22.7151050567626, 22.7151050567626
RGB = 41.9157028198242, 18.7636756896972, 5.29991245269775
xy = 0.447566837072372, 0.407435506582260

Illuminant D65:
XYZ = 21.1444530487060, 22.2462940216064, 22.2462940216064
RGB = 22.2496452331542, 22.2456779479980, 22.2425174713134
xy = 0.312734514474868, 0.329031169414520

Solar Spectrum:
XYZ = 405.107116699218, 417.494018554687, 417.494018554687
RGB = 456.046386718749, 408.477874755859, 393.248840332031
xy = 0.323091745376586, 0.332970887422561

Illuminant E:
XYZ = 0.224980384111404, 0.224962189793586, 0.224962189793586
RGB = 0.271081715822219, 0.213312312960624, 0.204518526792526
xy = 0.333313554525375, 0.333286613225936


These seem to check out; at least the chromaticity values are right. Illuminant E and pure sunlight (in space) both come out just slightly reddish. D65 is a better approximation of sunlight at sea level.

I would still claim that "true color" means you don't get to fiddle with anything to make the image look subjectively "better". You can't change the white point or gamma or try to make Saturn look like it is being lit by a giant 6500 K tungsten lamp instead of the sun. You have to just process the spectrum into XYZ and then into sRGB and show that to people. And it's up to them to make sure their monitor is sRGB compatible; that's not our problem.

Posted by: JRehling Sep 21 2006, 06:06 PM

QUOTE (ugordan @ Sep 21 2006, 12:46 AM) *
Is it even remotely possible the Moon is pale brownish and it's because of great brightness difference between it and the dark sky that it appears gray and washed out to us on Earth?


Bingo.

Time for a lot of hedges, though. Color is not an objective property the way mass is. Color is something we perceive, and the perceived color of an object most definitely depends, as you suggest, on the level of illumination. It all comes down to the response of the three types of cones plus the one type of rod, and those are nonlinear with respect to level of illumination.

In dim light, the eye perceives no color because the rods respond but no cones do. At moderate-to-bright levels of lighting, the cones will respond according to the rough R/G/B reflectance of a surface. But at brighter levels, saturation can take place. A somewhat greenish object will look increasingly pale-to-white as illumination increases to the point that the "green" cones are responding at full capacity, but the reds and blues start to catch up. Now to retract some of that -- it's not just the cones, but the contextually-influenced color sensory perception a couple of layers deeper than the retina. So the saturation effect can occur at moderate levels of illumination if the eye is otherwise dark-adapted.

If you get the opportunity to see the Moon in a daylit sky with low humidity (you may not live in an area where that ever really happens) and there happens to be a cumulus cloud close enough to compare it with the Moon, you'll see: The Moon is kind of brownish. To your eye, in those conditions. Having your eye be light-adapted makes all the difference.

The thing is, it doesn't make sense to say the Moon really "is" the way it appears when your eye is dark adapted or the way it appears when your eye is light adapted. That's your eye's baggage, and it has nothing to do with the Moon. There really isn't any such thing as true color, when it comes right down to it. It's necessarily subjective and contextual.

Posted by: DonPMitchell Sep 21 2006, 07:03 PM


There's a brown Moon for you, taken by Galileo. I doubt if the image was created with any fancy colorimetric processing, but it is amusing.

Color perception is affected by viewing conditions. Among other things, your brain does a sort of automatic white-point compensation -- adaptation. Everyone has noticed how bad color photographs look when they are taken under indoor lighting, but to our eye, the lighting never seemed that bad.

For looking at dyes and fabrics or doing graphic arts, people buy expensive precision lighting, typically D50 lamps that meet ISO standards. Monitors generate their own light, so it's not as critical to have D50 lamps, except for the secondary effect of your visual system's color adaptation. Here is a nice article I found by an artist about how he set up his workspace: http://www.kevinmillsphoto.com/Articles/OfficeLightingArticle.html.

However, sRGB is well defined. If you are synthesizing an image or if you have spectral sensor data, then there is one unique, standard thing to do: calculate 24-bit sRGB. All the messy stuff happens when people view it, but that is someone else's problem.

Posted by: ugordan Sep 21 2006, 07:16 PM

JRehling, those are all valid, good points. I said from the beginning true color is subjective. Heck, I even put "true" in quotes to emphasize that. One can still try, though...

Right, so here are a couple of checks using the regular CIE XYZ 1931 2-deg color matching functions. If I pass them a unity spectrum, the result is xy = 0.333, 0.333. The sRGB color comes out slightly reddish, as you suggested. That checks out with your result.
However, if I pass the solar spectrum (derived from the Cassini ISS calibration volume) I get xy = 0.3496, 0.3526. The sRGB comes out fairly reddish. This obviously differs from your xy = 0.32309, 0.33297; why that is, I'm still trying to figure out. Even if it did match your result, you say it would still turn out reddish. Obviously, if I use this function on Enceladus, it too turns reddish. How accurate is that, especially considering that the eye would adapt to the reddish color of the sun's illumination and perceive Enceladus as white? The image on the screen will remain reddish, though.

Are you saying I should drop 255,255,255 as being white but settle for 255,253,248 (example) and hope the calibrated monitor setting will turn that into visually apparent white?

Posted by: slinted Sep 21 2006, 08:52 PM

Good points made by all,

Part of my confusion on this might be coming from a misunderstanding about what the dataset is providing. Are the VIMS cubes in spectral power, or reflectance? (I apologize if I'm using these terms incorrectly.) Has the illuminant spectrum already been removed?

If the VIMS data is in spectral power, then I would say that chromatic adaptation is still necessary since, as Don said, D65 isn't solar spectrum. It is what we see here on Earth.

If it is a reflectance dataset, then wouldn't that spectrum have to be multiplied by the spectrum of the D65 illuminant before being passed to the CIE color matching functions?

Posted by: ugordan Sep 21 2006, 09:14 PM

VIMS cubes, in their raw form, capture the brightness of the source. During calibration, it's possible (in the official code) to choose to divide by the solar spectrum so you get the reflectance (I don't know if I'm using the term correctly; I'd say you get the "intrinsic color" of the target). As Don said, if you leave the solar spectrum in and simply integrate the CMFs, you get a reddish white point, i.e. it wouldn't be 255,255,255 in RGB. Young's code worked on this intrinsic color and assumed Illuminant C (not the solar spectrum) illuminating it. He was, I think, interested in what color the materials would be here on Earth, so that's where C came from. That's why I had to convert to the D65 white point to get 255,255,255 white. I've now dropped his code. What I do now is still divide by the solar spectrum, multiply by the D65 spectrum (you could call this chromatic adaptation, but it's not a simple scaling as before -- a full spectrum is applied, so it should be more accurate) and integrate the CIE functions. This eliminates the need for the XYZ scaling afterwards. The results are virtually indistinguishable from the previous method; nevertheless, I'm more comfortable now as I understand what I've actually done:



Left side: solar spectrum divide -> Young's algorithm -> XYZ scaling
Right side: solar spectrum divide -> D65 spectrum multiply -> CIE functions
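The first two steps of the right-hand pipeline are just a per-wavelength rescaling, roughly this (a sketch; all three spectra are assumed to be sampled on the same wavelength grid, and the names are illustrative):

CODE
/* Take the measured spectrum out of solar illumination and "re-light"
 * it with D65 before the CIE integration. */
void relight_with_d65(const double measured[], const double solar[],
                      const double d65[], double out[], int n)
{
    int i;
    for (i = 0; i < n; i++)
        out[i] = measured[i] / solar[i] * d65[i];
}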

Posted by: JRehling Sep 21 2006, 10:32 PM

QUOTE (slinted @ Sep 21 2006, 01:52 PM) *
If the VIMS data is in spectral power, then I would say that chromatic adaptation is still necessary since, as Don said, D65 isn't solar spectrum. It is what we see here on Earth.

If it is a reflectance dataset, then wouldn't that spectrum have to be multiplied by the spectrum of the D65 illuminant before being passed to the CIE color matching functions?


It's worse than that. wink.gif

In most of these cases, monitors are inherently dimmer than the sunlit surfaces we're talking about, and the relative sensory-system response to the three primary colors can and does shift as you increase/decrease illumination. So let's say some body out there reflected twice as much green light as blue light, reflecting those only in narrow bands, and reflected no red at all. Having a display also emit twice as much green light as blue would not necessarily capture the color of that object; if the illumination is different, then a different ratio must be used to capture the ratio of color responses one would have seeing that object in daylight. An extreme example of this can cause the apparent relative brightness of two patches to depend on the illumination in which you view them -- even if the light sources both have a solar spectrum! Another related phenomenon is that it is often possible to see sun dogs or dim rainbows if you are wearing sunglasses, only to have the color disappear when you take the sunglasses off.

I computed once that Uranus (and bodies beyond that, for the most part) has a luminance that a computer monitor can match, but everything closer to the Sun than that is brighter than your monitor could display. So achieving the colors one would perceive from a spaceship window requires some trickiness.

Posted by: ugordan Sep 22 2006, 08:07 AM

QUOTE (JRehling @ Sep 21 2006, 11:32 PM) *
I computed once that Uranus (and bodies beyond that, for the most part) has a luminance that a computer monitor can match, but everything closer to the Sun than that is brighter than your monitor could display.

Now that would be something... getting a totally dark room around you with an image of Neptune on the screen that is precisely as bright as the real thing and your eye is accustomed to the low light. I wonder how dim that'd appear. It would be difficult to calibrate, though. I imagine you'd need one of those Light Intensity Measuring Thingies (LIMTs).

There are CRT monitors nowadays that have a sort of "Magic Bright" function that kicks up the brightness substantially; I imagine accurate luminances could be reached even for Saturn, though I'm not so sure about Enceladus and Tethys, them being so high-albedo and all.

Posted by: ugordan Sep 22 2006, 06:41 PM

I can understand the reasoning for not touching the illumination, i.e. not removing the solar spectrum from the cubes. The problem with that is the solar spectrum comes out reddish, so everything has a distinct red hue. Here are a couple of comparisons; the left images show no spectrum messing, just simple integration through the CIE XYZ functions.




I suppose if you stared long enough at scenes like this, the eye/brain would automatically adapt to the color and make a new white-point, making Enceladus (uppermost left image) appear white again. The problem is, this is not gonna happen on a computer screen.

You be the judge of what looks more reasonable.

Posted by: DonPMitchell Sep 22 2006, 07:06 PM

QUOTE (ugordan @ Sep 22 2006, 11:41 AM) *
You be the judge of what looks more reasonable.


That doesn't look right to me. I think something is wrong somewhere in the processing. You've got me interested now, so let me drag a VIMS image into my image library and see if I can reproduce this effect.

Posted by: ugordan Sep 22 2006, 07:45 PM

Be my guest. For starters, here's a dump of the 64x64 Jupiter cube. The wavelengths are stacked top to bottom; there are 95 of them, in 5 nm increments starting at 360 nm. They were linearly interpolated from the VIMS cube, as it has discrete wavelengths spaced 7-8 nm apart on average. The dark background was removed (actually, that was done onboard), the flatfield applied, and the detector performance corrected for. The samples are linear.


This looks weird as a thumbnail, being a 64*6080 image biggrin.gif

Posted by: tedstryk Sep 22 2006, 09:12 PM

For Jupiter and Saturn, I am biased toward the right hand images based on experiences with a telescope, but this brings in a whole new host of visual issues...

Posted by: ugordan Sep 25 2006, 08:02 PM

I've been experimenting with cleaning up the VIMS visual channel a bit. I found a cube that captures the night side of Titan with the entire 64x64 hi-res field of view, which served as an augmented dark background to remove the pesky noise. It turned out better than I hoped. Unfortunately, the noise doesn't seem to be as static as I thought: most of it is removed, but some isn't.

To illustrate just what a mess the VIMS cubes are (including their calibration data), here are a couple of before-and-after images:



The bottom image is one of the highest-resolution views of Iapetus by VIMS. The noise is disastrous. You can actually see it's mostly coincident with the ring image above.
My dark current removal model assumes the noise is linear with exposure; this seems to work well. The background will still need some touching up.

I emailed one of the guys responsible for the PDS volumes; he said he'll look into it and get back to me. We shall see, though I'm not getting my hopes up.

Posted by: ugordan Sep 26 2006, 07:33 AM

Sure enough, I got a response from the VIMS team. They had this to say:

QUOTE
New dark models have not been generated with regularity and users of the VIS science data have been doing corrections after the calibration pipeline by looking at average values along columns in data and using these values to correct their data.

That doesn't sound like a very exact science, though.
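Still, the column-average correction they describe is easy enough to try; it would look roughly like this (my sketch, with the choice of which rows count as "dark sky" left entirely to the user):

CODE
/* For each sample column of a band, average rows believed to be dark sky
 * and subtract that average from the whole column. */
void destripe_band(double *band, int width, int height,
                   const int *sky_rows, int n_sky_rows)
{
    int x, y, k;
    for (x = 0; x < width; x++) {
        double mean = 0.0;
        for (k = 0; k < n_sky_rows; k++)
            mean += band[sky_rows[k] * width + x];
        mean /= (double)n_sky_rows;
        for (y = 0; y < height; y++)
            band[y * width + x] -= mean;
    }
}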

Posted by: edstrick Sep 26 2006, 09:49 AM

It probably boils down to too-much-to-do, not-enough-techies, not-enough-grad-students...

The sort of thing that pennypinching does to a live mission. They'll have time for a systematic beginning-to-end calibration effort when the mission is over.

Posted by: ugordan Sep 26 2006, 02:27 PM

To reiterate what you said: when I asked whether updated calibration data would be provided, he said that will happen at the end of the Cassini mission. An earlier update may be possible, but is not likely.

Guess that means I'm stuck with homebrewed dark current models <sigh>...

Posted by: ugordan Oct 4 2006, 05:55 PM

Here are a few quick results from the new PDS release. Not that much interesting stuff in there this time. These don't have the sRGB-correct gamma of 2.2 because it just looks too washed out to me and it really brings out the nasty vertical noise. I chose a gamma of 1.33 instead. Magnified 2x.

The first image is coincident with an http://static.flickr.com/96/256542004_3b202e1c5e_o.jpg. The 3rd image from the left shows the F ring; the 4th shows Titan's north pole at high phase. The Saturn mosaic has one of the moons visible at the bottom - might be Dione/Tethys/Rhea, I haven't checked. The last image was taken during that Titan flyby where we had Saturn emerging from behind Titan's limb. You can barely see part of Saturn's crescent there; too bad the view didn't capture more of it.

Posted by: Indian3000 Oct 11 2006, 11:53 AM

First test processing VIMS cubes (gamma 2.2):


v1509053994_1

color



thumbnails




v1509133941_1

color


thumbnails


v1509134411_1

color



thumbnails


And a small screenshot of my software (perhaps a release in some days... or weeks smile.gif )


Posted by: elakdawalla Feb 18 2011, 10:20 PM

Hey folks, the Rings Node has added preview images for Cassini VIMS cubes to their http://pds-rings.seti.org/search/. The attached screen cap shows what these look like for a colorful object, Iapetus. http://pds-rings.seti.org/cassini/vims/COVIMS_previews.txt This will make it much easier to browse for good VIMS data!

 

Posted by: ngunn Apr 28 2011, 09:27 PM

A superb piece of work at TPS but we have to have it here too.

http://www.planetary.org/image/2383059379_c56b0272cb_o.png

Believable true colour is a great gift. Thank you Gordan.
