Full Version: Processing VIMS cubes
Unmanned Spaceflight.com > Outer Solar System > Saturn > Cassini Huygens > Cassini PDS
ugordan
Right, a suggestion I made in another topic made me wonder why not try it myself. A bunch of data was sitting on the PDS, after all. After a hassle figuring out just how the image cubes are organized and how to read them, I was finally able to produce some results. This is all very rough, first-iteration work and not particularly accurate.
Basically, I used the cubes to extract the visible spectrum in the 380-780 nanometer range, which I then fed into color matching code I found here, by Andrew T. Young.
The code integrates over 40 10-nm steps to produce CIE XYZ color components, which I then convert to RGB values.
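That integration amounts to a weighted sum over the sampled spectrum; here is a minimal sketch in C with hypothetical names (the 40-sample spectrum and the colour-matching tables are assumed to be supplied by the cube-reading and colour-matching code):

```c
#include <stddef.h>

/* Riemann-sum a sampled spectrum against colour-matching functions.
 * spectrum, xbar, ybar, zbar: n samples on a uniform wavelength grid.
 * All names here are illustrative, not ugordan's actual code. */
static void spectrum_to_xyz(const double *spectrum,
                            const double *xbar, const double *ybar,
                            const double *zbar, size_t n,
                            double step_nm, double xyz[3])
{
    size_t i;
    xyz[0] = xyz[1] = xyz[2] = 0.0;
    for (i = 0; i < n; i++) {
        xyz[0] += spectrum[i] * xbar[i] * step_nm;   /* X */
        xyz[1] += spectrum[i] * ybar[i] * step_nm;   /* Y */
        xyz[2] += spectrum[i] * zbar[i] * step_nm;   /* Z */
    }
}
```

With 40 samples at a 10 nm step this covers the 380-780 nm range described above.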

I'm aware of at least three inaccuracies in my code so far. One: the above-mentioned code apparently uses Illuminant C as the light source, not a true solar spectrum, so the color turns out bluish (it has a color temperature of about 9300 K instead of 6500 K, AFAIK). For the moment I compensate by changing the final RGB white balance, but this is probably an inaccurate way to go. Two: I don't do bias removal from the cubes, which likely affects the outcome. Three: I don't use the precise wavelengths the code requires, but the closest ones available in the cube; I intend to fix this by interpolating between the nearest wavelengths.

All images are enlarged 4x.

[three attachments]
The leftmost image is a 4-cube mosaic. The colors in all four frames turned out identical, which gives me at least some confidence. The middle image shows Dione's disc creeping in front of Saturn; the disc appears elongated probably because, as the lines were read out, the moon moved considerably in its orbit. The rightmost image shows a badly overexposed Saturn, with the part below the ring shadows blown out. From what I've seen browsing the PDS, a lot of the cubes are badly overexposed at some wavelengths.

Here are a couple of Jupiter images. I'm not very satisfied with them as they seem to look somewhat greenish, but overall the color looks believable:
[two attachments]

Lastly, two Titan composites. They turned out way more reddish than I thought they would.
[two attachments]

It'll be interesting to see how much the results change once I get a more proper processing pipeline working.
Malmer
It's a really great "reality check" for the ordinary ISS images.
dilo
Very nice work, gordan... now we see the colors!
Only one observation: in the second 4-cube image, you say that dark, elongated feature in front of Saturn is Dione's disc. To me, it is evident we are seeing Dione's shadow projected on Saturn's atmosphere: this would easily explain why it is so dark and has an elongated/deformed shape... don't you agree?
ugordan
QUOTE (dilo @ Sep 12 2006, 11:21 PM) *
To me, it is evident we are seeing Dione's shadow projected on Saturn's atmosphere: this would easily explain why it is so dark and has an elongated/deformed shape... don't you agree?

VIMS cube : v1492290669_1.qub, time : 2005-04-15 20:44:12 thru 2005-04-15 20:49:59

Solar System Simulator view here.
There is no way any but the closest ring moons could have cast a shadow onto Saturn, it was still far from autumn equinox at the time.
BTW, all images except the first one are single cubes with a 64x64 pixel spatial resolution.
ugordan
QUOTE (Malmer @ Sep 12 2006, 10:48 PM) *
its a really great "realitycheck" for the ordinary ISS images.

Indeed. I was surprised how close to the official releases the Saturn and Jupiter images turned out. Frankly, I was a bit skeptical about the color balance in some CICLOPS releases, but these images (if they turn out more or less correct) prove me wrong. Some images do look strangely color-balanced, such as this recent Jupiter map and especially the earlier Jupiter flyby releases, but this rendered portrait is probably the closest to the left Jupiter VIMS image, especially considering I increased the gamma in my images, which reduced contrast and washed out the colors a bit while making them, IMHO, a bit more realistic.

Saturn also turned out remarkably similar to for example this composite and this higher phase view, again only differing in saturation and brightness. Then again, there's this image...

The most notable exception I can see is Titan which turns out much more yellow than red in ISS releases.
CAP-Team
It looks as if the colors match Voyager's a lot more closely. The images of Titan look like the ones Voyager took.
Malmer
QUOTE (ugordan @ Sep 10 2006, 09:51 PM) *
I'm aware of at least three inaccuracies in my code as of yet: one is the above sampled code apparently uses Illuminant C as the light source, not true solar spectra so the color turns out bluish [...]


Another question you might ask yourself is what reference white one should use. Should it be a solar spectrum as seen from Earth through Earth's atmosphere (D65), or an unfiltered solar spectrum (almost a blackbody at 5780 K)? Check out: http://casa.colorado.edu/~ajsh/colour/Tspectrum.html

/Mattias
Malmer
QUOTE (ugordan @ Sep 13 2006, 10:56 AM) *
Indeed. I was surprised how close to official releases the Saturn and Jupiter images turned out. [...]



It would be cool to use VIMS to "teach" compositing software what to do with, for example, 3 ISS channels at different wavelengths. Sort of like what Daniel Crotty is doing for the MER rovers (only with different "lessons" for each body in the Saturnian system).

That way one could get extremely "real" colors from relatively sparse ISS spectral data.

/Mattias
dilo
QUOTE (ugordan @ Sep 13 2006, 06:56 AM) *
There is no way any but the closest ring moons could have cast a shadow onto Saturn, it was still far from autumn equinox at the time.

Thanks ugordan, I didn't consider the inclination of Saturn's equator...
By the way, your gallery is really stunning, great work!
Here is a little enhancement of one of those pictures; it is not intended to be realistic, and I cannot compete with your colorimetric precision. My only objective was to amplify the faint Jupiter atmospheric details and Europa's surface features. Hope you enjoy it...
DonPMitchell
How do you convert XYZ to RGB? Are you making sRGB? Here's the matrix for that transform, just in case:

CODE
ML_Matrix3x3 ML_RGBtoXYZ( 0.412410914897918,   0.357584565877914, 0.180453807115554,
                          0.212649390101432,   0.715169131755828, 0.0721815153956413,
                          0.0193317625671625,  0.119194857776165, 0.950390040874481);
ML_Matrix3x3 ML_XYZtoRGB( 3.24081206321716,   -1.53730821609497, -0.498586475849151,
                         -0.969243109226226,   1.87596642971038,  0.0415550582110881,
                          0.0556384064257144, -0.204007476568222, 1.05712962150573);
ugordan
Yes, I'm using that XYZ-to-RGB matrix. I also had to apply scaling factors to convert Illuminant C to D65 (X*0.969459, Y unchanged, Z*0.922050) prior to the RGB conversion; only then did I get white for a uniform reflectance. The RGB output is linear; I manually adjusted gamma to 1.25 in Photoshop.
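Put together, the white-point scaling and the matrix from DonPMitchell's post amount to something like this sketch (the function name is mine, not from the actual pipeline):

```c
/* Scale Illuminant-C-referenced XYZ toward D65 (factors from ugordan's
 * post), then apply the sRGB XYZ->RGB matrix from DonPMitchell's post.
 * Output is linear RGB; gamma encoding comes later. */
static void xyz_c_to_rgb(const double xyz[3], double rgb[3])
{
    static const double m[3][3] = {
        { 3.24081206321716,   -1.53730821609497, -0.498586475849151 },
        {-0.969243109226226,   1.87596642971038,  0.0415550582110881},
        { 0.0556384064257144, -0.204007476568222, 1.05712962150573  }
    };
    double x = xyz[0] * 0.969459;   /* Illuminant C -> D65 */
    double y = xyz[1];
    double z = xyz[2] * 0.922050;
    int i;
    for (i = 0; i < 3; i++)
        rgb[i] = m[i][0] * x + m[i][1] * y + m[i][2] * z;
}
```

Feeding in the Illuminant C white point should land close to R = G = B = 1, which is the "white for uniform reflectance" check described above.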

I think I have bias subtraction working now (I'm still unsure if it's correct), but flatfielding still seems suspicious. There are cubes with excessive vertical banding that the flat fields don't seem to touch. I'm also using a 40-point solar spectrum derived from the ISS calibration code; somehow, simply dividing by the provided solar spectrum cube gives a yellowish result, when in fact it should come out roughly the same. I'm now interpolating the discrete spectral wavelengths to get a nice 380-780 nm range.

Here are the most recent results, this time enlarged 3x:

Mimas, Dione, Rhea, Hyperion, Iapetus


Jupiter still seems too greenish somehow. The rings definitely appear to have a distinct yellow hue to them, unlike the colorful, bluish ISS releases.
DonPMitchell
Very good. Gamma 2.2 is technically what should be done, to make the image we see on the monitor proportional to actual scene radiance. What is the camera's radiometric response function like? Are the pixel values in the VIMS file logarithmic?

As for dark current, making the background 0 (black) should be right. But check the camera response: subtracting background before or after linearization of the pixel values gives a totally different result if the camera response is nonlinear (and it probably is).

I'm not sure about the Illuminant C business; I'm not sure I understand what you did. The sRGB matrix assumes a white point of D65, which all new monitors are supposed to use. Don't think about correcting the color on the camera end, just concentrate on linearizing its response. Just plug those spectral radiance values into the XYZ integrals, convert to RGB by matrix multiply, and apply a gamma of 2.2. And then you should see what the camera saw!
ugordan
The pixels are linear as far as I know, a 12 bit A/D converter is used. I think a precommanded bias value might be set to get the values out of the converter's low nonlinearity portion, or it might simply be a matter of subtracting dark current. Either way, I used the background algorithm found in the calibration package and then subtracted it.

As for Illuminant C: as I said, the integration code I used produces a bluish white if I pass it a uniform input reflectance spectrum. When I used the regular CIE XYZ color matching functions, the colors turned weird; that'll need looking into. At the moment, using the supplied 40-step integration tables and the Illuminant C -> D65 scaling factors found here got me an exact white for the test input. This can be double-checked with Mimas, Dione and Enceladus, which are all very gray and sort of provide "ground truth" for the white point.

Regarding gamma: if I set it to 2.2, the colors are all washed out and all contrast is lost. Somehow that doesn't seem right? Even with a 1.25 gamma, Hyperion's reddish tint is pretty much indiscernible.
Malmer
QUOTE (DonPMitchell @ Sep 14 2006, 06:07 AM) *
Very good. Gamma 2.2 is technically what should be done, to make the image we see on the monitor proportional to actual scene radiance. What is the camera's radiometric response function like? Are the pixel values in the VIMS file logarithmic?



Correct me if I'm wrong, but sRGB does not use a 2.2 gamma: it uses its own custom curve that is reasonably close to gamma 2.2...

[attachment: transfer-curve plot]

The red curve is gamma 2.2,
the green is sRGB's "gamma" curve,
the blue is just linear.

The difference is mostly visible in the dark regions.

/M
Malmer
The pixels in the ISS images are not saved linearly (the sensor is reasonably linear, but they don't save the data that way); you have to use a custom lookup table to linearize those images. Maybe that's why you get strange results with the VIMS too. It looks to me as if you are treating the raw data as linear, and I don't think it's saved that way; I think they save it almost logarithmically to increase dynamic range. That's why you get an image that looks as though it has already had a gamma correction applied to it. (I might be wrong here...)

Here is the ISS data conversion table:
http://pds-rings.seti.org/cassini/iss/COIS...ppf/12-8-12.pdf
I'm not sure if they use something similar on the VIMS data, but my guess is that they do.

ISS calibration report:
http://pds-rings.seti.org/cassini/iss/COIS...OCUMENT/REPORT/


info and software for calibrating VIMS:
http://pds-rings.seti.org/cassini/vims/

/M
ugordan
I am very well aware of the ISS data specifics. While it is mostly the case that a LUT was used to convert linear (more or less, barring uneven bit weighting) 12-bit DNs to 8-bit, approximately square-root-encoded numbers, it is not always the case as you suggest: sometimes the full 12 bits were returned. The third case is a 12->8 bit conversion that returns only the lower 8 bits, which is also linear. In any case, my ISS code takes care of all that.

However, I found absolutely no mention anywhere of lookup tables used in the VIMS instrument; in fact, only mentions of lossless 12-bit encoders. The calibration code I checked doesn't use such conversions anywhere. It would be logical to save via a square-root LUT if they were downsampling the data, but everything points to that not being the case.
That the raw data is linear can be seen from the calibration steps used in the provided code:

1. subtract dark background
2. divide by flatfield
3. divide by solar spectrum
4. multiply by detector performance as function of wavelength
5. convert into radiometrically correct units (optional)

Steps 1-4 should already yield accurate colors since I'm not interested in exact units.
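The steps above, as a per-sample sketch (the variable names are mine; the real code reads these factors from the PDS calibration files):

```c
/* Apply calibration steps 1-4 to a single raw sample.
 * raw_dn: raw data number; the remaining arguments are the
 * per-pixel/per-wavelength calibration factors. */
static double calibrate_sample(double raw_dn, double dark,
                               double flatfield, double solar,
                               double detector_response)
{
    double v = raw_dn - dark;   /* 1. subtract dark background        */
    v /= flatfield;             /* 2. divide by flatfield             */
    v /= solar;                 /* 3. divide by solar spectrum        */
    v *= detector_response;     /* 4. detector performance vs wavelength */
    return v;                   /* step 5 (absolute units) omitted    */
}
```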
DonPMitchell
QUOTE (Malmer @ Sep 14 2006, 02:30 AM) *
Correct me if I'm wrong, but sRGB does not use a 2.2 gamma: it uses its own custom curve that is reasonably close to gamma 2.2... [...]


I asked Gary Starkweather about this a few years ago, because the documentation talks about a gamma of 2.2 and then also describes a piecewise function. He told me the piecewise function was introduced just because there is a way to calculate it that is much faster than pow(x, 2.2). He claimed that a gamma of 2.2 is what is intended. Just now, I found this page which seems to say the same thing.
Malmer
I think the strange piecewise function has its benefits: it gives slightly better coverage in the black regions when handling 8-bit images...
Malmer
QUOTE (ugordan @ Sep 14 2006, 12:47 PM) *
I am very well aware of the ISS data specifics. [...]



I stand corrected.

I guess I should have been a bit clearer about the fact that I was speculating...

While I'm at it, here is another speculation that might or might not have scientific grounds:

One reason the images could look washed out at gamma 2.2 is that the solar intensity at Saturn is only 1% of the intensity here on Earth. So maybe the images on your screen are actually brighter than the real thing, and since the eye has a logarithmic response, the colors wash out.

So maybe you should either calibrate the images to the actual intensity of your screen, or just use an arbitrary gamma that looks good. I would go for the "looks good" option.



Speculatively yours
/Mattias
ugordan
QUOTE (Malmer @ Sep 15 2006, 01:33 PM) *
I think the strange piecewise function has its benefits. It gives slightly better coverage in the black regions... when handling 8-bit images...

If you're referring to the linear portion close to 0, it's supposedly there to suppress low-DN noise that would otherwise be apparent. As for different curves for different channels, I don't know anything about that; I assume that would totally trash the colors on all but precisely calibrated displays, and there surely aren't many of those.
ugordan
QUOTE (Malmer @ Sep 15 2006, 01:50 PM) *
One reason that could make the images look washed out in gamma 2.2 is that the solar intensity at Saturn is only 1% of the intensity here on Earth. So maybe the images on your screen are actually brighter than the real thing, and since the eye has a logarithmic response, the colors wash out.

This could very well be true. But using the absolute brightness as seen here on Earth wouldn't be very useful, as everything would turn out pretty dim on a computer screen. It would be interesting, though, to know how the apparent brightness would change once the eye got accustomed to the low light levels at Saturn (if you were there, for example).
I guess we can settle on this whole matter being too subjective to quantify in a scientifically meaningful way. My primary goal was to get the "correct" hue of objects, principally Saturn's disc, the rings (more importantly, as I was wondering whether they're as colorful as ISS would lead one to believe), Titan, Iapetus and Hyperion; pretty much everything else there is gray anyway. Brightness was of lesser importance, as that sort of thing is a hard beast to accurately "tame".
Malmer
It would be fun to paint spheres in the colors you have derived, put them in a dark room with a dim light, and just wait and see... smile.gif

/M
DonPMitchell
One concept is to make an image that resembles what would be seen by a human being, there in space, looking through a window. As long as the colors are inside the gamut of your display, it is theoretically possible to achieve this:

1. Convert 12-bit VIMS values to linear radiance.
2. Correct for dark current
3. Apply gain flatfield correction
4. Apply spectral-sensitivity gain correction
5. Integrate against CIE observer functions to get XYZ
6. Convert XYZ to RGB
7. Apply global gain adjustment for desired image contrast.
8. Encode pixel channels, for example, iRed = int(pow(R, 1.0/2.2) * 255 + 0.5)

Gamma has a big effect on the color, so if 2.2 looks drastically wrong, I would double-check the process. You don't want to do everything carefully and then fudge gamma in Photoshop. I bet there is just a small problem somewhere, and if you find it, the image will come out beautifully.
Malmer
QUOTE (DonPMitchell @ Sep 16 2006, 03:56 AM) *
Gamma has a big effect on the color, so if 2.2 looks drastically wrong, I would double check the process. You don't want to do everything carefully and then fudge gamma in photoshop. I bet there is just a small problem somewhere, and if you find it, the image will come out beautifully.


Sometimes when I do gamma correction I convert to HLS, apply gamma to the L channel, and convert back to RGB; that way the colors don't desaturate as much... Maybe it's not the most colorimetrically correct process, but it looks better sometimes...

/M
ugordan
A mosaic of Saturn's northern latitudes:


Appearance of Jupiter and Saturn at similar phase angles:


Two shots of the lit side of the rings:

The rings in the left image are clipped off; it's not Saturn's shadow. The narrow white ring is the F ring. The dark area in the middle image is Saturn's shadow.

Several narrow slices of the unlit side mosaicked together:


The unlit side appears less dull-brownish and more bluish, possibly because ice particles forward-scatter blue light more strongly.
john_s
Just checked out your gallery, ugordan- your processed Cassini images are truly glorious. Great work!
ugordan
QUOTE (john_s @ Sep 16 2006, 05:54 PM) *
Just checked out your gallery, ugordan- your processed Cassini images are truly glorious. Great work!

Thanks! After comparing the results with the VIMS cubes, which ought to be more accurate, I may have to change the way I process Saturn (and Titan) composites a bit. The results I've gotten so far seem consistently a bit greener than the stuff that turns out here. A channel mix here and there should do the trick. wink.gif
DonPMitchell
QUOTE (Malmer @ Sep 16 2006, 02:42 AM) *
Sometimes when I do gamma correction I convert to HLS, apply gamma to the L channel, and convert back to RGB; that way the colors don't desaturate as much... Maybe it's not the most colorimetrically correct process, but it looks better sometimes...

/M


That's a good idea. I'd use Lab coordinates, which I think separate "color" from "luminance" even better.
DonPMitchell
I also really like what you are doing. I hope you don't feel I am being discouraging. I am poking at the gamma issue, because I think you might have a bug someplace. Track it down, and then you will have rigorous "true color".
Malmer
QUOTE (DonPMitchell @ Sep 16 2006, 08:47 PM) *
That's a good idea. I'd use Lab coordinates, which I think separate "color" from "luminance" even better.



Yes, that's obviously much better, but I'm often a bit too lazy... Guess I have to shape up. In the light of what you and ugordan have done with Cassini and Venera, I feel that I really have to start from scratch with the stuff I've done.

I think these pictures deserve the very best processing that is humanly possible. I believe it is important to make pictures that are true to reality: if it looks "dull", it should stay that way. These pictures are humanity's only way of experiencing these places, and I don't think they should be enhanced or distorted to make them look more exotic.

keep it real!

/M
ugordan
QUOTE (DonPMitchell @ Sep 16 2006, 08:01 PM) *
I hope you don't feel I am being discouraging.

Not at all. There's generally just so much "mystique" about gamma that I'd rather not mess around with it.
Here's a literal implementation of a 1/2.2 power function in the code:

The above shows nicely why I'm not too fond of gamma manipulations: the terminator comes out too sharp (you don't get the feeling Jupiter is actually 3D), the colors and contrast are bleached, and it also brings out an ever-present non-dark background.
Hmm... or maybe that means I have to calibrate my monitor...

Here's that Saturn mosaic again:
[attachment]
DonPMitchell
QUOTE (ugordan @ Sep 16 2006, 02:37 PM) *
Here's a literal implementation of a 1/2.2 power function in the code:


The real-life scene is probably not as saturated with color as a lot of photos show. Your gamma-2.2 images don't look like anything is going wrong in the software at all; I thought perhaps you were seeing something way off.
ugordan
A couple of rough Saturn mosaics:
[three attachments]
The blob in the middle image is Tethys.

Ring mosaics:
[two attachments]
All images magnified 2x and gamma-corrected.
Malmer
I think it looks great. Very subtle beautiful colors.

Keep em coming!

/M
ugordan
A few Jupiter images; I fixed the greenish hue. It was due to my code subtracting the dark background when it had apparently already been subtracted from the Jupiter flyby cubes.


The middle and right images were taken not far apart; the rightmost is much bigger because it used a high-resolution mode developed in flight, which increases the spatial resolution 3x at the cost of slightly lowering the S/N ratio. The leftmost image is also hi-res, but was taken long before closest approach.

Outbound crescent, 2 cube mosaic:


A couple of Europa transits:

It's a shame the transit on the right didn't catch more area to the left of Jupiter's limb; there was a great grouping of Io and Ganymede there. The rightmost image is actually very close in time to an ISS narrow-angle sequence taken on January 2nd, 2001.

Again, all images magnified 2x and gamma-corrected.
Malmer
QUOTE (ugordan @ Sep 19 2006, 07:21 PM) *
A few Jupiter images, fixed the greenish hue. It was due to my code subtracting out dark background when apparently it was already subtracted from the Jupiter flyby cubes.


Just for the sake of inconsistency smile.gif



I think it looks great. Fantastic, actually. If I had to pick on something, I would mention the almost invisible residual pattern in the images: faint, thin horizontal and vertical lines. Some of them seem to show up in the same place in all the images (sometimes rotated 90 degrees, but I guess that's you rotating the images for display); others seem more random. Is this due to sensor sensitivity variation, and/or does it have to do with the scanning process of the instrument?

Do you write your image-processing software for some standard platform (IDL or something), or do you write totally standalone code?

/M
ugordan
The vertical stripes (the most noticeable is a blue line) in some darker shots... I've checked and double-checked the flatfields and the dark-background removal code, and none of them even touches that. There's a load of them in the raw data; they appear to be static, so those are probably hot pixels on the CCD. Since the cubes are read out one line at a time, in a push-broom mode with the spectrum of the line split along the vertical dimension, the same samples in all lines of a given spectral band suffer from the same problem. Hence the vertical lines. What I don't understand is why adequate flatfields/dark current models haven't been produced to deal with this. It only affects the visible channel; the IR channel seems fine (note the VIS and IR channels are actually two different physical parts of the instrument). So far, there's nothing I can do about it.

The code is an ad-hoc implementation in C; it's rudimentary and messy, as I progressively modified it from simply reading the cubes to doing more complex stuff like color matching. It doesn't even output a nice PNG color image, just a raw 64x64 pixel dump with 3 interleaved RGB channels biggrin.gif
Malmer
I guess you have to make an additional flatfield removal using a home-brewed flatfield. Try simply taking one row of an image of black space, scaling it to the width of the image, removing it from the data, and seeing what it looks like.



/M
tedstryk
Great work! As for the calibration data, it could be that they are still working on it; a lot of these things weren't worked out at the time of the Jupiter encounter. It might be worth contacting someone to find out.
ugordan
QUOTE (Malmer @ Sep 19 2006, 09:16 PM) *
I guess you have to make an additional flatfield removal using a homebrewn flatfield. Try simply takeing one row of an image of black space, scale to width of image and remove it from the data and see what it looks like.

I don't think it's a flatfield problem. Flatfields only correct for the varying sensitivity of different pixels, and you can see the lines even in dark space, which flatfields can't remove: remember, you divide or multiply the image by a flatfield, whereas this will obviously need subtracting out. It appears to be primarily a dark current problem, and I don't even know where to begin modelling dark current data. The provided dark current model uses a two-parameter table for each CCD pixel: one standalone part, and one variable part multiplied by exposure.
The model seems fine for normal-resolution cubes (though even there it doesn't completely remove everything), but the hi-res one seems buggy. Furthermore, the Jupiter images, where the dark was subtracted by the instrument itself, show no apparent signs of those nasty lines. Then again, this could be because the Jupiter images used a 4-times-shorter exposure than Saturn targets, thanks to the increased light levels there, so dark current didn't have as much time to build up.

A document on the PDS says new models are generated every month or so, but it looks to me like the same tables appear on all 8 VIMS DVDs. I mailed one of the guys responsible for the PDS volumes, so we'll see what they have to say.
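The two-term dark model described above boils down to, per pixel, dark = fixed offset + rate * exposure. A sketch of applying it to one detector line (variable names are illustrative; the real per-pixel tables come from the PDS calibration files):

```c
#include <stddef.h>

/* Subtract the two-parameter dark model from one line of samples:
 * per-pixel dark current = fixed_dn[i] + rate_dn_per_s[i] * exposure. */
static void subtract_dark(double *line, size_t n,
                          const double *fixed_dn,
                          const double *rate_dn_per_s,
                          double exposure_s)
{
    size_t i;
    for (i = 0; i < n; i++)
        line[i] -= fixed_dn[i] + rate_dn_per_s[i] * exposure_s;
}
```

The exposure-scaled term is why a single generic dark cube can't simply be subtracted from cubes taken with different exposure times.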
Malmer
It was a typo; I actually meant dark current. Mixed them up in my head for some reason. (That's why I suggested you use a bit of black space to serve as a DC remover.)
ugordan
Although the noise pattern stays fixed, it changes intensity; it's likely connected with exposure duration, so it's not as easy as subtracting a generic dark cube from all cubes. I don't think a small bit of dark space would do the trick either, as there's also a fair bit of cosmic ray noise induced; you would need to average many dark lines, and many of the dark-space cubes weren't even read out as a full 64-pixel-wide swath, so they're of limited usefulness. Additionally, the physical resolution of the CCD is AFAIK 192x96 pixels, with 192 being one spatial line and 96 the visible spectral channels, while all cubes have a maximum width of 64 spatial pixels. The normal mode of operation simply sums three pixels in a line to get 192/3 = 64 pixels. The high-res mode, however, doesn't sum the individual pixels, and that's where the 3x resolution increase comes from. This is where it gets a bit complicated: which 64 of the 192 pixels get read out depends on offset factors and is somewhat clumsy, so there's a fair margin for error if you don't know exactly what you're doing. The PDS document isn't all too clear on certain aspects.
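The normal-mode summing described above can be sketched as follows (ignoring the readout-offset business, and with illustrative names):

```c
#include <stddef.h>

/* Normal-mode spatial summing: collapse a 192-sample detector line
 * into 64 output samples by summing groups of three adjacent pixels.
 * Hi-res mode would instead return the samples unsummed. */
static void sum_line_normal(const double det[192], double out[64])
{
    size_t i;
    for (i = 0; i < 64; i++)
        out[i] = det[3 * i] + det[3 * i + 1] + det[3 * i + 2];
}
```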

Which brings me to the point: if this is as simple as we suggest, why didn't the VIMS folks do it in the first place? This noise problem must have been known at least since Saturn orbit insertion. IMHO, it's not up to us to fix the calibration and produce our own files for basic stuff like dark current removal.

Of course, all this assumes I haven't screwed up in my code somewhere, but I'm more and more certain that's not the case.
ugordan
Not particularly related to my processing, but here are a couple of past releases by the VIMS team: Jupiter results and Titan results. The Jupiter page shows the same image as my bottom second-from-the-right image. Their image was mirrored left-right; Cassini never saw Jupiter from that vantage point. Also, the dark spot on Jupiter's disc is not Io's shadow as they suggest, it's Europa. They used a simple 3-channel RGB composite, and it's notable that some vertical banding is present in their data as well. That is not the same noise we were discussing here, though; it's more likely an artifact of uneven scanning-mirror movement, with different lines unevenly exposed, hence the brightness gradient.
The Titan page I bring up because it does show the ugly vertical lines: note the false-color composite labeled "VIS channel". It seems they were unable to remove them either.
Malmer
QUOTE (ugordan @ Sep 20 2006, 11:22 AM) *
Which brings me to the point -- if this is as simple as we suggest, why didn't the VIMS folks do this in the first place? This noise problem must have been known at least since Saturn orbit insertion. IMHO, it's not up to us to fix the calibration and produce our own files that do basic stuff like dark current removal.



It is a quick and dirty trick. In theory, if you shoot an image of a planet and then one of black space (or with the lens cap on) with identical settings, the latter should be a perfect dark-current image. The problem is that it would be noisy, so you would need a few black images to average out the noise. The other problem is that Cassini would have to send back, say, 5 black images for each "real" image, leading to an extreme waste of downlink time and other issues like SSR space.

I guess that's why they make a mathematical model of the dark current instead.

I do think it's up to us to fix some of the calibration issues. The 2 Hz banding in the ISS images can be removed in lots of ways, though it's really hard to remove completely. One way is to use the edge pixels that were never exposed to light to create a dark-current image. Another is to fit some kind of sine wave to that data and create a dark-current image from the fit. Yet another is to use the pixel rows covering black space in an image to create a dark-current image, or fit a sine wave to that. Or combinations of these methods, or some Fourier filter, or some other magic.
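The edge-pixel method from the list above could look something like this sketch. The array shapes and the assumption that the first few columns are never exposed to light are made up for illustration:

```python
import numpy as np

def remove_banding(img, n_edge=4):
    """Estimate a per-line offset from unexposed edge columns and
    subtract it from every pixel in that line.

    img: 2-D array (lines, samples); the first n_edge columns are
    assumed never exposed to light, so they carry only the banding
    pattern plus dark current."""
    offsets = img[:, :n_edge].mean(axis=1)   # one offset per line
    return img - offsets[:, np.newaxis]

# Synthetic demo: a flat 50 DN scene with a sinusoidal line-to-line bias.
lines = np.arange(100)
banding = 5.0 * np.sin(2 * np.pi * lines / 10.0)
img = np.full((100, 64), 50.0) + banding[:, np.newaxis]
img[:, :4] = banding[:, np.newaxis]          # edge pixels see no light
clean = remove_banding(img)
```

Fitting a sine wave to the per-line offsets instead of using them raw would further suppress the shot noise in the edge pixels themselves.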

Maybe it's the same way with these VIMS lines: they might be really hard to model exactly, and it's up to the user to choose whichever method fits best.

/M
ugordan
Here are a couple of... hm... questionable(?) results.
First, a wholly unremarkable image -- three narrow strips taken during Cassini's second Venus flyby:

Their relative spacing isn't meaningful, it's just 3 cubes stacked together. Magnified 3x.
I don't know exactly where VIMS was pointed at the time, but my best guess from the PDS data is that it was practically at nadir, seeing near-equatorial latitudes. This was near closest approach; the distance was around 8800 km to Venus' center (it appeared huge in the FOV so no cloud details could be seen here -- the vertical scale was about 2 degrees of latitude on Venus) and the phase angle was about 90 degrees. That's why I presume the three images show the terminator at right and the dayside at left.
I was under the impression Venus was yellowish-white, but this turned out bluish. Note that VIMS was far from its normal operating temperatures, so its sensitivity may have been affected and the results could be way off. The exposure used was only 50 ms, as opposed to the typical 5000 ms and more at Saturn, so any calibration uncertainties would likely be exaggerated here.
I'd be interested to hear what Don thinks about this, being a Venus expert and all...

EDIT: Again, referring to an official Venus flyby page, they also got a bluish Venus, though using only simple RGB channels. That's also the reason their image turned out so noisy.

Second, a Moon image, super-resolution view of 4 cubes stacked together:

Again, I'm surprised by how the image turned out. I'd expect the Moon to come out gray, but this is more like that Mariner spacecraft composite of the Earth and Moon, or some of the Apollo shots. The image in the middle is a Solar System Simulator view, blurred to the same effective resolution; the rightmost image is the same simulated view, unblurred.
The VIMS composite was magnified 4 times. Yes, four! You can really see how VIMS traded off spatial for spectral resolution.
slinted
ugordan, first off, this is an amazing project you have undertaken! I'm really impressed with your results so far. It's a perfect use of the VIMS dataset, especially given the large number of vastly different targets.

As to the VIMS cube to color conversion, I think you may be better off starting fresh rather than using the conversion code by Andrew Young. Some of the hue problems you described might be explained by his weighting of the color matching functions with Illuminant C. Not only is it a bad approximation of direct sunlight, but more importantly, I'm fairly sure that using it as a weighting for the color matching functions (and your scaling of it to D65) changes the hues arbitrarily.

From my understanding of spectrum to XYZ conversion, the best way to handle the calculation is to integrate (Newton-Cotes would be a good method) your spectrum directly through the CIE Color Matching Functions ( http://www.cvrl.org/cmfs.htm , CIE 1931 2-deg modified by Judd (1951) and Vos (1978)). That will give you XYZ coordinates, appropriate to whatever you deem the source lighting to be. Then, in a separate step, you convert from source XYZ to destination XYZ (D65). XYZ scaling isn't the best way to handle this conversion since it changes the hues inappropriately and doesn't change the luminance at all. You might want to look into either Bradford or Von Kries as an alternate method.
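The integration step can be sketched as follows. Note the Gaussian "matching functions" here are crude placeholders standing in for the real tabulated CMFs, which an actual implementation would load from the cvrl.org tables; everything else about this snippet is illustrative too:

```python
import numpy as np

# Wavelength grid (nm) spanning the VIMS visible-channel range.
wl = np.arange(380.0, 781.0, 5.0)

def trapz(y, x):
    """Trapezoidal rule, the simplest closed Newton-Cotes formula."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Crude Gaussian stand-ins for the CIE 1931 xbar, ybar, zbar colour
# matching functions -- placeholders only, not the real tables.
xbar = 1.06 * gauss(wl, 600.0, 38.0) + 0.36 * gauss(wl, 442.0, 16.0)
ybar = 1.01 * gauss(wl, 557.0, 47.0)
zbar = 1.78 * gauss(wl, 449.0, 19.0)

def spectrum_to_xyz(spectrum):
    """Project a spectral radiance onto the matching functions."""
    return np.array([trapz(spectrum * cmf, wl)
                     for cmf in (xbar, ybar, zbar)])

# Sanity check with a flat (equal-energy, Illuminant-E-like) spectrum.
XYZ = spectrum_to_xyz(np.ones_like(wl))
xy = XYZ[:2] / XYZ.sum()
```

The interpolation issue ugordan mentioned earlier fits naturally here: resample the cube's band centers onto `wl` before calling `spectrum_to_xyz`.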

What you choose as a source white point is another matter entirely. It depends on what you want your true color to represent. Here are two scenarios:

1) An astronaut is sitting inside a ship, looking out the window and sees the scene. In this case, the viewer would be adapted to the internal lighting of the ship and the lighting of the scene would have little to no effect on the way it is perceived. If this is the goal, no chromatic adaptation is necessary. You can simply convert the original XYZ values directly to D65 according to the matrices listed earlier in this thread.

2) An astronaut is floating in space, with only the VIMS scene visible to them (or, in a similar scenario, a person is seeing the VIMS scene as if it were through the eyepiece of a very powerful telescope).

If you are aiming for #2, then you can use the Bradford or Von Kries method for adapting into D65 from whatever you deem the source white to be.
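A minimal sketch of the Bradford adaptation described above, using the standard Bradford matrix and Illuminant C and D65 white points (the function name is mine, not from any particular library):

```python
import numpy as np

# Standard Bradford cone-response matrix.
M_BFD = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

def bradford_adapt(xyz, white_src, white_dst):
    """Adapt an XYZ colour from a source to a destination white point:
    map XYZ into cone-like responses, scale each response by the ratio
    of the destination white to the source white, and map back."""
    scale = (M_BFD @ white_dst) / (M_BFD @ white_src)
    M = np.linalg.inv(M_BFD) @ np.diag(scale) @ M_BFD
    return M @ np.asarray(xyz, dtype=float)

# White points with Y normalised to 1: Illuminant C and D65.
white_C   = np.array([0.98074, 1.0, 1.18232])
white_D65 = np.array([0.95047, 1.0, 1.08883])

# Adapting the source white itself must land exactly on the target white.
adapted_white = bradford_adapt(white_C, white_C, white_D65)
```

Unlike plain XYZ scaling, the per-channel scaling here happens in the cone-response space, which is what preserves hues better.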
ugordan
Thanks, slinted! I'll certainly have a look at the Bradford method and compare the results, even though its conversion matrix seems very similar to that of XYZ scaling. The planetary targets are mostly dull, so I figure the different methods won't produce radically different results. I already tried with the normal XYZ CMFs, but got weird results whether or not I removed the solar spectrum from the data. My sort of ground-truth check for color is Enceladus, as well as Mimas and Dione. Enceladus is basically all white, and if I don't get 1.0/1.0/1.0 as RGB intensities I assume the result is bogus. Currently it fits neatly, with all three coming out nice and neutral, and Dione's dark-stained side ever-so-slightly greenish, which as I understand is correct.
What I don't understand, though, is how the heck neutral Enceladus turns white with my current code while the Moon doesn't. If they both have a flat spectrum, they should both turn out the same color. Mimas did: it's gray, and though it's substantially darker than Enceladus, that made no difference to the code.

The Moon result reminds me of shots like this, Apollo 15 and Apollo 15 #2 for example (last two ripped off from www.apolloarchive.com). I haven't looked further, but there are other similar ones as well.

Is it even remotely possible the Moon is pale brownish, and that it's the great brightness difference between it and the dark sky that makes it appear gray and washed out to us on Earth? Don Mitchell's palette would certainly suggest so.

Another image which is more a demonstration of color than a real image. For all practical purposes VIMS couldn't resolve the Galilean satellites; they were merely pixels when viewed in normal resolution mode. The spectrum is there, however. This composite of Io, Europa and Ganymede is very, very enlarged and is only meant to show the color. The cubes were taken at about 10 million km.
Click to view attachment

Note how yellowish Io turned out compared to Europa and Ganymede. This is in good agreement with the "true" color of Io, unlike those enhanced pizza shots Galileo and the Voyagers took. Europa and Ganymede came out colorless, Ganymede perhaps in part due to the brightness stretch applied here. The image doesn't show the relative brightnesses of the moons; I scaled each one to maximum. Note that the VIMS visual channel exhibits a kind of chromatic aberration where the spectral channels don't all project onto exactly the same pixel. This can cause some color shifting on point-like sources like these (in fact it's noticeable), so the end result is just illustrative. Nevertheless, it's nice to see Io standing out even at this distance.
DonPMitchell
It is true that Judd and Vos made improvements to the color matching functions, but I don't believe those changes are part of the XYZ color values used in the television and sRGB standards. So I think you want to integrate against the original CIE 1931 matching functions, not the Judd or Vos versions.

Why is there a white-point correction? Are you calculating an image of the reflectance function, or an image of what would be seen by an observer in space?
ugordan
QUOTE (DonPMitchell @ Sep 21 2006, 01:55 PM) *
Why is there a white-point correction? Are you calculating an image of the reflectance function, or an image of what would be seen by an observer in space?

As I was saying, using Young's matching code, a uniform reflectance gives me converted RGB values of, let's say, 0.9, 0.8, 1.0 -- somewhat bluish. I want that to be 1.0, 1.0, 1.0 so the surface turns out white (all of this gamma-corrected and scaled to 255), hence the XYZ scaling. I'm trying to calculate what would be seen by an observer in space. What's the difference from the other approach, anyway?

The caveat with Young's code, I think, is that it expects reflectance spectra of a material illuminated by Illuminant C. Hence, I have to divide the observed VIMS brightness spectra by the solar spectrum; the code then gives me the appearance of the object under that illumination. As I say, Illuminant C is an approximation of daylight on Earth, so an all-white material (such as Enceladus, effectively) will turn out bluish. That's where the Illuminant C -> D65 scaling comes in. It's not a magic fudge factor I made up, but comes from the table slinted also referenced. This correction gives me 1.0, 1.0, 1.0 for white materials.
This has worked well for the time being. Lately I've been more focused on trying to remove those pesky lines than on doing the color matching from scratch. In any case, a more proper matching code is on my todo list.
DonPMitchell
Reflectance is independent of the illuminant. It should just be the ratio of reflected to incident spectral density.

sRGB is defined with a white point of D65. That means that sunlight should give you very close to RGB = 1.0, 1.0, 1.0. A flat spectrum would be illuminant E, but you don't want to make that your white point.

I don't think you should be doing any of that division by spectra and whitepoint transformation. The VIMS camera is giving you the information to calculate absolute spectral density values. Just project those onto the matching functions to get XYZ, and transform that to sRGB, and you're done. That will give you what a human observer would see in space.
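The final XYZ-to-sRGB step of this recipe can be sketched with the standard sRGB matrix and transfer curve. This is a minimal illustration of the conversion, not anything from the VIMS pipeline:

```python
import numpy as np

# Linear XYZ (D65-relative) -> linear sRGB matrix from the sRGB spec.
M_XYZ2RGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def xyz_to_srgb(xyz):
    """Convert XYZ (D65-relative, Y in 0..1) to 8-bit sRGB, applying
    the piecewise sRGB transfer curve ("gamma")."""
    rgb = M_XYZ2RGB @ np.asarray(xyz, dtype=float)
    rgb = np.clip(rgb, 0.0, 1.0)
    encoded = np.where(rgb <= 0.0031308,
                       12.92 * rgb,
                       1.055 * rgb ** (1.0 / 2.4) - 0.055)
    return np.round(encoded * 255.0).astype(int)

# Sanity check: the D65 white point must map to (255, 255, 255),
# matching the point above that sunlight should come out near 1, 1, 1.
white_D65 = np.array([0.95047, 1.0, 1.08883])
rgb_white = xyz_to_srgb(white_D65)
```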

Just for a sanity check, I ran some tests through my colorimetry code, to print out XYZ and RGB and chromaticity for some standard spectra:

CODE
Illuminant A:
XYZ = 24.9524822235107, 22.7151050567626, 22.7151050567626
RGB = 41.9157028198242, 18.7636756896972, 5.29991245269775
xy = 0.447566837072372, 0.407435506582260

Illuminant D65:
XYZ = 21.1444530487060, 22.2462940216064, 22.2462940216064
RGB = 22.2496452331542, 22.2456779479980, 22.2425174713134
xy = 0.312734514474868, 0.329031169414520

Solar Spectrum:
XYZ = 405.107116699218, 417.494018554687, 417.494018554687
RGB = 456.046386718749, 408.477874755859, 393.248840332031
xy = 0.323091745376586, 0.332970887422561

Illuminant E:
XYZ = 0.224980384111404, 0.224962189793586, 0.224962189793586
RGB = 0.271081715822219, 0.213312312960624, 0.204518526792526
xy = 0.333313554525375, 0.333286613225936


These seem to check out; at least the chromaticity values are right. Illuminant E and pure sunlight (in space) both come out just slightly reddish. D65 is a better approximation of sunlight at sea level.

I would still claim that "true color" means you don't get to fiddle with anything to make the image look subjectively "better". You can't change the white point or gamma, or try to make Saturn look like it's being lit by a giant 6500 K tungsten lamp instead of the Sun. You just have to process the spectrum into XYZ and then into sRGB and show that to people. It's up to them to make sure their monitor is sRGB compatible; that's not our problem.