Right, a suggestion I made here in http://www.unmannedspaceflight.com/index.php?s=&showtopic=3125&view=findpost&p=66503 made me wonder: why not try that myself? A bunch of data was sitting on the PDS, after all. After some hassle figuring out just how the image cubes are organized and how to read them, I was finally able to produce some results. This is all very rough work; consider it first-iteration only and not particularly accurate.
Basically, I used the cubes to extract the visible spectrum in the 380-780 nanometer range, which I then fed into color matching code by Andrew T. Young that I found at http://mintaka.sdsu.edu/GF/explain/optics/color/color.html.
The code integrates over 40 10-nm steps to produce CIE XYZ color components. I then converted these to RGB values.
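For anyone curious, the integration step amounts to a weighted sum of the spectrum against the three color matching functions. A minimal Python sketch (not Young's actual code; the function name is mine, and real CMF tables would replace the placeholder arguments):

```python
def spectrum_to_xyz(spectrum, cmf_x, cmf_y, cmf_z, step_nm=10.0):
    """Approximate the CIE integrals by a Riemann sum over equally
    spaced samples (here 40 samples, 380-780 nm in 10 nm steps).
    All four inputs are equal-length sequences sampled at the same
    wavelengths; returns unnormalized XYZ tristimulus values."""
    X = sum(s * x for s, x in zip(spectrum, cmf_x)) * step_nm
    Y = sum(s * y for s, y in zip(spectrum, cmf_y)) * step_nm
    Z = sum(s * z for s, z in zip(spectrum, cmf_z)) * step_nm
    return X, Y, Z
```

The XYZ triple then goes through a 3x3 matrix to reach RGB, as discussed further down the thread.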
I'm aware of at least three inaccuracies in my code so far. One is that the above-mentioned code apparently uses Illuminant C as the light source, not the true solar spectrum, so the color turns out bluish (it has a temp. of 9300 K instead of 6500 K, AFAIK). I tried to compensate for now by changing the final RGB white balance, but this is probably an inaccurate way to go. Another inaccuracy is that I don't do bias removal on the cubes, which likely affects the outcome. Also, I don't use the precise wavelengths the code requires, but the closest ones available in the cube; I intend to fix this by interpolating between the nearest wavelengths.
All images are enlarged 4x.
It's a really great "reality check" for the ordinary ISS images.
Very nice work, gordan... now we see the colors!
Only one observation: in the second 4-cube image, you say that the dark, elongated feature in front of Saturn is Dione's disc. To me, it is evident we are seeing Dione's shadow projected on Saturn's atmosphere: this would easily explain why it is so dark and has an elongated/deformed shape... don't you agree?
It looks as if the colors match Voyager's colors a lot more. The images of Titan look like the images Voyager took.
How do you convert XYZ to RGB? Are you making sRGB? Here's the matrix for that transform, just in case:
Yes, I'm using that XYZ to RGB matrix. I had to also use a scaling factor for converting Illuminant C to D65, X*0.969459, Y, Z*0.922050, prior to the RGB conversion. Only that gave me white for uniform reflectance. The RGB output is linear, I manually adjusted gamma to 1.25 in Photoshop.
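For concreteness, those two steps (the crude Illuminant C to D65 scaling quoted above, then the standard linear sRGB matrix) can be sketched like this; the matrix coefficients are the usual IEC sRGB ones, the scaling factors are the ones quoted in the post:

```python
def c_to_d65_scale(X, Y, Z):
    # crude Illuminant C -> D65 white-point scaling quoted above
    return X * 0.969459, Y, Z * 0.922050

def xyz_to_linear_srgb(X, Y, Z):
    # standard XYZ -> linear sRGB matrix (D65 white point)
    r =  3.2406 * X - 1.5372 * Y - 0.4986 * Z
    g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
    b =  0.0557 * X - 0.2040 * Y + 1.0570 * Z
    return r, g, b
```

A quick sanity check: feeding the matrix the D65 white point (X = 0.95047, Y = 1.0, Z = 1.08883) should give R = G = B = 1.0 to within rounding.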
I think I have bias subtraction working now (I'm still unsure if it's correct), but the flatfielding still seems suspicious. There are cubes with excessive vertical banding that the flat fields don't seem to touch. I'm also using a 40-point solar spectrum derived from the ISS calibration code. Somehow, simply dividing by the provided solar spectrum cube gives a yellowish result, when in fact it should be roughly the same. I'm now interpolating the discrete spectral wavelengths to get a nice 380-780 nm range.
Here are the most recent results, this time enlarged 3x:
Mimas, Dione, Rhea, Hyperion, Iapetus
Jupiter still seems too greenish somehow. The rings definitely appear to have a distinct yellow hue to them, unlike the colorful, bluish ISS releases.
Very good. Gamma 2.2 is technically what should be done, to get the image we see on the monitor to be proportional to actual scene radiance. What is the camera's radiometric response function like? Are the pixel values in the VIMS file logarithmic?
As for dark current, making the background be 0 black should be right. But check the camera response. Subtracting background before or after linearization of the pixel values gives a totally different result if the camera response is nonlinear (and it probably is).
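A toy numerical example of why the order matters, assuming a hypothetical square-root camera response (the numbers are made up purely for illustration):

```python
def encode(linear):
    # hypothetical nonlinear camera response (square root)
    return linear ** 0.5

def decode(dn):
    # inverse of the response above (linearization)
    return dn ** 2.0

scene, dark = 0.5, 0.1
raw = encode(scene + dark)           # what the sensor reports
right = decode(raw) - dark           # linearize first, THEN subtract
wrong = decode(raw - encode(dark))   # subtracting in DN space first
```

With these numbers, `right` recovers the true scene value 0.5, while `wrong` lands well below it; for a truly linear camera the two orders would agree.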
I'm not sure about the Illuminant C business; I'm not sure I understand what you did. The sRGB matrix assumes a white point of D65, which all new monitors are supposed to use. Don't think about correcting the color on the camera end, just concentrate on linearizing its response. Just plug those spectral radiance values into the XYZ integrals, convert to RGB by matrix multiply, and apply a gamma of 2.2. And then you should see what the camera saw!
The pixels are linear as far as I know, a 12 bit A/D converter is used. I think a precommanded bias value might be set to get the values out of the converter's low nonlinearity portion, or it might simply be a matter of subtracting dark current. Either way, I used the background algorithm found in the calibration package and then subtracted it.
As for Illuminant C: as I said, the integration code I used produces a bluish white if I pass it a uniform input reflectance spectrum. When I used the regular CIE XYZ color matching functions, the colors turned out weird; that'll need looking into. At the moment, using the supplied 40-step integration tables and the scaling factors found at http://www.brucelindbloom.com/index.html?Eqn_ChromAdapt.html for Illuminant C -> D65 got me an exact white for the test input. This can be double-checked with Mimas, Dione and Enceladus, which are all very gray and sort of provide "ground truth" for the white point.
Regarding gamma, if I set it to 2.2, the colors are all washed out and all contrast is lost. Somehow that doesn't seem right? Even with a 1.25 gamma, Hyperion's reddish tint is pretty much indiscernible.
The pixels in the ISS images are not saved linearly. (The sensor is reasonably linear, but they don't save the data that way.) You have to use a custom lookup table to linearize those images. Maybe that's why you get strange results with VIMS too. It looks to me as if you are treating the raw data as linear, and I don't think it's saved that way. (I think they save it almost logarithmically to increase dynamic range.) That's why you get an image that looks as though it has already had a gamma correction applied to it. (I might be wrong here...)
Here is the ISS data conversion table:
http://pds-rings.seti.org/cassini/iss/COISS_0011_DOCUMENT/REPORT/iss/navcontent/appf/12-8-12.pdf
I'm not sure if they use something similar on the VIMS data, but my guess is that they do.
ISS calibration report:
http://pds-rings.seti.org/cassini/iss/COISS_0011_DOCUMENT/REPORT/
info and software for calibrating VIMS:
http://pds-rings.seti.org/cassini/vims/
/M
I am very well aware of the ISS data specifics. While it is mostly the case that a LUT was used to convert linear (more or less, barring uneven bit weighting) 12-bit DNs to 8-bit, approximately square-root encoded numbers, it is not always the case as you suggest. Sometimes the full 12 bits were returned. The third case is a 12->8 bit conversion that returns only the lower 8 bits; this is also linear. In any case, my ISS code takes care of all that.
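Schematically, decoding the three downlink cases might look like this. This is only a sketch: the real companding table is the one published in the ISS calibration report, here approximated by an analytic square-law inverse, and the function names are mine:

```python
def approx_inverse_lut():
    # rough stand-in for the mission's 12->8 bit companding table:
    # maps an 8-bit code back to an approximate linear 12-bit DN
    return [round((v / 255.0) ** 2 * 4095) for v in range(256)]

def decode_iss_sample(value, mode, lut=approx_inverse_lut()):
    if mode == "full12":   # full 12 bits returned: already linear
        return value
    if mode == "low8":     # lower 8 bits only: linear but truncated
        return value
    if mode == "lut8":     # ~square-root companded to 8 bits
        return lut[value]
    raise ValueError(mode)
```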
However, I found absolutely no mention anywhere of lookup tables used in the VIMS instrument; in fact, only mentions of lossless 12-bit encoders. The calibration code I checked doesn't use such conversions anywhere. It would be logical to save the data via a square-root LUT if they were downsampling it, but everything points to that not being the case.
The raw data is linear which can be seen from the calibration steps used in the provided code:
1. subtract dark background
2. divide by flatfield
3. divide by solar spectrum
4. multiply by detector performance as function of wavelength
5. convert into radiometrically correct units (optional)
Steps 1-4 should already yield accurate colors since I'm not interested in exact units.
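Per spectral sample, steps 1-4 reduce to simple arithmetic. A schematic sketch (the function name and the test numbers are made up):

```python
def calibrate_sample(raw, dark, flat, solar, response):
    # 1. subtract dark background   2. divide by flatfield
    # 3. divide by solar spectrum   4. scale by detector response
    return (raw - dark) / flat / solar * response
```

In practice this runs over every sample of every spectral band in the cube; step 5 would just be one more multiplicative constant.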
I think the strange piecewise function has its benefits. It gives slightly better coverage in the dark regions when handling 8-bit images...
It would be fun to paint spheres in the colors you have derived, put them in a dark room with a dim light and just wait and see...
/M
One concept is to make an image that resembles what would be seen by a human being, there in space, looking through a window. As long as the colors are inside the gamut of your display, it is theoretically possible to achieve this:
1. Convert 12-bit VIMS values to linear radiance.
2. Correct for dark current
3. Apply gain flatfield correction
4. Apply spectral-sensitivity gain correction
5. Integrate against CIE observer functions to get XYZ
6. Convert XYZ to RGB
7. Apply global gain adjustment for desired image contrast.
8. Encode pixel channels, for example, iRed = int(pow(R, 1.0/2.2) * 255 + 0.5)
Gamma has a big effect on the color, so if 2.2 looks drastically wrong, I would double check the process. You don't want to do everything carefully and then fudge gamma in photoshop. I bet there is just a small problem somewhere, and if you find it, the image will come out beautifully.
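Step 8 above is the only nonlinear step in the pipeline; as a sanity check, here is a straight implementation of that formula (clamping added for safety):

```python
def encode_channel(linear):
    """Gamma-2.2 encode one linear channel in [0, 1] to 8 bits,
    as in step 8 of the pipeline above."""
    linear = min(max(linear, 0.0), 1.0)
    return int(linear ** (1.0 / 2.2) * 255 + 0.5)
```

Note how strongly it lifts midtones: a linear value of 0.5 encodes to 186, not 128, which is why skipping (or fudging) the gamma step changes both contrast and apparent color so much.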
A mosaic of Saturn's northern latitudes:
Appearance of Jupiter and Saturn at similar phase angles:
Two shots of the lit side of the rings:
The rings in the left image are clipped off, it's not Saturn's shadow. The narrow, white ring is the F ring. The dark area in the middle image is Saturn's shadow.
Several narrow slices of the unlit side mosaicked together:
The unlit side appears less dull-brownish and more bluish, possibly because the ice particles forward-scatter blue light more strongly.
Just checked out your gallery, ugordan- your processed Cassini images are truly glorious. Great work!
I also really like what you are doing. I hope you don't feel I am being discouraging. I am poking at the gamma issue, because I think you might have a bug someplace. Track it down, and then you will have rigorous "true color".
A couple of rough Saturn mosaics:
I think it looks great. Very subtle beautiful colors.
Keep em coming!
/M
A few Jupiter images, fixed the greenish hue. It was due to my code subtracting out dark background when apparently it was already subtracted from the Jupiter flyby cubes.
The middle and right images were taken not far apart in time; the rightmost image is much bigger because it used a high-resolution mode developed in flight, which increases the spatial resolution 3x at the cost of slightly lowering the S/N ratio. The leftmost image is also hi-res, but was taken long before closest approach.
Outbound crescent, 2 cube mosaic:
A couple of Europa transits:
It's a shame the transit on the right didn't catch more area to the left of Jupiter's limb; there was a great view of Io and Ganymede there (http://space.jpl.nasa.gov/cgi-bin/wspace?tbody=599&vbody=-82&month=1&day=2&year=2001&hour=10&minute=00&fovmul=1&rfov=2&bfov=30&showsc=1). The rightmost image is actually very close in time to an ISS image (http://www.flickr.com/photos/ugordan/224555055/) taken on January 2nd, 2001.
Again, all images magnified 2x and gamma-corrected.
The vertical stripes (the most noticeable being a blue line) in some darker shots... I've checked and double-checked the flatfields and the dark background removal code, and none of them even touches those. There's a load of them in the raw data and they appear to be static, so those are probably hot pixels on the CCD. Since the cubes are read out one line at a time, in push-broom mode with the spectrum of the line split along the vertical dimension, the same samples in all lines of a given spectral band will suffer from the same problem; hence the vertical lines. What I don't understand is why adequate flatfields/dark current models haven't been produced to deal with this. It only affects the visible channel; the IR channel seems fine (note the VIS and IR channels are actually two different physical parts). So far, there's nothing I can do about it.
The code is an ad-hoc implementation in C. It's rudimentary and messy, as I progressively modified it from simply reading the cubes to doing more complex stuff like color matching. It doesn't even output a nice PNG color image, but a raw 64x64 pixel dump with 3 interleaved RGB channels.
I guess you have to do an additional flatfield removal using a homebrew flatfield. Try simply taking one row of an image of black space, scale it to the width of the image, subtract it from the data and see what it looks like.
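That suggestion amounts to per-column background subtraction; a sketch, assuming you can average several black-space rows to beat down cosmic-ray hits (not VIMS-specific code, names are mine):

```python
def destripe(band, dark_rows):
    """Estimate a per-column background from rows of black space
    and subtract it from every row of the band image."""
    ncols = len(band[0])
    col_bg = [sum(r[c] for r in dark_rows) / len(dark_rows)
              for c in range(ncols)]
    return [[v - col_bg[c] for c, v in enumerate(row)] for row in band]
```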
/M
Great work! As for the calibration data, it could be that they are still working on it; a lot of these things weren't worked out at the time of the Jupiter encounter. It might be worth contacting someone to find out.
It was a typo; I actually meant dark current. Mixed them up in my head for some reason. (That's why I suggested you use a bit of black space to serve as a dark current remover.)
Although the noise stays fixed in position, it changes intensity -- it's likely tied to exposure duration, so it's not likely to be as easy as subtracting one generic dark cube from all cubes. I don't think a small bit of dark space would do the trick either, as there's also a fair bit of cosmic ray noise induced; you'd need to average many dark lines. There are many dark space cubes that weren't even read out as a full 64-width swath, so they're of limited usefulness. Additionally, the physical resolution of the CCD is AFAIK 192x96 pixels, with 192 being one spatial line and 96 being the visible spectral channels. All cubes have a maximum resolution of 64 spatial pixels. The normal mode of operation simply sums three pixels in a line to get 192/3 = 64 pixels. The high-res mode, however, doesn't sum the individual pixels, and that's where the 3x resolution increase comes from. This is where it gets a bit complicated: which 64 pixels out of the 192 are read out depends on the offset factors and is somewhat clumsy. There's a fair margin for error if you don't know exactly what you're doing, and the PDS document isn't all too clear on certain aspects.
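The normal-mode summing described above is simple to sketch (hi-res mode would skip the summation and instead select a 64-pixel window determined by those offset factors):

```python
def normal_mode_line(physical_line):
    """Sum each group of 3 physical pixels: 192 -> 64 spatial samples,
    as in the normal spatial mode described above."""
    assert len(physical_line) % 3 == 0
    return [sum(physical_line[i:i + 3])
            for i in range(0, len(physical_line), 3)]
```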
Which brings me to the point -- if this is as simple as we suggest, why didn't the VIMS folks do this in the first place? This noise problem must have been known at least since Saturn orbit insertion. IMHO, it's not up to us to fix the calibration and produce our own files that do basic stuff like dark current removal.
Of course, all this is barring I haven't screwed up in my code somewhere, but I'm more and more certain that's not the case.
Not particularly related to my processing, but here are a couple of past releases by the VIMS team of http://vims.artov.rm.cnr.it/data/res-jup.html and http://vims.artov.rm.cnr.it/data/res-tit.html. The Jupiter page shows the same image as my bottom 2nd image from the right. Their image was mirrored left-right, as Cassini never saw Jupiter from that vantage point. The dark spot on Jupiter's disc is not Io's shadow as they suggest; it's http://space.jpl.nasa.gov/cgi-bin/wspace?tbody=599&vbody=-82&month=1&day=2&year=2001&hour=08&minute=30&fovmul=1&rfov=2&bfov=30&showsc=1. They used a simple 3-channel RGB composite, but it's notable that there's some vertical banding present in their data as well. It's not the same noise we were discussing here, though; it's more likely an artifact of uneven scanning-mirror movement, with different lines unevenly exposed, hence the brightness gradient.
The Titan page I bring up because it does show the ugly vertical lines -- note the false color composite labeled "VIS channel". Seems they were unable to remove it as well.
Here are a couple of... hm... questionable(?) results.
First, a wholly unremarkable image -- three narrow strips taken during Cassini's second Venus flyby:
Their relative spacing isn't meaningful, it's just 3 cubes stacked together. Magnified 3x.
I don't know exactly where VIMS was pointed at the time, but my best guess by looking at the PDS data is that it was practically nadir, seeing near-equatorial latitudes. This was near C/A, distance was around 8800 km to Venus' center (it appeared huge in the FOV so no cloud details could be seen here - the vertical scale was about 2 degrees latitude on Venus) and the phase angle was about 90 degrees. That's why I presume the three images show the terminator at right and dayside at left.
I was under the impression Venus was yellowish-white, but this turned out bluish. Note that VIMS was far from its normal operating temperatures so maybe the sensitivity was influenced and the results could be way off. The exposure used was only 50 ms, as opposed to exposures at Saturn being typically 5000 ms and more. Any calibration uncertainties would likely be exaggerated here.
I'd be interested to hear what Don thinks about this, being a Venus expert and all...
EDIT: Again, referencing to an official http://vims.artov.rm.cnr.it/data/res-ven.html page, they also got a bluish Venus, though using only simple rgb channels. That's also the reason their image turned out so noisy.
Second, a Moon image, super-resolution view of 4 cubes stacked together:
Again, I'm surprised by the way the image turned out. I'd expect the Moon to turn out gray, but this is more like that Mariner spacecraft composite of Earth and the Moon. Or some of the Apollo shots. The image in the middle is a Solar System Simulator view, blurred to get the same effective resolution. The rightmost image is the same simulated image, unblurred.
The VIMS composite was magnified 4 times. Yes, four! You can really see how VIMS traded off spatial for spectral resolution.
UGordon, first off, this is an amazing project you have undertaken! I'm really impressed with your results so far. It is a perfect use of the VIMS dataset, especially given the large number of vastly different targets.
As to the VIMS cube to color conversion, I think you may be better off to start fresh rather than using the conversion code by Andrew Young. Some of the hue problems you described might be explained by his 'weighting' of the color matching functions with Illuminant C. It is a bad approximation of direct sunlight, but more importantly, I'm fairly sure his use of it as a weighting for the color matching functions (and your scaling of it, to D65) actually changes the hues arbitrarily.
From my understanding of spectrum-to-XYZ conversion, the best way to handle the calculation is to integrate (http://en.wikipedia.org/wiki/Newton-Cotes_formulas would be a good method) your spectrum directly through the CIE Color Matching Functions (http://www.cvrl.org/cmfs.htm, CIE 1931 2-deg modified by Judd (1951) and Vos (1978)). That will give you XYZ coordinates appropriate to whatever you deem the source lighting to be. Then, in a separate step, you convert from source XYZ to destination XYZ (D65). I wouldn't use plain XYZ scaling (http://www.brucelindbloom.com/index.html?ChromAdaptEval.html) to handle this conversion, since it changes the hues inappropriately and doesn't change the luminance at all. You might want to look into http://www.brucelindbloom.com/index.html?Eqn_ChromAdapt.html (the Bradford or Von Kries methods) as an alternative.
What you choose as a source white point is another matter entirely. It depends on what you want your true color to represent. Here are two scenarios:
1) An astronaut is sitting inside a ship, looking out the window and sees the scene. In this case, the viewer would be adapted to the internal lighting of the ship and the lighting of the scene would have little to no effect on the way it is perceived. If this is the goal, no chromatic adaptation is necessary. You can simply convert the original XYZ values directly to D65 according to the matrices listed earlier in this thread.
2) An astronaut is floating in space, with only the VIMS scene visible to them (or, in a similar scenario, a person is seeing the VIMS scene as if it were through the eyepiece of a very powerful telescope).
If you are aiming for #2, then you can use the Bradford or Von Kries method for adapting into D65 from whatever you deem the source white to be.
Thanks, Slinted! I'll certainly have a look at the Bradford method and compare the results, even though its conversion matrix seems very similar to the XYZ scaling's. The planetary targets are mostly dull, so I figure the different methods won't produce radically different results. I already tried the normal XYZ CMFs, but got weird results whether or not I removed the solar spectrum from the data. My sort of ground-truth check on color is Enceladus, as well as Mimas and Dione. Enceladus is basically all white, and if I don't get 1.0/1.0/1.0 as RGB intensities I assume the result is bogus. Currently, it fits neatly, with all three coming out nice and neutral, and with Dione's dark-stained side ever-so-slightly greenish, which as I understand is correct.
What I don't understand, though, is how the heck neutral Enceladus turns white with my current code and the Moon doesn't. If they both have a flat spectrum, they should both turn out the same color. Mimas did, it's gray and though it's substantially darker than Enceladus, it made no difference to the code.
The Moon result reminds me of shots like http://photojournal.jpl.nasa.gov/catalog/PIA00342, http://161.115.184.211/teague/apollo/AS15-88-12004.jpg and http://161.115.184.211/teague/apollo/AS15-88-12011.jpg for example (last two ripped off from http://www.apolloarchive.com). I haven't looked further, but there are other similar ones as well.
Is it even remotely possible the Moon is pale brownish and it's because of great brightness difference between it and the dark sky that it appears gray and washed out to us on Earth? Don Mitchell's http://photos1.blogger.com/blogger/1792/1563/1600/Palette.jpg would certainly suggest so.
Another image, which is more of a demonstration of color than a real image. For all practical purposes VIMS couldn't resolve the Galilean satellites; they were merely pixels when viewed in normal resolution mode. The spectrum is there, however. This composite of Io, Europa and Ganymede is very, very, very enlarged, and it's only there to show the color. The cubes were taken at about 10 million km.
It is true that Judd and Vos made improvements to the color matching functions. But I don't believe those changes are part of the XYZ color values used in the television and sRGB standards. So I think you want to integrate against the CIE 1931 matching functions, not the Judd or Vos versions.
Why is there a white-point correction? Are you calculating an image of the reflectance function, or an image of what would be seen by an observer in space?
Reflectance is independent of the illuminant. It should just be the ratio of reflected/incident spectral density.
sRGB is defined with a white point of D65. That means that sunlight should give you very close to RGB = 1.0, 1.0, 1.0. A flat spectrum would be illuminant E, but you don't want to make that your white point.
I don't think you should be doing any of that division by spectra and whitepoint transformation. The VIMS camera is giving you the information to calculate absolute spectral density values. Just project those onto the matching functions to get XYZ, and transform that to sRGB, and you're done. That will give you what a human observer would see in space.
Just for a sanity check, I ran some tests through my colorimetry code, to print out XYZ and RGB and chromaticity for some standard spectra:
[attachment=7578:attachment]
There's a brown Moon for you, taken by Galileo. I doubt if the image was created with any fancy colorimetric processing, but it is amusing.
Color perception is affected by viewing conditions. Among other things, your brain does a sort of automatic white-point compensation -- adaptation. Everyone has noticed how bad color photographs look when they are taken under indoor lighting, yet to our eye the lighting never seemed that bad.
For looking at dyes and fabrics or doing graphic arts, people buy expensive precision lighting, typically D50 lamps that meet ISO standards. Monitors generate their own light, so it's not as critical to have D50 lamps, except for the secondary effect of your visual system's color adaptation. Here is a nice article I found by an artist, about how he set up his workspace: http://www.kevinmillsphoto.com/Articles/OfficeLightingArticle.html.
However, sRGB is well defined. If you are synthesizing an image or if you have spectral sensor data, then there is one unique standard thing to do, calculate 24-bit sRGB. All the messy stuff happens when people view it, but that is someone else's problem.
JRehling, those are all valid, good points. I said from the beginning true color is subjective. Heck, I even put "true" in quotes to emphasize that. One can still try, though...
Right, so here are a couple of checks using the regular CIE XYZ 1931 2-deg color matching functions. If I pass them a unity spectrum, the result is xy = 0.333, 0.333. The sRGB color comes out slightly reddish, as you suggested. That checks out with your result.
However, if I pass the solar spectrum (derived from the Cassini ISS calibration volume) I get xy = 0.3496, 0.3526, and the sRGB comes out fairly reddish. This obviously differs from your xy = 0.32309, 0.33297; why that is, I'm still trying to figure out. Even if it did match your result, you say it would still turn out reddish. Obviously, if I use this on Enceladus, it too turns reddish. How accurate is that, especially taking into consideration that the eye will adapt to the reddish color of the sun's illumination and perceive Enceladus as white? The image on the screen will remain reddish, though.
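(For anyone following along, the xy chromaticity being compared here is just the tristimulus values normalized by their sum:)

```python
def chromaticity(X, Y, Z):
    # CIE xy chromaticity: normalize tristimulus values by their sum
    s = X + Y + Z
    return X / s, Y / s
```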
Are you saying I should drop 255,255,255 as being white but settle for 255,253,248 (example) and hope the calibrated monitor setting will turn that into visually apparent white?
Good points made by all,
Part of my confusion on this might be coming from a misunderstanding about what the dataset is providing. Are the VIMS cubes in spectral power, or reflectance? (I apologize if I'm using these terms incorrectly.) Has the illuminant spectrum already been removed?
If the VIMS data is in spectral power, then I would say that chromatic adaptation is still necessary since, as Don said, D65 isn't solar spectrum. It is what we see here on Earth.
If it is a reflectance dataset, then wouldn't that spectrum have to be multiplied by the spectrum of the D65 illuminant before being passed to the CIE color matching functions?
VIMS cubes, in their raw form, capture the brightness of the source. During calibration, the official code lets you choose to divide by the solar spectrum so you get the reflectance (I don't know if I'm using the term correctly; I'd say you get the "intrinsic color" of the target). As Don said, if you leave the solar spectrum in and simply integrate the CMFs, you get a white point that is reddish, i.e. it isn't 255,255,255 in RGB. Young's code worked on this intrinsic color and assumed an Illuminant C (not solar) illumination on it. He was, I think, interested in what color the materials would be here on Earth, so that's where C came from. That's why I had to convert to the D65 white point to get a 255,255,255 white. I've now dropped his code. What I do now is: I still divide by the solar spectrum, multiply by the D65 spectrum (you can call this chromatic adaptation, but it's not a simple scaling as before; a full spectrum is applied, so it should be more accurate) and integrate the CIE functions. This eliminates the need for the XYZ scaling afterwards. The results are virtually indistinguishable from the previous method; nevertheless, I'm more comfortable now as I understand what I've actually done:
Left side: solar spectrum divide -> Young's algorithm -> XYZ scaling
Right side: solar spectrum divide -> D65 spectrum multiply -> CIE functions
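The relighting step in the right-hand pipeline is just an element-wise operation on the spectrum (a sketch; all three spectra are assumed sampled on the same wavelength grid, and the function name is mine):

```python
def relight(measured, solar, d65):
    """Divide out the solar illuminant, then relight with D65:
    measured / solar gives reflectance, times D65 gives the
    spectrum 'as if' the target were lit by D65."""
    return [m / s * d for m, s, d in zip(measured, solar, d65)]
```

If the measured spectrum equals the solar one (a perfectly white target), the output is exactly the D65 spectrum, which is why this yields a neutral white point without any XYZ scaling afterwards.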
I can understand the reasoning of not touching the illumination, not removing solar spectra from the cubes. The problem with that is solar spectrum turns out reddish and everything has a distinct red hue. Here are a couple of comparisons, left images show no spectrum messing, simple integration through the CIE XYZ functions.
I suppose if you stared long enough at scenes like this, the eye/brain would automatically adapt to the color and make a new white-point, making Enceladus (uppermost left image) appear white again. The problem is, this is not gonna happen on a computer screen.
You be the judge of what looks more reasonable.
Be my guest. For starters, here's a dump of the 64x64 Jupiter cube. The wavelengths are stacked top to bottom; there are 95 of them, in 5 nm increments starting at 360 nm. They were linearly interpolated from the VIMS cube, as it has discrete wavelengths spaced 7-8 nm apart on average. The dark background was removed (actually, that was done onboard), the flatfield applied and the detector performance corrected for. The samples are linear.
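The interpolation onto the regular 5 nm grid can be sketched as plain linear interpolation (a sketch assuming both wavelength lists are ascending and the grid lies within the sampled range):

```python
def interp_to_grid(wavelengths, values, grid):
    """Linearly interpolate samples at irregular 'wavelengths'
    onto the regular 'grid' of output wavelengths."""
    out, j = [], 0
    for g in grid:
        while wavelengths[j + 1] < g:
            j += 1
        w0, w1 = wavelengths[j], wavelengths[j + 1]
        t = (g - w0) / (w1 - w0)
        out.append(values[j] * (1.0 - t) + values[j + 1] * t)
    return out
```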
For Jupiter and Saturn, I am biased toward the right hand images based on experiences with a telescope, but this brings in a whole new host of visual issues...
I've been experimenting with cleaning up the VIMS visual channel a bit. I've found a cube that captures the night side of Titan with the entire 64x64 hires field of view that served as an augmented dark background to remove the pesky noise. It turned out better than I hoped. Unfortunately, the noise doesn't seem to be as static as I thought. Most of the noise is removed, but some isn't.
To illustrate just what mess the VIMS cubes are (including their calibration data), here are a couple of before and after images:
The bottom image is one of the highest resolution views of Iapetus by VIMS. The noise is disastrous. You can actually see it's mostly coincident with the ring image above.
My dark current removal model assumes noise is linear with exposure, this seems to work well. The background will need some touching up still.
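That model is just a per-pixel linear scaling of a reference dark frame by the exposure ratio; a sketch under exactly that assumption (names and numbers are mine):

```python
def model_dark(dark_ref, exp_ref_ms, exp_ms):
    """Predict the dark frame at exposure exp_ms from a reference
    frame taken at exp_ref_ms, assuming dark counts grow linearly
    with exposure time."""
    k = exp_ms / exp_ref_ms
    return [[v * k for v in row] for row in dark_ref]
```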
I emailed one of the guys responsible for the PDS volumes, he said he'll look into it and get back to me. We shall see, though I'm not getting my hopes up.
Sure enough, I got a response from the VIMS team. They had this to say:
It probably boils down to too-much-to-do, not-enough-techies, not-enough-grad-students...
The sort of thing that pennypinching does to a live mission. They'll have time for a systematic beginning-to-end calibration effort when the mission is over.
To reiterate what you said, on my asking if updated calibration data will be provided, he said that will happen at the end of Cassini mission. An earlier update may be possible, but not likely.
Guess that means I'm stuck with homebrew dark current models <sigh>...
Here are a few quick results from the new PDS release. Not that much interesting stuff in there this time. These don't have the sRGB correct gamma of 2.2 because it just looks too washed out to me and it really brings the nasty vertical noise out. I chose a gamma of 1.33 instead. Magnified 2x.
The first image is coincident with an http://static.flickr.com/96/256542004_3b202e1c5e_o.jpg. The 3rd image from the left shows the F ring, the 4th shows Titan's north pole at high phase. The Saturn mosaic has one of the moons visible at the bottom - might be Dione/Tethys/Rhea, I haven't checked. The last image was taken during that Titan flyby where we had Saturn emerging from behind Titan's limb. You can barely see part of Saturn's crescent there, too bad the view didn't capture more of it.
First test at processing a VIMS cube (gamma 2.2):
v1509053994_1
color
Hey folks, the Rings Node has added preview images for Cassini VIMS cubes to their search page (http://pds-rings.seti.org/search/). The attached screen cap shows what these look like for a colorful object, Iapetus. See http://pds-rings.seti.org/cassini/vims/COVIMS_previews.txt for details. This will make it much easier to browse for good VIMS data!
A superb piece of work at TPS but we have to have it here too.
http://www.planetary.org/image/2383059379_c56b0272cb_o.png
Believable true colour is a great gift. Thank you Gordan.