Unmanned Spaceflight.com _ Image Processing Techniques _ Creating images by interpolating between filters?

Posted by: EDG Oct 24 2010, 03:19 AM

This comment (http://www.unmannedspaceflight.com/index.php?showtopic=6775&view=findpost&p=165519) on my Ganymede mosaic intrigued me, and I just want to check something here.

It sounds like ugordan's software can just linearly interpolate between (say) a Green filter and a Violet filter image to create a (hopefully reasonably accurate) simulation of what a Blue filter image might look like - and then doing the same with other filters means that a simulated "true colour" image can be constructed.

e.g. (use http://pds-rings.seti.org/voyager/iss/inst_cat_na1.html#filters as a reference). Let's say we had a Green filter image and a Violet filter image of Ganymede where a given pixel had a DN (data number, or 8-bit pixel brightness, from 0 to 255) of 69 in Green and 51 in Violet. Green peaks at 585 nm and Violet peaks at 400 nm, and we want to simulate an image taken with a Blue filter, which would peak at 480 nm. Blue is (480-400)/(585-400) = 80/185 of the way from Violet to Green, which means that the Blue DN of that pixel should be (69-51)*(80/185) = 7.78, or about 8 DN higher than the Violet DN, giving a DN of 59.

So if I create a program that can do that interpolation across the whole image, would the result be a reasonably accurate simulation of an image actually taken with that filter? That assumes of course that the surface's spectrum varies linearly with wavelength between the two filters, which isn't necessarily true. But then, I'm not wanting to do this for scientific analysis or anything - this is just for aesthetic purposes really, to try to construct simulated "true colour" images.
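
Something like this minimal Python/NumPy sketch is what I have in mind (just an illustration, assuming the two frames are already co-registered and their DNs are directly comparable):

CODE
import numpy as np

def interpolate_filter(img_a, img_b, wl_a, wl_b, wl_target):
    # Linearly interpolate a synthetic filter image between two co-registered
    # frames taken through filters peaking at wl_a and wl_b (nanometres).
    # A weight outside [0, 1] would mean extrapolation rather than interpolation.
    a = img_a.astype(np.float64)
    b = img_b.astype(np.float64)
    t = (wl_target - wl_a) / (wl_b - wl_a)   # e.g. (480-400)/(585-400) = 80/185
    synth = a + t * (b - a)
    return np.clip(np.round(synth), 0, 255).astype(np.uint8)

# The worked example from above: Violet DN 51 at 400 nm, Green DN 69 at 585 nm.
violet = np.array([[51]], dtype=np.uint8)
green = np.array([[69]], dtype=np.uint8)
blue = interpolate_filter(violet, green, 400.0, 585.0, 480.0)
print(blue)   # [[59]]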

And could it be possible to extrapolate a reasonably realistic Red filter image by seeing the difference between (say) Green and Orange and extending that further (linearly) into the red?

Looking at http://www.rwc.uc.edu/koehler/biophys/6d.html it seems that the sensitivities of the eye's red, green and blue cones peak at about 558, 531 and 419 nm respectively, so (assuming this approach works) would interpolating to those wavelengths, instead of the ones described in the ISS filter descriptions, actually produce a result that is more like what the human eye would see rather than an eye tuned to Voyager's wavelengths?

Posted by: ugordan Oct 24 2010, 10:55 AM

QUOTE (EDG @ Oct 24 2010, 05:19 AM) *
It sounds like ugordan's software can just linearly interpolate between (say) a Green filter and a Violet filter image to create a (hopefully reasonably accurate) simulation of what a Blue filter image might look like - and then doing the same with other filters means that a simulated "true colour" image can be constructed.

That's not really all I have in mind when I say interpolation. Indeed, if all you want to do is approximate the appearance at a different wavelength, you can do that by mixing channels in Photoshop.

The difference between my code and this simple channel mixing is that I convolve the interpolated spectrum (I think it's in 5 nm increments or something like that, can't check now) with the CIE XYZ tristimulus sensitivity curves. This turns a filter set with wavelengths A, B, C (or just 2 filters, but it can also be more than 3) into their approximate color representation (approximate because I linearly interpolate the missing spectral curve points) in XYZ space. Then a conversion formula is used to turn those values into the sRGB colorspace. This takes better care of colors that are otherwise hard to fit into the monitor gamut than simple channel mixing does.
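
A rough sketch of what such a pipeline could look like (not the actual code described here; in Python/NumPy, with cie_cmf_1931() as a hypothetical stand-in for a table of the CIE 1931 colour-matching functions, and the standard XYZ-to-sRGB matrix and gamma):

CODE
import numpy as np

def cie_cmf_1931(wavelengths):
    # HYPOTHETICAL placeholder: should return the CIE 1931 2-degree
    # colour-matching functions xbar, ybar, zbar sampled at the given
    # wavelengths (shape: len(wavelengths) x 3), taken from the published table.
    raise NotImplementedError("fill in from a CIE 1931 CMF table")

def filters_to_srgb(filter_wavelengths, filter_values, step=5.0):
    # filter_wavelengths: peak wavelengths (nm) of the filters, in increasing order.
    # filter_values: the (calibrated) brightness of one pixel through each filter.
    wl = np.arange(380.0, 780.0 + step, step)
    # Linearly interpolate the sparse filter samples into a full spectrum
    # (np.interp holds the end values flat beyond the outermost filters).
    spectrum = np.interp(wl, filter_wavelengths, filter_values)
    cmf = cie_cmf_1931(wl)                                # (N, 3)
    xyz = (spectrum[:, None] * cmf).sum(axis=0) * step    # integrate against xbar, ybar, zbar
    # Exposure/white-point normalisation is omitted here for brevity.

    # Standard XYZ -> linear sRGB matrix (D65) and sRGB transfer function.
    m = np.array([[ 3.2406, -1.5372, -0.4986],
                  [-0.9689,  1.8758,  0.0415],
                  [ 0.0557, -0.2040,  1.0570]])
    rgb_lin = np.clip(m @ xyz, 0.0, None)                 # clip out-of-gamut negatives
    rgb = np.where(rgb_lin <= 0.0031308,
                   12.92 * rgb_lin,
                   1.055 * rgb_lin ** (1.0 / 2.4) - 0.055)
    return np.clip(rgb, 0.0, 1.0)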

QUOTE
So if I create a program that can do that interpolation across the whole image, would the result be a reasonably accurate simulation of an image actually taken with that filter? That assumes of course that the surface's spectrum varies linearly with wavelength between the two filters, which isn't necessarily true. But then, I'm not wanting to do this for scientific analysis or anything - this is just for aesthetic purposes really, to try to construct simulated "true colour" images.

As I said above, you don't really need a program for that. As long as the spectrum of the object of interest is relatively smooth and doesn't have large excursions in hue, you should be okay. If it does have large variations, like for example a strong blue (probably red as well) component, that's where differences start appearing. By way of the XYZ calculations, a strong blue hue will also affect (suppress) the output sRGB red channel, as the monitor gamut is limited. You don't get that with simple channel mixing. This isn't often the case with planetary bodies, but I've seen it while playing with Phoenix surface imagery. I couldn't get the "proper" color of the color calibration chips with channel mixing based on any "scientific" approach like interpolating target wavelengths.
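
A small numeric illustration of that gamut effect (the XYZ values are rounded, approximate colour-matching values for a roughly monochromatic 470 nm blue; the matrix is the standard XYZ-to-linear-sRGB one):

CODE
import numpy as np

# Approximate CIE 1931 XYZ of a saturated, roughly monochromatic ~470 nm blue
# (values rounded from the standard colour-matching function table).
xyz_blue = np.array([0.20, 0.09, 1.29])

m = np.array([[ 3.2406, -1.5372, -0.4986],   # standard XYZ -> linear sRGB (D65)
              [-0.9689,  1.8758,  0.0415],
              [ 0.0557, -0.2040,  1.0570]])

rgb_lin = m @ xyz_blue
print(rgb_lin)                      # the red component comes out negative (~ -0.13)
print(np.clip(rgb_lin, 0.0, 1.0))   # clipping forces it to 0 - the "suppressed" red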

I haven't worked with Voyager Jupiter images so I can't say if simple mixing can fix the fact that the "green" filter is close to orange - it probably can, but here's an example of what that CIE XYZ interpolation gives as a result for OGB color: http://www.unmannedspaceflight.com/index.php?s=&showtopic=5876&view=findpost&p=162551

QUOTE
And could it be possible to extrapolate a reasonably realistic Red filter image by seeing the difference between (say) Green and Orange and extending that further (linearly) into the red?

Yes, although that increases noise as you're extending the linear segment as opposed to averaging two filters (like for example IR and green to get red). The same is obviously true for my code as well since that extrapolation step is essentially the same. I used that extrapolation from green and violet to get an approximately true color of Titania in this thread: http://www.unmannedspaceflight.com/index.php?s=&showtopic=6351&view=findpost&p=163385
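
A rough sketch of why extrapolation is noisier (Python/NumPy; the 615 nm target below is just an illustrative nominal red, not a specific filter):

CODE
import numpy as np

def weight(wl_a, wl_b, wl_target):
    # Same linear formula as interpolation; t > 1 (or < 0) means extrapolation.
    return (wl_target - wl_a) / (wl_b - wl_a)

def noise_gain(t):
    # synth = (1 - t)*a + t*b, so with independent, equal pixel noise in a and b
    # the noise in the synthetic frame scales by sqrt((1 - t)^2 + t^2).
    return np.hypot(1.0 - t, t)

t_interp = weight(400.0, 585.0, 480.0)   # Blue between Violet and Green
t_extrap = weight(400.0, 585.0, 615.0)   # a nominal "red" beyond Green (illustrative)
print(noise_gain(t_interp))   # ~0.71: interpolation averages some noise away
print(noise_gain(t_extrap))   # ~1.17: extrapolation amplifies it instead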

QUOTE
Looking at http://www.rwc.uc.edu/koehler/biophys/6d.html it seems that the sensitivities of the eye's red, green and blue cones peak at about 558, 531 and 419 nm respectively, so (assuming this approach works)

I'm not entirely sure of the exact wavelengths you should shoot for. I don't believe it's the human eye sensitivities; I think it should be the sRGB gamut corner wavelengths, or even the reference monitor phosphors if they're not the same. If you extrapolate to the human sensitivity peaks and then display the result through sRGB phosphors, you don't really improve on the situation where you had spacecraft-specific filters and sRGB phosphors. Either way, the sets are mutually incompatible.

Posted by: PDP8E Jan 10 2012, 04:45 AM

Based on the above threads and other sources, I've been working on color routines. The sundial is a good place to experiment. So, here is a recent jpeggy, noisy, downsampled sundial from Sol 2814 (I didn't use Photoshop; I have a directory of cobbled-together tools, mainly written in C, plus scripts).


As we all know, the bandwidths of the MER filters are very narrow (about 20 nm), which makes for strongly saturated colors at their peak wavelengths. I am also comparing some PDS-calibrated images against their raw JPEGs. The hope is to get future raw JPEGs to look ... better.

Posted by: PDP8E Jan 16 2012, 01:01 AM

...color experiments continue... Here is Spirit's sundial on SOL 2102 (PDS data)
Spirit was embedded in Scamander Crater. The late afternoon sun and the tilt of the rover made the lighting dramatic.
What really stands out to me is the dust load on the sweep magnet.



