3 Colours From 2 Channels, How to generate synthetic colour
SickNick
post Feb 17 2004, 09:33 AM
Post #1


Junior Member

Group: Members
Posts: 50
Joined: 8-February 04
From: Melbourne, Australia
Member No.: 5



A lot of the panorama shots are coming down as L2 & L7 only (plus some R-filter frames for stereo work).

For those frustrated by the lack of colour images, here's my recipe:

Download the L2 and L7 images. These are Red and Blue. Concoct a synthetic green from (L2+L7)/2. I find a tweak is required to put a tad more green into it, but it's close.

Here is a sample synthetic colour from L2 & L7 only:

[image: synthetic colour from L2 & L7]

And here the equivalent from L2, L5, and L7:

[image: colour from L2, L5 & L7]
Of course, neither is true colour, but we can begin to fiddle - lots less blue, a bit more red, and dial down the saturation a lot...
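
For anyone who wants to try it, here's a minimal sketch of that recipe in Python/numpy. It assumes the two frames are already loaded as matching greyscale arrays scaled 0..1, and the green_tweak default is just my guess at "a tad more green", not a calibrated value:

[code]
import numpy as np

def synth_rgb(l2, l7, green_tweak=1.1):
    """Combine L2 (used as red) and L7 (blue) into an RGB image.

    l2, l7      -- 2-D float arrays of the two frames, scaled to 0..1
    green_tweak -- multiplier to push 'a tad more green' into the average
    """
    green = np.clip((l2 + l7) / 2.0 * green_tweak, 0.0, 1.0)
    return np.dstack([l2, green, l7])  # H x W x 3, red/green/blue
[/code]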


--------------------
- Nick

=====================================
Nick Hoffman, Mars Specialist

3D-GEO Pty Ltd
Melbourne
Australia

http://whitemars.com

"First they ignore you,
then they laugh at you,
then they fight you,
then you win."
- Mahatma Gandhi (1869-1948)
=====================================
jmknapp
post Feb 17 2004, 02:08 PM
Post #2


Senior Member

Group: Members
Posts: 1465
Joined: 9-February 04
From: Columbus OH USA
Member No.: 13



Don't forget that L2 is technically infrared.

These "false color" images are so much more revealing of detail than the "true color" versions, which are basically monotone red. An idea of the proper balance for the different filtered images can be gotten from a graph on page 93 of the Pancam document.

For imaging bright soils, they predict the following exposure times:

L7, R1: 11 seconds
L6: 3 seconds
L5: 2 seconds
L4: 0.7 seconds
L3: 0.6 seconds
L2, R2: 0.4 seconds
R3: 0.5 seconds
R4: 0.8 seconds
R5: 0.6 seconds
R6: 0.9 seconds
R7: 1 second


So putting blue (L7) and infrared (L2) together without scaling overemphasizes the blue by a factor of more than 25 (11 s / 0.4 s = 27.5)!

But I'm not sure that scaling the images by the above factors would be a good solution either.
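
For what it's worth, here is what that scaling would look like as a Python/numpy sketch - dividing each frame by its exposure relative to L2. The frame dictionary and the 0..1 linear float scaling are just assumptions for illustration, and whether this is the right correction is exactly the open question:

[code]
import numpy as np

# Predicted bright-soil exposure times from the table above (seconds)
EXPOSURE_S = {"L2": 0.4, "L5": 2.0, "L7": 11.0}

def scale_by_exposure(frames):
    """frames: dict of filter name -> 2-D float array (0..1, linear DN).

    Dividing by the exposure relative to L2 undoes the extra signal the
    longer-exposed filters accumulate - e.g. L7 gets knocked down by
    11 / 0.4 = 27.5x.
    """
    return {name: img * (EXPOSURE_S["L2"] / EXPOSURE_S[name])
            for name, img in frames.items()}
[/code]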


djellison
post Feb 17 2004, 06:45 PM
Post #3


Founder

Group: Chairman
Posts: 14432
Joined: 8-February 04
Member No.: 1



Ooh - where did you get those figures from?

Doug
DavidVicari
post Feb 17 2004, 07:31 PM
Post #4


Junior Member

Group: Members
Posts: 36
Joined: 9-February 04
Member No.: 14



I think he got it from this 98-page PDF about the Pancam.

http://europa.la.asu.edu:8585/PGG/greeley/...f/bell_2003.pdf
slinted
post Feb 26 2004, 11:53 AM
Post #5


Member

Group: Admin
Posts: 468
Joined: 11-February 04
From: USA
Member No.: 21



The Maestro datasets can provide us with a better average, and some sad news about exposure time ratios. These are rough averages, as the Maestro datasets do not include a great deal of the "odd" filters (3, 4 and 7 are very sparse in the releases made so far), but the .msml files associated with the timestamp of each image contain the full PDS tag info, including what we're looking for.

The exposure times averaged here are normalized (by me) to L2:
L2 - 1
L3 - 1.820843084
L4 - 1.770944075
L5 - 4.507149264
L6 - 7.109196549
L7 - 24.95015542

As an average this is accurate, but it does not take into account one of the most particularly disruptive features of using this to calibrate to "true" color: it varies greatly. As discussed in Maki et al., "Mars Exploration Rover Engineering Cameras" (robotics.jpl.nasa.gov/people/rwillson/papers/2003JE002077.pdf, page 10), exposure time tables are kept in local memory and can be used in place of auto-exposing every filter, once a location's brightness has been determined by a single auto exposure. I'll quote:

3.1.6. Exposure Time Tables
[34] The flight software keeps an onboard table of the most recently used exposure time values for each camera/filter combination and makes these values available for use by subsequent image commands. These exposure time tables are particularly useful when acquiring images of the same general scene in rapid succession (e.g., Hazcam imaging when driving, Navcam/Pancam panorama acquisition, or multispectral Pancam imaging), where the overall lighting level changes from image to image are relatively small. If desired the exposure time table values can be used as seed values in an autoexposure iteration. At the end of the autoexposure iteration the exposure time table is optionally updated with the most recently calculated exposure time for that image.

3.1.7. Exposure Timescale Factors
[35] The flight software also allows exposure times to be multiplied by a user-supplied floating point scale factor. This feature is particularly useful when the absolute exposure time is not known in advance, but the responsivity ratios (i.e., the scale factor) between camera/filter combinations are known. For example, if a Navcam image of the terrain in front of the rover is acquired using autoexposure, a front Hazcam image can be acquired using the previously used Navcam exposure time multiplied by the scale factor representing the ratio of the Navcam/Hazcam camera sensitivities. Similarly, if a multispectral Pancam series begins with an autoexposure using a particular spectral filter, the next image in the series has access (via the exposure time table) to the previously used value and can modify that value by multiplying it by the user-supplied scale factor. The use of the exposure time table and scale factors helps to improve image acquisition speed.

So within a particular site, or camera target (the outcrop, for example), the exposure ratio is fairly stable, but as Spirit has driven around I've noticed this ratio appears to change depending on the amount of rock in the shot, and differs again if the lander is being imaged.


In follow-up to an email I sent to Dr. Justin Maki, he replied: "Yes the exposure time tables are being used extensively, but the values in the tables are dynamic and are automatically updated based on scene content. With respect to the actual values of the exposure times - these values are included in the headers of the image data, which will be released to the general public as part of the archiving process in a few months, per the MER Science team data release policy."

So... until we have those actual exposure times (which differ in the Maestro datasets by as much as 2x for the L2:L7 ratio), it would seem those averages are as good as it gets, at least for those images that aren't themselves included in Maestro.



Another factor entirely that must be taken into account is the CCD chip itself, which has a "quantum efficiency" that differs with wavelength. QE sets, roughly linearly, how much image brightness you get per unit of scene luminosity for each filter (higher values = higher efficiency). Below is a rough estimate, traced off the graph included in Bell et al., "MER Pancam Investigation" (europa.la.asu.edu:8585/PGG/greeley/courses/pdf/bell_2003.pdf), page 87.
L2 0.45
L3 0.33
L4 0.35
L5 0.225
L6 0.15
L7 0.07


I wish the actual values for these were available within the paper; tracing off a graph inside a low-res PDF just doesn't seem right. But with that and the averaged exposure times (the ratios, at least), a pretty good feel for "true" color balance starts to emerge.
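
To make that concrete, here's a rough Python sketch that folds the averaged exposure ratios and the traced QE values into a single balance factor per filter. It ignores filter bandwidth and the solar/dust spectrum entirely, so treat it as a starting point, not a calibration:

[code]
# Averaged Maestro exposure ratios (relative to L2) and QE values traced
# off the Bell et al. graph - both quoted above
EXP_RATIO = {"L2": 1.0, "L3": 1.82, "L4": 1.77,
             "L5": 4.51, "L6": 7.11, "L7": 24.95}
QE = {"L2": 0.45, "L3": 0.33, "L4": 0.35,
      "L5": 0.225, "L6": 0.15, "L7": 0.07}

def balance_factor(filt):
    """Divide a filter's (linear) image by this to pull the raw values
    back toward relative scene radiance: longer exposures and higher QE
    both inflate the recorded brightness."""
    return (EXP_RATIO[filt] * QE[filt]) / (EXP_RATIO["L2"] * QE["L2"])

# e.g. balance_factor("L7") ~= 3.9: even with its low QE, the blue frame
# still comes down about 4x too bright relative to L2
[/code]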

I'm working with those assumptions myself, to do quite the opposite of what this thread discusses. Instead of working backwards from 2 filters, I'm trying to keep to only those sites which include all 7 visible-light filters, extrapolating a spectrum for every pixel and working from that spectrum into color.
MER Multispectral Color Imagery: www.lyle.org/~markoff/
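
For anyone curious, a very rough Python/numpy sketch of the idea, using the six narrow-band left-eye filters L2..L7. The band centres are the published Pancam values, but the response curves used to collapse the spectrum into RGB are stand-in Gaussians I made up, not real colour-matching functions:

[code]
import numpy as np

# Approximate Pancam left-eye filter band centres, L2..L7 (nm)
BAND_NM = np.array([753.0, 673.0, 601.0, 535.0, 482.0, 432.0])
GRID_NM = np.arange(430, 761, 5)  # wavelength grid for interpolation

def gauss(center, width):
    # Stand-in response curve, NOT a real colour-matching function
    return np.exp(-0.5 * ((GRID_NM - center) / width) ** 2)

def spectrum_to_rgb(samples):
    """samples: the 6 corrected filter values for one pixel, L2..L7 order."""
    order = np.argsort(BAND_NM)  # np.interp wants ascending x
    spec = np.interp(GRID_NM, BAND_NM[order], np.asarray(samples)[order])
    rgb = np.array([np.trapz(spec * gauss(c, w), GRID_NM)
                    for c, w in [(600, 40), (550, 45), (460, 35)]])
    return rgb / rgb.max()  # normalize for display
[/code]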
djellison
post Feb 26 2004, 12:19 PM
Post #6


Founder

Group: Chairman
Posts: 14432
Joined: 8-February 04
Member No.: 1



And remember - once you have taken exposure into consideration, there is also the fact that the flat-field/dark-field subtraction takes place on board the rover - followed by a stretching of the image, so in fact they send less than a full 12-bit image down to Earth. We don't know the nature of that stretching either.

They like to play with us, don't they? :D

On the upside, sol 60 on Spirit will mark the one-third point toward the first PDS release of Spirit's first 30 sols of data.

Doug
jmknapp
post Feb 26 2004, 03:32 PM
Post #7


Senior Member
****

Group: Members
Posts: 1465
Joined: 9-February 04
From: Columbus OH USA
Member No.: 13



Plus there's the "quantum efficiency" of the eye itself to consider?

Seems like if, say, the exposure time ratio of L7 to L2 is 25:1, and the corresponding QE ratio is 1:6.4 (0.07/0.45), then the relative amount of photons red:blue in the scene is about 4:1, assuming similar brightness histograms in the images. But wait - the bandpasses of each filter have to be considered too. The bandpass of L2 is 20 nm and L7's is 25 nm, so maybe the ratio slips up to about 5:1.

Even so, how would such a 5:1 red:blue ratio appear to the eye, or, more practically, be reproduced in screen phosphor red/blue levels? Maybe it's just linear?
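
Writing that back-of-envelope arithmetic out in Python (all the numbers from the posts above):

[code]
# Equal recorded brightness means photons x QE x exposure is roughly equal
# across filters, so the scene's photon ratio falls out of the instrument
exp_ratio = 25.0              # L7 exposure time / L2 exposure time
qe_l2, qe_l7 = 0.45, 0.07     # quantum efficiencies, traced from the graph
bp_l2, bp_l7 = 20.0, 25.0     # filter bandpasses, nm

ratio = exp_ratio * qe_l7 / qe_l2     # ~3.9 : 1, red:blue photons
ratio_bp = ratio * bp_l7 / bp_l2      # ~4.9 : 1, per nm of bandpass
print("red:blue ~ %.1f:1, or %.1f:1 per nm of bandpass" % (ratio, ratio_bp))
[/code]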


