Properly mixing color
Astroboy
post Mar 22 2016, 03:55 AM
Post #1


Junior Member
**

Group: Members
Posts: 38
Joined: 27-August 14
From: Private island on Titan
Member No.: 7250



Hey gang, I finally discovered an easy way to align images of various celestial targets taken at slightly different times! I used a plugin for ImageJ called bUnwarpJ. I'm really pleased with most of the results I've gotten so far. I've noticed that the receding limb sometimes doesn't warp as well as the approaching limb, though. That's something I still need to figure out, along with mosaicking and the issue I'm about to bring up.

Anyway, to celebrate performing this pretty rudimentary image processing task for the first time, I tried aligning Dawn images taken through filters 2, 7 and 8 (corresponding roughly to green, red and blue) using .img files I had converted to .gif through NasaView. The thing is, my resulting color images keep coming out looking incredibly garish. One attempt ended up with a bright pink Ceres.

I've noticed that a lot of raw NASA images, calibrated or uncalibrated, vary wildly in brightness, even across images taken through the same filter with the same exposure. How am I supposed to rectify this, or even begin to know what to do? This has been driving me crazy for a while. I know there's no such thing as "true" color, that color is subjective, etc., but I've always assumed that a given combination of three frames taken through three different filters can be mixed in a specific way that is scientifically useful, predictable, and proper.

I guess what I'm asking is, how do you guys decide on how much of each filter to use when mixing color?
JRehling
post Mar 22 2016, 06:49 PM
Post #2


Senior Member
****

Group: Members
Posts: 1949
Joined: 20-April 05
Member No.: 321



Amateur astrophotographers now routinely use de-rotation software for imaging Jupiter. Jupiter rotates so quickly that it's hard to gather enough imaging data before the rotation becomes significant. I haven't used this myself.

Images taken from spacecraft (e.g., Ceres from Dawn) face a considerably greater challenge, due to the motion of the spacecraft itself.

"Scientifically useful" and "proper" may be contradictory things. Ultraviolet images of Venus are scientifically useful but are inherently unlike anything a human can see with their eyes.

There really is no objective true color, and that's not just a minor nitpick. It varies dramatically based on the lighting, etc. Almost all space images of solar system objects (from Mercury to Saturn) appear on a monitor much dimmer than the world would appear in direct sunlight. That inherently changes the color a great deal. Deep sky objects, on the other hand, are almost always displayed on a monitor much brighter than they would appear if seen with the naked eye.

The human eye has three broadband color sensors. Many space camera systems have narrowband filters corresponding to the middle of the R, G, and B ranges, and those can approximate human RGB color, but can never be guaranteed to capture it correctly. For example, a sodium yellow line is, in principle, completely invisible to RGB narrowband filters, while the human eye and cameras with broad RGB can see it just fine.

If an image seems too garish to you, you can adjust the saturation.

In summary, there is no validity, in digital imaging, to Keats' lines:

"Beauty is truth, truth beauty," that is all
Ye know on earth, and all ye need to know.
Astroboy
post Mar 22 2016, 09:09 PM
Post #3


Junior Member
**

Group: Members
Posts: 38
Joined: 27-August 14
From: Private island on Titan
Member No.: 7250



Once again, I know there's no valid color in digital imaging. What I'm asking is this: when you have a repeating series of images of a target taken through multiple filters, each filter always at the same exposure, and you want to combine three differently filtered images into one color image, what do you do when images taken through the same filter have wildly inconsistent brightness levels that can't be explained by varying exposure? Are there any references for how a frame taken through a certain filter at a certain exposure should look, without these brightness fluctuations? I want to know how you choose the specific level of each filter when processing images, whether or not they include ultraviolet or infrared or whatever. I'm not trying to duplicate the exact color my eye would see; I just want to be like you guys.
mcaplinger
post Mar 22 2016, 10:12 PM
Post #4


Senior Member
****

Group: Members
Posts: 1572
Joined: 13-September 05
Member No.: 497



QUOTE (Astroboy @ Mar 22 2016, 01:09 PM) *
what do you do when images taken through the same filter have extremely inconsistent brightness levels that can't be explained by varying exposure?

If the target illumination conditions (solar distance, phase, etc.) are the same, then the only way the images can have different levels is if there's something else going on, like different gain settings (what a typical consumer digital camera calls "ISO"). You have to know all of these settings to get images that can be directly compared.

Our cameras on MRO, LRO, MSL and Juno don't have gain settings, but some of our other cameras do (e.g. our cameras on OREx).

The point of radiometrically corrected PDS products is to remove such variations by appropriate scaling.
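A minimal sketch of the idea above: before comparing two frames, divide out exposure time and gain so both are on the same scale. The linear-response model and all the numbers here are illustrative assumptions, not actual instrument calibration values.

```python
def normalize_dn(dn, exposure_s, gain):
    """Convert a raw DN to DN per second at unit gain (assumes a linear detector)."""
    return dn / (exposure_s * gain)

# Two frames of the same scene that *look* different...
frame_a = normalize_dn(1000, exposure_s=0.5, gain=1.0)   # -> 2000.0
frame_b = normalize_dn(1000, exposure_s=0.25, gain=2.0)  # -> 2000.0
# ...agree once exposure and gain are divided out.
```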


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
scalbers
post Mar 22 2016, 10:36 PM
Post #5


Senior Member
****

Group: Members
Posts: 1252
Joined: 5-March 05
From: Boulder, CO
Member No.: 184



In case you want to consider perceived color, here are the steps I like to apply to full-color synthetic (rendered) images. Assuming we start with a linear spectral radiance scaling, we can do the following:

1) Convolve the spectral radiance with the tristimulus color functions

2) Apply the 3x3 transfer matrix that puts the XYZ image into the RGB color space of the computer monitor

3) Include a gamma correction to match the monitor brightness scaling

Strictly speaking this gives perceived color, but it should come pretty close to "actual" color, which I define as follows: if you were sitting (floating) out in space holding your computer monitor side by side with the object of interest, the two would match.
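Steps 2 and 3 can be sketched concretely. The 3x3 matrix and transfer curve below are the standard sRGB (D65) ones; step 1 (convolving the spectral radiance with the CIE tristimulus functions) is assumed already done, so the function takes XYZ directly.

```python
def xyz_to_srgb(x, y, z):
    """XYZ -> displayable sRGB: linear matrix transform, then the sRGB 'gamma'."""
    # 3x3 transfer matrix from the sRGB specification (D65 white point)
    r = 3.2406 * x - 1.5372 * y - 0.4986 * z
    g = -0.9689 * x + 1.8758 * y + 0.0415 * z
    b = 0.0557 * x - 0.2040 * y + 1.0570 * z

    def encode(c):
        # piecewise sRGB transfer function; clip to the displayable range first
        c = min(max(c, 0.0), 1.0)
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    return tuple(encode(c) for c in (r, g, b))

# Sanity check: the D65 white point (X, Y, Z) ~ (0.9505, 1.0, 1.089)
# should come out very close to pure white (1, 1, 1).
white = xyz_to_srgb(0.9505, 1.0, 1.089)
```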


elakdawalla
post Mar 23 2016, 02:54 PM
Post #6


Administrator
****

Group: Admin
Posts: 5022
Joined: 4-August 05
From: Pasadena, CA, USA, Earth
Member No.: 454



For a less exact but very fast approach: I make a color combination of the R, G, B images, then compare its levels to a color photo processed by one of the masters here (like ugordan for Saturn targets, or machi for small bodies). I adjust the levels until my image's histogram looks similar to the example image's, setting the maximum values of the R, G, and B channels to be near those of the brightest color in the example image.
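The channel-max matching described above can be sketched in a few lines: scale each of R, G, B so its brightest value lands near the corresponding channel maximum of a trusted reference image. The pixel values here are invented for illustration.

```python
def match_channel_max(channel, ref_max):
    """Linearly scale one channel so its maximum equals ref_max."""
    peak = max(channel)
    if peak == 0:
        return list(channel)  # an all-black channel has nothing to scale
    scale = ref_max / peak
    return [v * scale for v in channel]

red = [10, 40, 80]                                  # our red channel, too dim
red_matched = match_channel_max(red, ref_max=200)   # -> [25.0, 100.0, 200.0]
```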


--------------------
My blog - @elakdawalla on Twitter - Please support unmannedspaceflight.com by donating here.
Bjorn Jonsson
post Mar 23 2016, 03:24 PM
Post #7


IMG to PNG GOD
****

Group: Moderator
Posts: 1863
Joined: 19-February 04
From: Near fire and ice
Member No.: 38



QUOTE (Astroboy @ Mar 22 2016, 03:55 AM) *
Anyway, to celebrate performing this pretty rudimentary image processing task for the first time, I tried aligning Dawn images taken through filters 2, 7 and 8 (corresponding roughly to green, red and blue) using .img files I had converted to .gif through NasaView. The thing is, my resulting color images keep coming out looking incredibly garish. One attempt ended up with a bright pink Ceres. I've noticed a lot of raw NASA images, calibrated or uncalibrated, vary incredibly wildly with respect to brightness, and even across images taken through the same filter with the same exposure.


One thing that may contribute to the problem you are having is that you converted the Dawn .IMG files to GIF. The Dawn images are 16-bit images, whereas GIF is 8-bit. This means your GIFs are either going to be very dark (since the maximum intensity level in the Dawn IMG files is much lower than 65535) and include only a few shades of dark gray, or they are automatically contrast stretched, which can mess up the color (for one thing, even images with the same filter and exposure usually wouldn't be contrast stretched using identical parameters). So you really should convert 16-bit IMG files to a format that supports 16 bits/pixel. PNG does this, whereas GIF's maximum bit depth is 8. IMG2PNG can convert the Dawn IMG files to 16-bit PNGs.
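A small sketch of why per-image auto-stretch wrecks the color: if you must go to 8 bits, scale all three filter images with ONE shared (lo, hi) window instead of stretching each image to its own min/max. The 16-bit DN values below are made up for illustration.

```python
def to_8bit(pixels, lo, hi):
    """Map the 16-bit range [lo, hi] linearly onto 0..255, clipping outside."""
    span = hi - lo
    out = []
    for v in pixels:
        c = min(max(v, lo), hi)          # clip to the chosen window
        out.append(round(255 * (c - lo) / span))
    return out

green = [1000, 3000, 5000]
red = [1000, 2000, 3000]
# One shared window keeps the filters comparable to each other:
g8 = to_8bit(green, 0, 5000)   # -> [51, 153, 255]
r8 = to_8bit(red, 0, 5000)     # -> [51, 102, 153]
```

Auto-stretching each image separately would have mapped both lists onto the full 0..255 range, destroying the brightness ratio between the filters.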
JohnVV
post Mar 23 2016, 06:44 PM
Post #8


Member
***

Group: Members
Posts: 816
Joined: 18-November 08
Member No.: 4489



besides the 8 bit (256 tone) vs. 16 bit (65536 tone) issue Bjorn posted,

NASAView normalizes the images:
it auto-stretches from -5% to +105%
so the top and bottom 5% of tones are turned black (0) or white (255)

then there is the issue of the 8 bit image depth

then the camera calibration needs to be looked at:
each CCD array and filter has a set of calibration images that need applying

for example, for Dawn, the flat images:
http://sbn.psi.edu/archive/dawn/fc/DWNCALFC2/DATA/FLATS/
"FC2_F1_FLAT_V02.IMG"
-- an 8 bit normalized copy of the 32 bit float
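Flat-field correction itself is just a per-pixel division. A minimal sketch, with invented values and assuming the flat is already normalized to ~1 (real pipelines also subtract dark/bias frames first):

```python
def apply_flat(raw, flat):
    """Divide out pixel-to-pixel sensitivity variation recorded in the flat."""
    return [r / f for r, f in zip(raw, flat)]

raw = [50.0, 200.0, 100.0]    # a uniform scene seen through uneven optics
flat = [0.5, 2.0, 1.0]        # the same unevenness, from a flat-field frame
corrected = apply_flat(raw, flat)   # -> [100.0, 100.0, 100.0]
```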



now, converting DIGITAL R, G, B black-and-white images into a single color image is WAY easier than it used to be with FILM
(i used to do dye transfers and color separations in the darkroom)

a LOT of math needs to go into it
or
use the non-scientific, artistic approach and have it "look good"


read up on the MSL, Opportunity, and Spirit "sundial" calibration-target color processing
or the Viking color chart processing (and the blue sky in the first image)
JRehling
post Mar 25 2016, 04:11 PM
Post #9


Senior Member
****

Group: Members
Posts: 1949
Joined: 20-April 05
Member No.: 321



QUOTE (elakdawalla @ Mar 23 2016, 07:54 AM) *
For a less exact but very fast approach, I make a color combination of R, G, B images and then compare its levels to a color photo processed by one of the masters here (like ugordan for Saturn targets, machi for small bodies), and I adjust the levels to make my image's histogram look similar to the example image by setting the maximum values of R, G, and B channels to be near those for the brightest color in the example image.


I echo this; one useful addition may be to downsample both images to eliminate pixel-to-pixel variation and use the color of the new, coarser pixels as a specific value. Then I make a table comparing, say, the RGB of Jupiter's equatorial band as a whole. This allows me to plan a conversion with very specific values. Then I use one relatively dark value of each channel and one relatively bright value of each to create a six-point mapping between the two.

That said, the approach can fail if the overall brightness of the images varies by a lot. You can end up with insufficient sampling on the dark or bright side of the distribution to map to the "master" image without making it look grainy. But in that case, you're trying to solve an unsolvable problem.
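The six-point mapping can be sketched as three per-channel linear maps, each through two anchor points (one dark, one bright), so three channels give six points in total. The sample values below are invented for illustration.

```python
def two_point_map(v, src_dark, src_bright, ref_dark, ref_bright):
    """Linear map sending src_dark -> ref_dark and src_bright -> ref_bright."""
    t = (v - src_dark) / (src_bright - src_dark)
    return ref_dark + t * (ref_bright - ref_dark)

# Red channel anchors: our dark/bright patches read 20/220 where the
# master image's corresponding patches read 10/240.
mapped = [two_point_map(v, 20, 220, 10, 240) for v in (20, 120, 220)]
# -> [10.0, 125.0, 240.0]
# Repeat with separate anchors for the G and B channels.
```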
Astroboy
post Apr 26 2016, 10:04 PM
Post #10


Junior Member
**

Group: Members
Posts: 38
Joined: 27-August 14
From: Private island on Titan
Member No.: 7250



QUOTE (mcaplinger @ Mar 22 2016, 11:12 PM) *
If the target illumination conditions (solar distance, phase, etc) are the same, then the only way the images can have different levels is if there's something else going on, like different gain settings (what a typical consumer digital camera calls "ISO".) You have to know all of these settings to get images that can be directly compared.

Our cameras on MRO, LRO, MSL and Juno don't have gain settings, but some of our other cameras do (e.g. our cameras on OREx).

The point of some radiometrically-corrected PDS products is to have images that have such variations removed somehow by appropriate scaling.


What if even the calibrated files have this varying brightness issue? The calibrated Vesta .fits images, for example.

I've been trying to convert IMG to PNG on ImageJ using this plugin called PDS Reader, but I keep getting a "this doesn't appear to be a PDS file" error message. Please help.
JohnVV
post Apr 26 2016, 11:01 PM
Post #11


Member
***

Group: Members
Posts: 816
Joined: 18-November 08
Member No.: 4489



if they are fits images then they are NOT pds img (img/lbl) files
fit or fits files are mainly used in astronomy, and most editors can work with them

for fit images i use:
Nip2 -- handles 32 and 16 bit images
Gmic (terminal tool) -- handles 32 and 16 bit images
Gimp 2.9.3 (DEVELOPMENT) -- handles 32 and 16 bit images
and sometimes GDAL

i take it you are referring to these:
http://sbn.psi.edu/archive/dawn/fc/DWNVFC2_1A/DATA/
a fit example:
http://sbn.psi.edu/archive/dawn/fc/DWNVFC2...38030914F1H.FIT
(fit opens in gmic -- handy for checking images from the web browser)
-- note it's upside down: the fit format starts at the bottom row and reads UP

the data in the fit images are 16 bit SIGNED integers (-32768 to +32767)

if the java ImageJ is normalizing them then they will be all over the place
png does NOT support 16 bit signed data, just 16 bit unsigned (0 to 65535)

the img version:
http://sbn.psi.edu/archive/dawn/fc/DWNVFC2...38030914F1H.IMG
opens just fine in isis3 and is imported
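The signed-to-unsigned issue has a standard fix: FITS stores unsigned 16-bit data as signed integers plus an offset (the BZERO = 32768 convention), so adding the offset back recovers values that fit PNG's unsigned 16-bit range. A minimal sketch:

```python
def fits_int16_to_uint16(v):
    """Map a signed 16-bit stored value to unsigned via the BZERO=32768 convention."""
    return v + 32768

fits_int16_to_uint16(-32768)  # -> 0
fits_int16_to_uint16(0)       # -> 32768
fits_int16_to_uint16(32767)   # -> 65535
```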
Astroboy
post Jun 4 2016, 12:02 AM
Post #12


Junior Member
**

Group: Members
Posts: 38
Joined: 27-August 14
From: Private island on Titan
Member No.: 7250



Less than 24 hours ago I had no idea how to write command lines... but after seven continuous hours last night and five continuous hours today of caffeine, cursing and sifting through forum posts, I finally made my first NASA image with good color from images that weren't contrast stretched~~~!!!!!!!!!!11111



Thank you QFitsView, ISIS3 and everyone in the ISIS community!
JohnVV
post Jun 4 2016, 12:12 AM
Post #13


Member
***

Group: Members
Posts: 816
Joined: 18-November 08
Member No.: 4489



most things i work on use typing into the terminal at some point

this is not new -- back in the 80's this was the norm, when i started using these infernal things (and 5 in. floppies)
Astroboy
post Jun 4 2016, 12:30 AM
Post #14


Junior Member
**

Group: Members
Posts: 38
Joined: 27-August 14
From: Private island on Titan
Member No.: 7250



Yep... I was pretty much determined to install and attempt to use ISIS at all costs last night, and I did that, and then I found a way to get .pngs that hadn't been contrast stretched this afternoon. There really needs to be easy-to-use software that poops out entire folders of unstretched .pngs, for people like me who have spent the past three years trying to do this. But honestly, after getting ISIS running, I feel pretty content with it. I just wish there was a way to batch convert entire folders of images into cubes, and entire folders of cubes into unstretched .pngs.
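The batch wish is mostly a loop over filenames. A minimal sketch that builds the command lines for a whole folder -- dawnfc2isis and isis2std are real ISIS3 programs, but treat the parameter lists here as assumptions to check against your install; this only constructs the commands rather than running them:

```python
from pathlib import Path

def batch_commands(folder):
    """Build IMG->cub->PNG command lines for every .IMG file in a folder."""
    cmds = []
    for img in sorted(Path(folder).glob("*.IMG")):
        cub = img.with_suffix(".cub")
        png = img.with_suffix(".png")
        cmds.append(f"dawnfc2isis from={img} to={cub}")
        cmds.append(f"isis2std from={cub} to={png} format=png")
    return cmds

# Print the commands (or hand them to subprocess.run, split into arguments):
for cmd in batch_commands("."):
    print(cmd)
```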
JohnVV
post Jun 4 2016, 12:58 AM
Post #15


Member
***

Group: Members
Posts: 816
Joined: 18-November 08
Member No.: 4489



if you need help with isis3 (i am assuming CentOS 6.8, the free version of RHEL 6.8),

pm me or send me an email

i find that mixing isis3, gdal, and gmic makes a good tool set


you may or may not be aware of one of the cool, "mostly" undocumented options:
convert a cub to a raw AND keep all the info, so you can move it back to a cub after working on it

CODE
cubeatt from=image.cub to=image.raw+BSQ+detached

this gives you a raw image
and an lbl file with the image info -- rows and columns

CODE
gmic image.raw,1024,1024 -o image.tiff

converts it to a 32 bit tiff ( the isis default raw is 32 bit float )

edit the image and work on it

then import it back into isis3:
convert the tiff to a 32 bit raw

CODE
gmic image1.tiff -o image1.raw

then edit the name of the raw listed in the lbl file to point to the NEW raw image

CODE
cubeatt from=image.lbl to=NewImage.cub
