MSL Images & Cameras, technical discussions of images, image processing and cameras |
Jan 15 2013, 07:34 PM
Post
#271
|
|
Member Group: Members Posts: 161 Joined: 12-August 12 From: Hillsborough, NJ Member No.: 6546 |
Why aren't Mastcam images white-balanced like the MAHLI ones? Is it to do with science or technical issues? I love MAHLI images for how natural and pleasing to the eye they look. It would be great if Mastcam images came like this, instead of being tweaked on the ground with not always the best results. Who the heck would want the images white-balanced by default anyway? Besides, a raw image isn't called a "raw image" for nothing. -------------------- |
|
|
Jan 15 2013, 07:37 PM
Post
#272
|
|
Member Group: Members Posts: 238 Joined: 15-January 13 Member No.: 6842 |
I don't know what you are reacting to. None of the cameras do any kind of white balance processing internally. There may be some minor color differences because of the different glasses in the optics and the slightly different bandpass filter between Mastcam and MAHLI.

From what I've read, MAHLI's images are white-balanced internally. It uses a calibration target mounted on the rover for this. http://msl-scicorner.jpl.nasa.gov/Instruments/MAHLI/

"MSL carries the MAHLI Flight Calibration Target for color/white balance, resolution and focus checks, and verification of UV LED functionality. The target will be mounted in a vertical position on the rover (i.e., vertical when the rover is on a surface with a slope of 0°) to help prevent dust accumulation."

Looking at the raw images, it's also obvious that MAHLI images come already white-balanced, in contrast to Mastcam images. A good example is the white surface of the rover; in Mastcam images it has an ever-present yellow/orange cast, while in MAHLI images it's white.

Examples: Mastcam: http://mars.jpl.nasa.gov/msl/multimedia/ra..._DXXX&s=106 MAHLI: http://mars.jpl.nasa.gov/msl/multimedia/ra...1_DXXX&s=85 -------------------- Curiosity rover panoramas: http://www.facebook.com/CuriosityRoverPanoramas
My Photosynth panoramas: http://photosynth.net/userprofilepage.aspx...;content=Synths |
|
|
Jan 15 2013, 07:46 PM
Post
#273
|
|
Senior Member Group: Members Posts: 2511 Joined: 13-September 05 Member No.: 497 |
From what I've read, MAHLI's images are white-balanced internally.

I'm on the team that built MAHLI, and I can assure you there is no internal white-balancing. The cal target can be used to do white balance, or it could be if it wasn't covered with dust from the landing. If there's a perceptible difference in white balance between the cameras (I just looked at a few images, and it's hard to say for sure since the imaging geometries aren't very similar), it probably has to do with the slightly different cut-on wavelength of the bandpass filter. -------------------- Disclaimer: This post is based on public information only. Any opinions are my own.
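For anyone curious what white-balancing off a cal target involves in practice, here's a minimal Python/NumPy sketch. This is not the actual MSL ground pipeline; the function name, toy image, and patch selection are all invented for illustration. The idea is just to scale each channel so a known neutral (gray/white) patch averages to the same value in R, G, and B.

```python
import numpy as np

def white_balance_from_patch(img, patch_rows, patch_cols):
    """Scale each channel so a known neutral (gray/white) patch
    averages to the same value in R, G, and B.

    img        -- float RGB array, shape (H, W, 3), values in [0, 1]
    patch_rows, patch_cols -- slices covering the neutral target
    """
    # Per-channel mean over the reference patch
    means = img[patch_rows, patch_cols].reshape(-1, 3).mean(axis=0)
    # Gains that equalize the three channels
    gains = means.mean() / means
    return np.clip(img * gains, 0.0, 1.0)

# Toy frame with a warm cast, as if the whole frame were the target's gray patch:
img = np.full((4, 4, 3), [0.8, 0.7, 0.5])   # "white" reads yellow/orange
balanced = white_balance_from_patch(img, slice(None), slice(None))
# After balancing, all three channels of the patch read the same gray.
```

Real calibration would of course also have to deal with dust on the target, as noted above.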
|
|
|
Feb 6 2013, 12:42 PM
Post
#274
|
|
Senior Member Group: Members Posts: 1465 Joined: 9-February 04 From: Columbus OH USA Member No.: 13 |
For the multi-spectral sequences, does anyone know the probable order of filters used?
This document lists the filters in a certain order: MASTCAM MULTISPECTRAL IMAGING ...which might be taken to be the preferred order, but how about the sequences where only 4 or 6 filters are used? For example, this ML sequence of four from sol 173: Sol 173 ML. It's interesting how the mast shadow essentially disappears in the two later frames - which would be logical if those filters were toward the infrared? -------------------- |
|
|
Feb 6 2013, 02:15 PM
Post
#275
|
|
Senior Member Group: Members Posts: 2511 Joined: 13-September 05 Member No.: 497 |
For the multi-spectral sequences, does anyone know the probable order of filters used?

You could possibly figure it out from the time stamps; it takes twice as long to go from filter N to N+2 as from filter N to N+1. As I've noted previously, the thumbnail tint can be used for any filter with different visible throughput, and the infrared filters can maybe be told apart from timing or from the amount of dark current, since the exposure times go up as you get farther into the IR. Then perhaps patterns would emerge as to how they're using the filters; as you observe, there seems to have been some evolution in which ones are used. -------------------- Disclaimer: This post is based on public information only. Any opinions are my own.
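As a toy illustration of the timestamp idea (the numbers below are invented, not real Mastcam telemetry, and exposure/readout overhead is ignored), you could estimate how many filter-wheel positions were crossed between frames from the inter-frame gaps:

```python
# Hypothetical frame times (seconds) for a 4-frame multispectral sequence.
frame_times = [0.0, 5.0, 10.0, 20.0]

# Gap between consecutive frames. If the wheel moves at a constant rate,
# the gap is roughly proportional to the number of filter positions crossed.
gaps = [b - a for a, b in zip(frame_times, frame_times[1:])]

one_step = min(gaps)                          # assume the smallest gap is a 1-position move
steps = [round(g / one_step) for g in gaps]   # estimated positions moved per transition
# Here the last transition took twice as long, suggesting a 2-position move.
```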
|
|
|
Feb 6 2013, 05:00 PM
Post
#276
|
|
Administrator Group: Admin Posts: 5172 Joined: 4-August 05 From: Pasadena, CA, USA, Earth Member No.: 454 |
We're very close to the date of the first expected PDS release. If patterns aren't obvious from the raws, comparisons between raws and PDS-released images might provide clues that you could generalize from, going forward.
-------------------- My website - My Patreon - @elakdawalla on Twitter - Please support unmannedspaceflight.com by donating here.
|
|
|
Feb 7 2013, 03:07 AM
Post
#277
|
|
Senior Member Group: Members Posts: 2228 Joined: 1-December 04 From: Marble Falls, Texas, USA Member No.: 116 |
-------------------- ...Tom
I'm not a Space Fan, I'm a Space Exploration Enthusiast. |
|
|
Feb 7 2013, 02:43 PM
Post
#278
|
|
Member Group: Members Posts: 219 Joined: 14-November 11 From: Washington, DC Member No.: 6237 |
Per http://pds-geosciences.wustl.edu/missions/msl/index.htm - "MSL Analyst's Notebook - Coming with Release 1. Provides search, display, and download tools for MSL data sets."
Subsequent releases will be every 90 days after Feb 27th initial release of sols 0-90. The imaging node release dates are here: http://pds-imaging.jpl.nasa.gov/schedules/msl_release.html - Release 1 EDRs on same date as above, RDRs on Mar 20th. |
|
|
Mar 3 2013, 10:29 PM
Post
#279
|
|
Member Group: Members Posts: 222 Joined: 7-August 12 From: Garberville, CA Member No.: 6500 |
Answering a question from this thread...
Ed, I don't know how you do it, but those anaglyph panoramas are once again simply fantastic - they are so sharp and the depth perception is easy to see across the entire panorama. In 3D the terrain becomes so much easier to explore, and your images add tremendous value. Do you generate the MR/ML panos independently and then merge them, or do you create them as multiple layers in one pano, sharing control points for some fixed depth, and then separate them for the anaglyph? Or something else, such as magic? Airbag

In general, I usually do the following for the big anaglyphs...

~ First, I run raw MC100's and MC34's through the appropriate Photoshop automation (a manually recorded preset for the image size in question) to carefully remove all the lens "schmutz" from each set (MC100 has 2 big ones and MC34 has now developed 3 smaller ones) so there aren't any distracting "floaters" in the final anaglyph. They're very distracting and break the immersion.

~ I then mosaic each separately with as close to the same projection as possible (varies depending on the pano).

~ Aligning them is done manually by eye in Photoshop by dragging the left channel over the right, resizing it, and aligning it by eye with transparency set at 50%. Before splitting the channel colors I usually tweak the levels for a bit brighter illumination without saturating the whites; some anaglyphs tend to come out rather dark otherwise.

~ The crucial (and tricky) part is getting all the matching L & R objects in the matching channels to align along the same horizontal plane for the least eye strain. Even with a near-identical initial projection for each, it usually requires a bit of careful warping in several areas. That's the most time- (and CPU-) consuming part.

~ And lastly, by aligning the horizontal offset so the focal point (where the channels have little to no offset) is biased more toward the immediate foreground than the distance, eye strain is kept in check, because the eye seems to naturally prefer that the left channel falls to the left of the right channel. Below that focal point the channels reverse, which is harder on the eye. That said, the reason not to put the focal point at the bottom of the frame is that the top offsets then often become far too wide to pull together at high zooms, so a compromise focal point must be determined. This will vary depending on the field depth between top and bottom, of course.

If all works right... it comes out looking good! -------------------- "We shall not cease from exploration, and the end of all our exploring will be to arrive where we started and know the place for the first time." -T.S. Eliot
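For anyone who wants to script the final channel-merge step rather than do it in Photoshop, here's a minimal red-cyan anaglyph sketch in Python/NumPy. It's a simplified stand-in for the workflow above: the single uniform horizontal shift is a crude substitute for the careful per-region warping described, and the function name and toy frames are made up for illustration.

```python
import numpy as np

def make_anaglyph(left, right, shift=0):
    """Merge aligned left/right RGB frames into a red-cyan anaglyph:
    red channel from the left eye, green and blue from the right eye.

    shift -- uniform horizontal offset (pixels) applied to the left frame,
             a crude way to place the zero-parallax 'focal point'.
    """
    left = np.roll(left, shift, axis=1)   # uniform horizontal alignment only
    out = right.copy()
    out[..., 0] = left[..., 0]            # take red from the left eye
    return out                            # green + blue stay from the right eye

# Toy frames: a pure-red "left eye" and a pure-blue "right eye"
left = np.zeros((2, 4, 3));  left[..., 0] = 1.0
right = np.zeros((2, 4, 3)); right[..., 2] = 1.0
ana = make_anaglyph(left, right)
```

In practice the hard part is exactly what's described above - getting matching features onto the same horizontal plane - which no single global shift can do.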
|
|
|
Mar 4 2013, 01:08 AM
Post
#280
|
|
Senior Member Group: Members Posts: 2511 Joined: 13-September 05 Member No.: 497 |
~ First, I run raw MC100's and MC34's through the appropriate (manually recorded preset for image size in question) Photoshop automation to carefully remove all the lens "schmutz" from each set (MC100 has 2 big ones and MC34 has now developed 3 smaller ones)

Do you really have evidence that the 34mm has defects that weren't there before? I don't, but I haven't been looking very hard for several weeks. -------------------- Disclaimer: This post is based on public information only. Any opinions are my own.
|
|
|
Mar 4 2013, 05:22 AM
Post
#281
|
|
Member Group: Members Posts: 222 Joined: 7-August 12 From: Garberville, CA Member No.: 6500 |
Mike, I first noticed the three 34mm spots a few weeks ago when proofing a large anaglyph. They're hard to see in 2D against the surface because of the jumbled noise in the terrain, but they pop out subtly as "floaters" when viewed in stereo if not cleaned before anaglyphing. The four-frame animated GIF below (from the raw Sol 184 set) points out these three culprits against the relative smoothness of an out-of-focus robotic arm...

Your post made me curious as to how long they've actually been there, and to my surprise it looks like they've been present since the first good 34mm shots came down on Sol 3. You can see them in the same positions (as in the GIF above) at the center of this full frame series from Sol 3: http://mars.jpl.nasa.gov/msl-raw-images/ms...5000E1_DXXX.jpg. Now knowing where to look, they're especially easy to spot if one flips rapidly through any large 34mm set in an image viewer - and yup, they're there. -------------------- "We shall not cease from exploration, and the end of all our exploring will be to arrive where we started and know the place for the first time." -T.S. Eliot
|
|
|
Mar 4 2013, 06:39 AM
Post
#282
|
|
Senior Member Group: Members Posts: 2511 Joined: 13-September 05 Member No.: 497 |
Your post made me curious as to how long they've actually been there, and to my surprise it looks like they've been present since the first good 34mm shots came down on Sol 3.

This fits my expectations. These are not on the lens but on the focal plane, and they have been there since we buttoned up the optics during final assembly. -------------------- Disclaimer: This post is based on public information only. Any opinions are my own.
|
|
|
Mar 4 2013, 03:55 PM
Post
#283
|
|
Senior Member Group: Members Posts: 4246 Joined: 17-January 05 Member No.: 152 |
MC34 has now developed 3 smaller ones

There are many more than three visible on MC34 - in a quick count I found at least ten on the sol 24 images. They're easiest to see in the sky shots, but their visibility depends strongly on the filter. For example, compare these two filtered sky frames: http://mars.jpl.nasa.gov/msl-raw-images/ms...2000D1_DXXX.jpg http://mars.jpl.nasa.gov/msl-raw-images/ms...1000D1_DXXX.jpg

I think that makes sense, since the visibility in a filtered image will depend on the position of the speck with respect to the Bayer pattern. So their visibility in a filtered sky image could be greater than in a full-colour sky image, which is some weighted average over the four subpixels. |
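A toy numerical illustration of that dilution effect (the values are made up, assuming for simplicity that the speck shadows exactly one site of a 2x2 Bayer cell):

```python
import numpy as np

# One 2x2 Bayer cell (R, G, G, B sites) of normalized sky brightness
clean = np.array([1.0, 1.0, 1.0, 1.0])
speck = np.array([0.5, 1.0, 1.0, 1.0])   # speck shadows only the R site

# In a frame built from the red sites alone, the speck shows full contrast;
# in a colour value averaged over the whole cell, the contrast is diluted.
red_frame_contrast = 1.0 - speck[0] / clean[0]       # 0.5
color_contrast = 1.0 - speck.mean() / clean.mean()   # 0.125
```

So the same speck that dims a single-filter frame by 50% only dims the averaged full-colour value by 12.5%, consistent with specks being much easier to spot in some filtered frames than others.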
|
|
May 10 2013, 09:50 PM
Post
#284
|
|
Junior Member Group: Members Posts: 53 Joined: 15-July 09 Member No.: 4867 |
Does anyone know the schedule for the MARDI image release? I mean the raw images, not the lossy-compressed ones.
And could you tell me if the EDL sensor suite raw data (IMU, radar, etc.) that were stored onboard have been downloaded and, if so, whether they are available to the public? Thanks. |
|
|
Jun 3 2013, 06:24 PM
Post
#285
|
|
Newbie Group: Members Posts: 3 Joined: 13-January 12 Member No.: 6312 |
Just a thought... I saw this picture today: http://mars.jpl.nasa.gov/msl/multimedia/ra..._DXXX&s=293
MAHLI took that at night with its LEDs, if I'm not mistaken... which brought up the thought: how cool would it be for Curiosity to take a self-portrait at night! (like this, at night: http://www.nasa.gov/mission_pages/msl/mult...;/pia16239.html) Is this possible? -------------------- |
|
|