MSL Images & Cameras, technical discussions of images, image processing and cameras
mcaplinger
post Apr 20 2016, 08:45 PM
Post #511


Senior Member
****

Group: Members
Posts: 2504
Joined: 13-September 05
Member No.: 497



QUOTE (Herobrine @ Apr 20 2016, 12:27 PM) *
Well, really, we've roughly removed shading due to lighting conditions, and only for the pixels that had associated spatial data.

If you're happy I'm happy, but it seems like an awful lot of work for something with not a lot of practical utility to me.

You only need the optical depth if it's changed significantly between the images you're trying to compare; otherwise, since you are only computing a relative albedo, it doesn't matter how it might be scaled. And there are all kinds of secondary effects from the brightness distribution of the sky and, in the near field, from light reflecting off the rover, that I think you're ignoring. Though I'm not sure how large those effects are.

As to how you compute the optical depth in the first place -- obviously you start by looking at the levels in the sun images and normalizing them to absolute radiance units. I have to confess that I glazed over a bit while reading the discussion of what happens next in, e.g., http://arxiv.org/abs/1403.4234 -- and a lot of the described MER processing is unneeded for MSL, or at least different.


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
fredk
post Apr 20 2016, 09:56 PM
Post #512


Senior Member
****

Group: Members
Posts: 4246
Joined: 17-January 05
Member No.: 152



QUOTE (Herobrine @ Apr 20 2016, 09:27 PM) *
Assuming... a local mean albedo of 0.2...
Dividing the observed radiance by the calculated approximate total incident solar irradiance gives me the albedo

Sorry, I haven't followed your procedure in complete detail, but if you've assumed the same mean albedo for the two frames to begin with, then the fact that the brightnesses of the two final frames in your post agree so well doesn't sound surprising.

But knowing the angular size of the sun in the sky and the tau, and using the absolutely calibrated PDS images, you should be able to divide out the solar illumination to get the true albedo, no? Or is that what you've actually done?
scalbers
post Apr 20 2016, 10:52 PM
Post #513


Senior Member
****

Group: Members
Posts: 1621
Joined: 5-March 05
From: Boulder, CO
Member No.: 184



Interesting to see this type of analysis. One possible refinement to the consideration of "albedo" is to express it as a reflectance. The so-called Bidirectional Reflectance Distribution Function (BRDF), or Anisotropic Reflectance Factor (ARF), is a way to see how the reflectance varies depending on the view angle and its relation to the sun. For example, opposite the sun there is a bright spot on the ground where there is a higher reflectance. The angularly averaged value of the reflectance would equate back to the albedo. I've been gradually trying to develop some simple BRDF functions for various types of surfaces on the Earth. Maybe a sandy or dusty surface on Earth could serve as an analog for what we see on Mars?

The azimuth difference between the view direction and the sun is thus important: the ground will look darker when viewed toward an azimuth similar to the sun's, and brighter when viewed opposite the sun. This all becomes more important at larger zenith angles of illumination.
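To illustrate the qualitative behavior described above (brighter opposite the sun, darker toward the sun's azimuth, anisotropy growing with illumination zenith angle), here is a toy anisotropic reflectance factor. This is not a real Mars BRDF: the functional form and the `backscatter` strength are invented purely for the sketch.

```python
import math

def toy_arf(view_az_deg, sun_az_deg, sun_zenith_deg, backscatter=0.3):
    """Toy anisotropic reflectance factor, normalized so the angular mean is ~1.

    Brighter opposite the Sun, darker toward the Sun's azimuth, with the
    anisotropy growing as the Sun gets lower.  Illustrative only; not a
    fitted BRDF for any real surface.
    """
    rel_az = math.radians(view_az_deg - sun_az_deg)
    # Anisotropy strength grows with solar zenith angle (sun lower in the sky)
    strength = backscatter * math.sin(math.radians(sun_zenith_deg))
    # cos(rel_az) = +1 looking toward the Sun (darker), -1 opposite it (brighter)
    return 1.0 - strength * math.cos(rel_az)

# With the sun 60 degrees from zenith, the anti-solar direction is brighter
# than the solar direction; with the sun at zenith there is no anisotropy.
toward_sun = toy_arf(0, 0, 60)
away_from_sun = toy_arf(180, 0, 60)
```

A real model would also depend on the view zenith angle and include an opposition surge, but even this crude azimuth term captures the darker-toward-sun / brighter-away-from-sun asymmetry.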


--------------------
Steve [ my home page and planetary maps page ]
Herobrine
post Apr 21 2016, 08:11 PM
Post #514


Member
***

Group: Members
Posts: 244
Joined: 2-March 15
Member No.: 7408



QUOTE (mcaplinger @ Apr 20 2016, 03:45 PM) *
If you're happy I'm happy, but it seems like an awful lot of work for something with not a lot of practical utility to me.

I'm mostly interested in it for the prospect of removing some of the effects of the lighting conditions that existed when the image was acquired. I'm doing work with generating a color, 3-D environment from mountains of MSL data. I'd like to be able to light the scene myself so I can change the time of day, rather than being stuck with the lighting conditions that existed at the time an image was acquired, conditions that will not be consistent when combining data from different times.
QUOTE (mcaplinger @ Apr 20 2016, 03:45 PM) *
And there are all kinds of secondary effects from the brightness distribution of the sky, and in the near field, light reflecting off the rover, that I think you're ignoring. Though I'm not sure how large those effects are.

There are a lot of factors I'm ignoring. I'm guessing (read "hoping") that their effects are smaller than the error introduced by the less-than-ideal spatial data so I'm okay with ignoring them for now. For me, close is better than nowhere.
QUOTE (mcaplinger @ Apr 20 2016, 03:45 PM) *
I have to confess that I glazed over a bit while reading the discussion of what happens next in, e.g., http://arxiv.org/abs/1403.4234 -- and a lot of the described MER processing is unneeded for MSL, or at least different.

I don't know if I've missed that paper until now or if I came across it and glazed over as well and moved on, hoping for one specific to MSL. Either way, thank you for linking it; it does look like it contains the information I need. smile.gif

QUOTE (fredk @ Apr 20 2016, 04:56 PM) *
Sorry, I haven't followed your procedure in complete detail, but if you've assumed the same mean albedo for the two frames to begin with, then the fact that the brightnesses of the two final frames in your post agree so well doesn't sound surprizing.

I only use the "local mean albedo" when calculating the irradiance contribution reflected by the surrounding terrain. This affects how much light my calculations think vertical/slanted surfaces received. Increasing that value has the effect of making those vertical/slanted surfaces darker in my final image.

QUOTE (scalbers @ Apr 20 2016, 05:52 PM) *
Interesting to see this type of analysis. One possible refinement to the consideration of "albedo" is to express it as a reflectance. The so-called Bidirectional Reflectance Distribution Function (BRDF), or Anisotropic Reflectance Factor (ARF) is a way to see how the reflectance varies depends on the view angle and where this is compared to the sun.

One of the things I was hoping I might eventually be able to do with the output of this process was actually to try to generate a rough approximation of the BRDF of different points on the surface, if I had enough imagery containing that spot at different times of day. If I could come up with something even remotely close, I think it would enhance the realism of a realtime 3-D rendered scene if I used BRDFs generated by crunching data from multiple observations (rocks would look more rocky, dust more dusty, etc.). I don't know if I'll ever get to that point, though. I tend to work slowly.


I might as well just explain exactly what I did to make those images.
What I've written isn't exactly in a sharable, easily reusable state, but here's the basic process that produces that result. I'll use degrees for angles here.

CODE
First:
Define a constant for solar irradiance at 1 AU: { s_1AU = 1371 (W * m^-2) }
Define a unit vector for the local zenith in the site frame: { ZENITH = (0, 0, -1) }
Define a function (called "f()" below) of optical depth and zenith angle, that yields a normalized net irradiance factor. For my purposes, I wrote a 2-dimensional lookup table and a function that interpolates the values. My lookup table can be found at http://mc.herobrinesarmy.com/msl/normalized_net_irradiance_factor.csv. The first row lists solar zenith angle in degrees; the first column lists optical depths.

Then, per frame:
Define a local mean albedo (used to calculate the ground-reflected contribution to irradiance): { A = 0.2 }
Define the optical depth: { OD = 0.5 }
Read in MXY data (use this to exclude pixels containing rover structure)
Read in UVW data (called "normals" below).
Read in RAD data (called "values" below).
Read in metadata from LBL as follows: {
rover_quaternion = LBL>ROVER_COORD_SYSTEM_PARMS>ORIGIN_ROTATION_QUATERNION
solar_elevation = LBL>SITE_DERIVED_GEOMETRY_PARMS>SOLAR_ELEVATION
solar_azimuth = LBL>SITE_DERIVED_GEOMETRY_PARMS>SOLAR_AZIMUTH
radiance_offset = LBL>DERIVED_IMAGE_PARMS>MSL:RADIANCE_OFFSET
radiance_scaling_factor = LBL>DERIVED_IMAGE_PARMS>MSL:RADIANCE_SCALING_FACTOR
start_time = LBL>START_TIME
}
Calculate Julian date for start_time (too complicated to include here)
Calculate heliocentric distance for Mars in AU (called "r" below) using Julian date (too complicated to include here)
Calculate solar irradiance incident at top of Martian atmosphere: { s_toa = s_1AU / (r^2) }
Create unit vector toward Sun in site frame: {
sun_x = cos(-solar_elevation) * sin(solar_azimuth)
sun_y = cos(-solar_elevation) * cos(solar_azimuth)
sun_z = sin(-solar_elevation)
v_sun = (sun_x, sun_y, sun_z)
}
Calculate the solar zenith angle: { solar_zenith = 90 - solar_elevation }
Fetch net irradiance factor: { net = f(OD, solar_zenith) }

Then, per pixel:
Calculate the angle between the surface normal and the Sun: { a_s = normals(x,y).angleTo(v_sun) } Clamp it to no more than 90 degrees.
Calculate the angle between the surface normal and the zenith: { a_z = normals(x,y).angleTo(ZENITH) } Clamp it to no more than 180 degrees.
Calculate direct irradiance: { s_dir = s_toa * cos(a_s) * exp(-OD / cos(solar_zenith)) }
Calculate diffuse irradiance: { s_dif = s_toa * cos(solar_zenith) * cos^2(a_z / 2) * (net - exp(-OD / cos(solar_zenith))) }
Calculate surface-reflected irradiance: { s_gnd = s_toa * cos(solar_zenith) * A * sin^2(a_z / 2) * net }
Calculate total irradiance: { s = s_dir + s_dif + s_gnd }

Calculate observed radiance { observed = values(x,y) * radiance_scaling_factor + radiance_offset }
Calculate scaled albedo { result = observed / s }


I won't describe how to take that result and derive actual albedo (0-1) because I doubt I did it correctly. I get reasonable values, but that could just as easily be luck.
The "result" value should be in a scale that is the same from frame to frame, so call that "brightness calibrated" if you like.
I'll say once more that there are many factors this process doesn't account for and it is far from perfect. In practice, it has proved useful for me; that's all I can say with any confidence. I'll also repeat that I had almost no idea what I was doing when I implemented this stuff, so if any of it looks wrong, it probably is. tongue.gif
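To make the per-pixel step concrete, here is a minimal Python sketch of the irradiance formulas in the pseudocode above. The lookup-table interpolation `f()` is replaced by a caller-supplied function, and the example at the bottom substitutes a stand-in constant for it, so treat the numbers as illustrative only.

```python
import math

S_1AU = 1371.0  # solar irradiance at 1 AU, W * m^-2 (constant from the pseudocode)

def total_irradiance(a_s_deg, a_z_deg, solar_zenith_deg, r_au, od, albedo, f_net):
    """Direct + diffuse + ground-reflected irradiance for one pixel.

    a_s_deg:  angle between surface normal and Sun, clamped to <= 90 degrees
    a_z_deg:  angle between surface normal and zenith, clamped to <= 180 degrees
    f_net:    callable f(OD, solar_zenith_deg) returning the normalized net
              irradiance factor (the lookup table in the pseudocode)
    """
    s_toa = S_1AU / (r_au ** 2)                    # top-of-atmosphere irradiance
    mu0 = math.cos(math.radians(solar_zenith_deg))
    a_s = math.radians(min(a_s_deg, 90.0))
    a_z = math.radians(min(a_z_deg, 180.0))
    net = f_net(od, solar_zenith_deg)
    trans = math.exp(-od / mu0)                    # direct-beam transmission
    s_dir = s_toa * math.cos(a_s) * trans
    s_dif = s_toa * mu0 * math.cos(a_z / 2) ** 2 * (net - trans)
    s_gnd = s_toa * mu0 * albedo * math.sin(a_z / 2) ** 2 * net
    return s_dir + s_dif + s_gnd

# Flat ground (normal = zenith), Sun 30 degrees from zenith, tau = 0.5,
# Mars near 1.52 AU, and a stand-in net-irradiance factor of 1.0:
s = total_irradiance(30.0, 0.0, 30.0, 1.52, 0.5, 0.2, lambda od, z: 1.0)
```

Dividing the observed radiance by `s`, as in the last two lines of the pseudocode, then gives the relative albedo.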
PDP8E
post Feb 13 2017, 12:55 AM
Post #515


Member
***

Group: Members
Posts: 807
Joined: 10-October 06
From: Maynard Mass USA
Member No.: 1241



I noticed a 'vertical line' in this LEFT Mastcam image (to the left of a possible cap rock called 'Quimby' that may have tumbled off Mount Ireson)
Attached Image

I traced the left mastcam images back and found that it first showed up on Sol 834
I looked around in this thread to see if it was brought up before, and found nothing (correct me if I am wrong please)
It looks, to this amateur, like a slight imager/readout electronics failure ... one too many cosmic ray hits?



--------------------
CLA CLL
JohnVV
post Feb 13 2017, 02:14 AM
Post #516


Member
***

Group: Members
Posts: 890
Joined: 18-November 08
Member No.: 4489



That line is also in the EDR DAT files. I did a check:
CODE
./dat2img 1132ML0050460000501249E01_XXXX.DAT 1

cd 1

pds2isis from=1132ML0050460000501249E01_XXXX_00.IMG to=1132ML0050460000501249E01_XXXX_00.cub

it is there
mcaplinger
post Feb 13 2017, 04:06 AM
Post #517


Senior Member
****

Group: Members
Posts: 2504
Joined: 13-September 05
Member No.: 497



QUOTE (PDP8E @ Feb 12 2017, 04:55 PM) *
I traced the left mastcam images back and found that it first showed up on Sol 834

This is sol 1606 so this news is a bit old...

This is a hot pixel most likely caused by neutrons from the RTG. We've been hoping it would anneal out (like a similar pixel on the other Mastcam did) but so far no luck.


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
elakdawalla
post Apr 18 2017, 06:29 PM
Post #518


Administrator
****

Group: Admin
Posts: 5172
Joined: 4-August 05
From: Pasadena, CA, USA, Earth
Member No.: 454



An interesting trivium from Mike Malin's review of the Mastcam section in my book. In my text I mention that most Mastcam images are 1344 by 1200 pixels to crop away the sides, and he commented:
QUOTE
This has been changed to 1328 x 1184 between sol 1587 and 1589 (1/22/2017 and 1/24/2017). The reason was more efficient memory management in the DEA. 1344x1200 led to much greater fragmentation of the flash memory, which is broken down into integer units of 512-KByte blocks: 1344÷16 x 1200÷16 = 6300 pages, ÷512 = 12.3 blocks, whereas 1328÷16 x 1184÷16 = 6142 pages, ÷512 = 11.996 blocks, or just a tad less than 12 blocks...more efficient packing.
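The packing arithmetic in the quote is easy to check; a quick sketch (the 16-pixel page dimension and the 512-pages-per-block figure are taken from the quote, the function name is mine):

```python
def blocks_used(width, height, pages_per_block=512):
    """Flash pages (16x16-pixel units, per the quote) and 512-page blocks."""
    pages = (width // 16) * (height // 16)
    return pages, pages / pages_per_block

old = blocks_used(1344, 1200)  # 6300 pages, ~12.30 blocks: spills into a 13th block
new = blocks_used(1328, 1184)  # 6142 pages, ~11.996 blocks: just under 12 blocks
```

So the old size left most of a 13th block unused per image, while the new size packs almost exactly into 12.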


--------------------
My website - My Patreon - @elakdawalla on Twitter - Please support unmannedspaceflight.com by donating here.
mcaplinger
post Apr 18 2017, 07:43 PM
Post #519


Senior Member
****

Group: Members
Posts: 2504
Joined: 13-September 05
Member No.: 497



QUOTE (elakdawalla @ Apr 18 2017, 10:29 AM) *
In my text I mention that most Mastcam images are 1344 by 1200 pixels to crop away the sides...

The original idea was that all images would be no larger than 1200x1200 since the zoom couldn't fill more than that area. When we went with the fixed-focal-length optics the constraint became the size of the filter. If it had been me I still would have stuck with 1200x1200 but people insisted on getting a little more coverage per frame at the cost of more vignetting in the corners, memory fragmentation, etc.

You can perform some amusing numerology by looking at sequences and seeing which ones are 1200x1200 and which ones are larger; it provides clues about who wrote the sequences and why, and how recently I had pleaded with people to stick with the smaller size smile.gif


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
Ant103
post Jun 12 2017, 10:56 AM
Post #520


Senior Member
****

Group: Members
Posts: 1619
Joined: 12-February 06
From: Bergerac - FR
Member No.: 678



And again, another full-color panorama coming down in an awful Bayer-JPEG style: http://www.midnightplanets.com/web/MSL/sol/01688.html

I'm wondering why full Mastcam-34 imagery is not coming down in color anymore. This is at least the third one. It's not about the long process of debayering each image; it's about the bad quality of each image after the colorization process. There are artifacts everywhere, and the Bayer matrix is still visible afterward (because of the JPEG compression).

So, what happened?


fredk
post Jun 12 2017, 01:46 PM
Post #521


Senior Member
****

Group: Members
Posts: 4246
Joined: 17-January 05
Member No.: 152



I'd guess it's just a matter of downlink volumes - if they have the room and the images have high enough priority, then it makes sense to get the better quality of the un-JPEGged raw images. Once they make it to the PDS, we'll end up with better-quality images, too.
JohnVV
post Nov 6 2017, 03:31 PM
Post #522


Member
***

Group: Members
Posts: 890
Joined: 18-November 08
Member No.: 4489



The "raw" tongue.gif tongue.gif tongue.gif JPEGs are the NON-scientific, early-published images;
the research teams have FIRST use of the data.

Also, the JPGs are value-stretched, approximately 5% to 95%:
the bottom 5% are moved to all black and the top 5% are moved to all white, then "normalized".
jccwrt
post Nov 13 2017, 05:18 PM
Post #523


Member
***

Group: Members
Posts: 306
Joined: 4-October 14
Member No.: 7273



QUOTE (Ant103 @ Jun 12 2017, 04:56 AM) *
And again, another full-color panorama coming down in an awful Bayer-JPEG style: http://www.midnightplanets.com/web/MSL/sol/01688.html

I'm wondering why full Mastcam-34 imagery is not coming down in color anymore. This is at least the third one. It's not about the long process of debayering each image; it's about the bad quality of each image after the colorization process. There are artifacts everywhere, and the Bayer matrix is still visible afterward (because of the JPEG compression).

So, what happened?


There's no pipeline for processing the full-quality unbayered images to color before they're posted for public release. All of the public raws are compressed versions (so as to slow people outside the science team from "scooping" any new findings), and the Bayer pattern is disrupted by the compression. This imaging data is the highest quality that can be downlinked; you just need to wait until it is uploaded to the PDS.
elakdawalla
post Nov 28 2017, 04:24 PM
Post #524


Administrator
****

Group: Admin
Posts: 5172
Joined: 4-August 05
From: Pasadena, CA, USA, Earth
Member No.: 454



From the Mastcam chapter in my forthcoming book...

QUOTE
The JPEG compression means that most images have some compression artifacts. JPEG compression works on 8-by-8-pixel blocks, and the boundaries of those blocks are often visible. JPEG compression is more effective in places with smooth variations in color and brightness, but can introduce strange artifacts in areas of high contrast. Areas where there is high contrast or a lot of variation also tend to be areas of scientific interest – for instance, in an area of very finely laminated rock layers in alternate Sun and shadow (Figure 7.7, bottom). Where the compression artifacts make it difficult to interpret the geology, the team can choose to re-transmit the image with less compression, or even losslessly – as long as the original image is still stored in the camera’s flash memory. As of early 2017, not quite half of all of the Mastcam data had been returned a second time with lossless compression. At that time, the mission switched to returning all non-time-critical Mastcam science data losslessly, accepting delayed data return in exchange for a larger proportion of losslessly compressed data and a reduction in the complexity of data curation (Michael Malin, personal communication, email dated 14 Apr 2017).


--------------------
My website - My Patreon - @elakdawalla on Twitter - Please support unmannedspaceflight.com by donating here.
mcaplinger
post Nov 28 2017, 11:08 PM
Post #525


Senior Member
****

Group: Members
Posts: 2504
Joined: 13-September 05
Member No.: 497



QUOTE (jccwrt @ Nov 13 2017, 09:18 AM) *
There's no pipeline for processing the full-quality unbayered images to color before they're posted for public release.

Yet somehow this happens for MAHLI. rolleyes.gif


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
