Perseverance Imagery, technical discussion of processing, cameras, etc.
Andreas Plesch
post Jun 5 2021, 03:31 PM
Post #151





Here are some notes I gathered while studying and experimenting with the CAHVORE model, using Gennery (2002) and Di and Li (2004).

Each letter in CAHVORE represents a parameter of the camera model, which allows a 3d point to be projected onto an image plane. The model is based on a conceptual description of the projection, and its parameters are tuned to a specific camera in a careful, experimental calibration process. It is important to understand that this calibration is a least-squares minimization of residuals that tunes all seven parameters simultaneously. This means that, while each parameter has a meaning in the concept behind the model, the calibrated values are not strictly required to coincide with physical measurements. In practice, I believe the parameters mostly do track their physical counterparts, because the calibration uses careful starting guesses and imposes some constraints.

C, A, H, and V characterize a perspective view with a focal distance and an orientation, in which straight lines stay straight. O is needed when the optical axis is slightly off the camera axis.
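As a minimal sketch (my own notation, not JPL code), the plain CAHV projection of a 3d point reduces to two dot-product ratios:

```python
import numpy as np

def cahv_project(P, C, A, H, V):
    """Project 3d point P with the plain CAHV model.

    C: camera center; A: unit vector along the camera axis;
    H, V: calibrated image-plane vectors (they fold in focal length,
    optical center, and pixel scale). Returns pixel coordinates (x, y).
    """
    d = P - C                   # ray from the camera center to the point
    denom = np.dot(d, A)        # distance along the camera axis
    return np.dot(d, H) / denom, np.dot(d, V) / denom
```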

R stands for rho and is the radial lens distortion parameter. E stands for epsilon and is the fisheye parameter. E is needed because R cannot capture the distortion of very wide-angle (>140° or so) lenses.

The model first applies E by considering a variable lens pupil location which depends on the incidence angle of the entering ray, rather than a fixed pinhole. This has the effect of an apparent forward shift of the camera center C, resulting in an effective center C', and can thus characterize wide-angle lenses. Note that the description of how E is defined differs between Di and Li (2004) and Gennery (2002); Di and Li (2004) only briefly mention E and do not use it. Gennery (2002) is the basis for the NASA/JPL use of the model.

Then R is applied by using it to determine an apparent shift in the position of the imaged 3d point P to an effective location P'. This shift describes the lens distortion. Finally, P' is projected onto the 2d image plane using the other parameters.
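In code, the R step looks roughly like this, following the CAHVOR equations in Di and Li (2004) (a sketch, with my own variable names):

```python
import numpy as np

def cahvor_shift(P, C, O, R):
    """Shift point P to its effective location P' using the radial terms.

    O: unit optical-axis vector; R = (rho0, rho1, rho2), the calibrated
    radial distortion coefficients. P' is then projected with CAHV.
    """
    xi = P - C
    lam = np.dot(xi, O)             # component of P - C along the optical axis
    r = xi - lam * O                # component perpendicular to the axis
    tau = np.dot(r, r) / lam**2     # squared tangent of the off-axis angle
    mu = R[0] + R[1] * tau + R[2] * tau**2
    return P + mu * r               # the effective, distorted location P'
```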

There are three types of the mathematical model: the general model, the fisheye model, and the perspective model. I believe they are distinguished for historical reasons and because they allow for simplified computations. The general model can represent the other models by using an additional parameter (L or P).

Since the calibration occurs for all parameters simultaneously, it is not really possible to apply the model partially. For example, for fisheye lenses both R and E have to be applied. In fact, the exact mathematics described in Gennery (2002) should probably be used, even though there may be other ways to express the concept mathematically, since those are the equations used for the calibration, or at least the only ones that are accessible. Following them, the model can be used to project a 3d point onto a 2d image, and Gennery (2002) also describes a sanctioned process for the reverse. However, only the orientation of the entering 3d ray can be recovered from a 2d pixel, and unfortunately this involves numerically solving an equation for each pixel.
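For the plain CAHV case the reverse step has a closed form, which may help to see what the CAHVORE version has to solve iteratively. A sketch, derived from the projection equations above:

```python
import numpy as np

def cahv_pixel_to_ray(x, y, A, H, V):
    """Recover the unit direction of the ray entering a plain CAHV camera.

    The direction d must satisfy d.(H - x*A) = 0 and d.(V - y*A) = 0,
    so it is the cross product of those two vectors (up to sign).
    For CAHVORE, Gennery (2002) instead solves an equation numerically
    for each pixel.
    """
    d = np.cross(H - x * A, V - y * A)
    d /= np.linalg.norm(d)
    return d if np.dot(d, A) > 0 else -d   # point the ray out of the camera
```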

The MARSCAHV VICAR utility attempts to remove distortion by transforming an image from the CAHVORE model to the CAHV model. Apparently, the utility is not available publicly. From its description, it computes the orientation of the entering ray for each pixel as described above, assumes a 3d sphere (of radius 1 m) as the 3d point location, and then projects that 3d point onto a 2d image using the CAHV parameters.

I may try to reproduce that approach. If it works, an interesting experiment for the helicopter Nav images may be to intersect the computed rays with a horizontal plane (normal to the camera axis) at 10 m distance rather than with the 1 m sphere, and then simply look at the map view, e.g. take the x and y and ignore the z for an image.

Similarly, for the color RTE images, one could try intersecting the reconstructed rays with a plane at a 22 degree angle to the camera axis and looking at the map view; the intersection step is sketched below.
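The intersection itself is simple geometry; a sketch under the assumption that z is the vertical axis of the map frame (the plane orientation, horizontal or tilted 22 degrees, goes into the normal n):

```python
import numpy as np

def ray_to_map(C, d, n, dist):
    """Intersect a camera ray with a plane and return the map view.

    C: camera center; d: unit ray direction (from the pixel-to-ray step);
    n: unit plane normal; dist: distance from C to the plane along n.
    Points p on the plane satisfy (p - C) . n = dist.
    """
    t = dist / np.dot(d, n)      # parameter where C + t*d meets the plane
    p = C + t * d                # 3d intersection point
    return p[0], p[1]            # map view: keep x and y, ignore z
```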


Andreas Plesch
post Jun 7 2021, 07:40 PM
Post #152





Let me share a quick experiment:

Since I have an old stereoscope,

Attached Image


I wanted to use it to view Greenish's stereo pair:

http://www.unmannedspaceflight.com/index.p...mp;#entry252728

From an original stereo photograph card I made a template and used it to scale and print Greenish's Ingenuity parallel stereo pair:

Attached Image


Perhaps that template could be useful. Here is a link to a shared Google doc slide ready to print at 100%: https://docs.google.com/presentation/d/17ki...dit?usp=sharing

Clamping the printed and cropped paper between the original card in the card holder and behind the corner wires worked pretty well. The stereo view through the stereoscope is immediate and effortless. The main limitation is the dpi and quality of the print (the nominal 1200 dpi was pretty good but could be better). It may be worth having images printed at a drugstore for this purpose if it is possible to control the exact size.


Greenish
post Jun 8 2021, 02:40 AM
Post #153





That's pretty classy, and it's entirely by chance that the aspect ratio was pretty close. If I'd known, I'd have been less slapdash with the upscale, crop & sharpen.
tau
post Jun 11 2021, 12:37 PM
Post #154





QUOTE (Andreas Plesch @ Jun 7 2021, 09:40 PM) *
. . . Since I have an old stereoscope, . . .
Since I don't have a stereoscope, I came up with another method. Here is one for owners of an old* 3D HD TV with polarized glasses to view stereoscopic Mars images in full color.

1. Take a Mastcam-Z right-eye image as the first layer in an image editing program.
2. Rotate the corresponding left-eye image 0.6° counterclockwise to correct for the slightly different orientation, then copy and paste it as a second layer onto the right-eye image.
3. Make the left-eye layer semi-transparent and move it so that the nearest objects match those of the right image. Cut off parts which contain content from only one image. Then reset the transparency of the second (left) layer to 0.
4. If the image width is greater than 1920 pixels or the height greater than 1080 pixels (for a full HD TV), scale down or crop the image. Smaller images can be scaled up. If the image doesn't exactly fill 1920 x 1080 pixels, expand the canvas accordingly and fill the added parts with a neutral color.
5. Do any image processing (color adjustment, sharpening, etc.) now; it must happen before the next step.
6. Copy the image "1920x1080_stereo_mask.png" and paste it as a third layer over the others. Make a selection of all black lines. Make the third layer invisible, but keep the selection.
7. Activate the second layer (left eye) and delete the selected areas (lines) from it, then deselect. Remove the third layer.
8. Flatten both layers onto the background and save the image.

If the computer is connected directly to the TV (e.g. with an HDMI cable) and uses it as a monitor, the image must be displayed in full-screen mode. Now get your polarizing glasses and view Mars in 3D.
Another option is to save the image to a USB device and connect that to the TV. My TV only accepts the jpg format with the baseline option; all other formats were ignored. It is recommended to save the jpg in high quality.

Drawback of this method: the vertical resolution is reduced to 540 lines per eye.
If the picture looks strange (banding), check the overscan setting of the TV and set it to "off". If the 3D TV screen has a resolution other than HD, the method should be adjusted accordingly.

*old, because I'm afraid that 3D TVs are no longer produced
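For those who would rather script the mask steps (6-8 above), the row interleave can be done in a few lines; a sketch assuming two already-aligned 1920x1080 frames (the filenames are placeholders, and which eye gets the even lines depends on the TV, so swap if the depth looks inverted):

```python
import numpy as np
from PIL import Image

left = np.asarray(Image.open("left_1920x1080.png"))    # aligned left-eye frame
right = np.asarray(Image.open("right_1920x1080.png"))  # aligned right-eye frame

out = right.copy()          # right eye on the even lines...
out[1::2] = left[1::2]      # ...left eye on the odd lines

Image.fromarray(out).save("stereo_interleaved.png")
```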

Attachments:
1) A stereo image from sol 108 ready for use
2) An anaglyph to show that it works
3) The stereo mask file to do it yourself
Attached Images

fredk
post Aug 1 2021, 05:47 PM
Post #155





I'm not sure what we're seeing with these new navcams, e.g.:

[embedded Navcam image]

That appears to be the sun at centre, with diameter about right for a full-res navcam frame. Maybe some new version of the bleed-off subtraction mode? But I don't see how you could get an unsaturated solar disc image without a solar filter.
mcaplinger
post Aug 1 2021, 06:59 PM
Post #156





QUOTE (fredk @ Aug 1 2021, 10:47 AM) *
but I don't see how you could get an unsaturated solar disc image without a solar filter.

Google for "CMOS black sun".


Bill Harris
post Aug 1 2021, 10:04 PM
Post #157





The digital version of the old film "solarized" image.


Brian Swift
post Aug 1 2021, 10:46 PM
Post #158





QUOTE (fredk @ Aug 1 2021, 09:47 AM) *
...
That appears to be the sun at centre, with diameter about right for a full-res navcam frame. Maybe some new version of the bleed-off subtraction mode? But I don't see how you could get an unsaturated solar disc image without a solar filter.

The interesting bit to me is the little blob in the lower right of some of the images. I wonder if they were trying to capture an eclipse/transit. If it appeared in all the images, I'd just assume it was a lens flare.
https://mars.nasa.gov/mars2020/multimedia/r...PP00606_00_0LLJ
fredk
post Aug 2 2021, 11:11 PM
Post #159





QUOTE (mcaplinger @ Aug 1 2021, 07:59 PM) *
google for "cmos black sun".

Thanks for the tip. The black sun effect does seem to explain the large dark area inside the magenta ring. But I don't see how you can get what looks like an image of the solar disc at the centre of the dark area (it has about the right diameter to be the sun). If the surrounding pixels were saturated enough (due to scattering in the atmosphere or optics) for the reference levels to also be saturated, leading to near-black on subtraction, then surely the solar disc should also be saturated and also look black. Why wouldn't the black sun effect also darken the solar disc?
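Just to spell out the arithmetic of that mechanism, a toy example with invented numbers (the real on-chip reference behaviour is surely more complicated): if intense light also raises the reference level, the on-chip subtraction clips the brightest pixels toward black.

```python
import numpy as np

full_well = 4095                              # 12-bit saturation, for the toy
flux = np.array([100, 3000, 4095, 4095])      # dim, bright, saturated, saturated
ref = np.array([50, 50, 50, 4000])            # last reference corrupted by the sun

signal = np.minimum(flux + ref, full_well)    # both wells accumulate charge
out = np.clip(signal - ref, 0, full_well)     # on-chip reference subtraction
print(out)                                    # [ 100 3000 4045   95] -> near-black sun
```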

Maki et al. describe the black sun effect as a problem for centroiding, and say they use a technique to bleed off charge from the reference so that subtraction from the pixel value no longer gives black. I guess this is what has previously been done, but apparently not with these new images.

The only way I can think of to resolve the solar disc without an ND filter is through optical reflections - maybe that's what we're seeing, though it's not clear why that wouldn't simply further saturate those pixels and so darken the output values.
James Sorenson
post Aug 3 2021, 07:01 AM
Post #160





I have no idea if this function is onboard, but what about an onboard merge of a very short exposure and a longer exposure?
The shortest possible exposure for the ECAMs is 411 microseconds and the longest possible exposure is 3.3 seconds.
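The arithmetic of such a merge would be simple enough; a sketch with normalized inputs (my own illustration, not a claim about the actual flight software):

```python
import numpy as np

def merge_exposures(short, long, t_short=411e-6, t_long=3.3, sat=0.95):
    """Merge a short and a long exposure into one relative radiance map.

    short, long: co-registered frames normalized to 0..1.
    Long-exposure pixels are trusted unless they are near saturation,
    where the short exposure (on the same counts-per-second scale) takes over.
    """
    return np.where(long < sat, long / t_long, short / t_short)
```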
fredk
post Aug 3 2021, 02:15 PM
Post #161





I was wondering about that too, but I'd be surprised if an unfiltered short exposure would show the disc, considering the dust in the air and the scattering in the optics. Try that with your camera on Earth and you end up with an overexposed glow around the sun. Maybe with a sensor of good enough dynamic range.
erentar
post Aug 20 2021, 08:57 PM
Post #162





Is there automation available to rectify and project (at least mastcam) images?
Greenish
post Aug 20 2021, 09:00 PM
Post #163





Mars 2020 Mission Release 1 is live in PDS... time to start digging into the documentation & calibrated goodies!
Gonna start with some light reading: Mars2020_Camera_SIS.pdf

The NASA Planetary Data System announces the first release of data from the
Mars 2020 Perseverance Rover mission to Mars. This release contains raw,
calibrated, and derived data products covering the first 90 sols of the
mission, February 18 to May 20, 2021. The data are archived at various PDS
Nodes. Links to all the archives can be found at
https://pds-geosciences.wustl.edu/missions/mars2020/

Data from the following science investigations are included in this release.
Mastcam-Z – Mast-mounted Zoom Camera System
Engineering Hazard and Navigation Cameras
Helicopter Cameras
EDLCam – Entry, Descent, and Landing System Cameras
MEDA – Mars Environmental Dynamics Analyzer
MOXIE – Mars Oxygen ISRU Experiment
PIXL – Planetary Instrument for X-ray Lithochemistry
RIMFAX – Radar Imager for Mars Subsurface Exploration
SHERLOC – Scanning Habitable Environments with Raman and Luminescence for Organics and Chemicals
SuperCam – LIBS, Raman, Time-Resolved Fluorescence, VIS/IR spectrometers, Remote Micro-Imager and microphone
SPICE – Observation geometry and ancillary data

Future releases covering sols during which sampling events occurred will
include ‘sample dossiers’ – collections of science and engineering data
relevant to each returnable sample or blank. No sampling events occurred
during the first 90 sols of the mission.

To subscribe to future announcements of Mars2020 data releases:
https://pds.nasa.gov/tools/subscription_service/top.cfm


Edit: looks like there's now a thread for Percy PDS data. I lean towards continuing the conversation about the technical aspects of imagery in this thread, now that the answers to many of our speculations are published...
tau
post Jun 19 2022, 04:00 PM
Post #164





QUOTE (Keltos @ Jun 17 2022, 05:53 PM) *
Can you tell me (i have photoshop) how you do that : turn the raw into those enhanced images that show different minerals / different colors ?

I use Photoshop to convert the raw png images to uncompressed bmp format before computation, and, after creating the principal components image, to add the credit line, sometimes to make minor optimizations (e.g. slight brightness and sharpness adjustments), and to save the image in compressed jpg format for upload to UMSF.

The main part is done with self-developed algorithms and source code in C++ and Visual C++.
I translated and adapted the principal components algorithm from a BASIC-language algorithm I found in a book from 1988.
The other important part is the scaling, translation, and rotation of the numerical results and their projection into the RGB color space.
This part of the code is reworked for each new raw image series to get a satisfactory result.
The resulting image does not always look good (I do not upload those to UMSF).

It should be noted that the resulting principal component coefficients depend on the overall image content.
Any change in the multispectral values in one part of the image will change the colors of the entire image.
In addition, the flat-field inhomogeneities of the multispectral filters (especially the infrared filters of the Mastcam-Z right eye) have a strongly disturbing effect.
The resulting colors are somewhat dependent on chance, and it is not guaranteed that different minerals will appear in different colors.
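For illustration, the core principal-components step can be sketched in a few lines (a generic numpy version, not my actual C++ code; band registration, flat-fielding, and the color-space tuning described above are the hard parts and are omitted):

```python
import numpy as np

def pca_to_rgb(bands):
    """Map a stack of co-registered multispectral bands to an RGB image.

    bands: float array of shape (n_bands, height, width).
    Returns a uint8 (height, width, 3) image of the first three components.
    """
    n, h, w = bands.shape
    X = bands.reshape(n, -1).T.astype(np.float64)   # (pixels, bands)
    X -= X.mean(axis=0)                             # center each band
    vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    pcs = X @ vecs[:, ::-1][:, :3]                  # top three components
    lo, hi = pcs.min(axis=0), pcs.max(axis=0)       # stretch each to 0..255
    rgb = (pcs - lo) / (hi - lo) * 255.0
    return rgb.reshape(h, w, 3).astype(np.uint8)
```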

Edit
Meanwhile, I use multispectral sky images from sol 604 for a rough flat-field correction.
A visual explanation of the image processing can be found in post #170.
mcaplinger
post Nov 10 2022, 10:51 PM
Post #165





We have had some inquiries over the years about color calibration for the Mars 2020 cameras. For WATSON, the best I can offer is the attached image from the calibration dataset (see https://www.researchgate.net/publication/34...ibration_Report ). This image was taken with a 100 msec exposure time and 8-bit linear divide-by-8 encoding, and is Bayer-interpolated but otherwise straight out of the camera. It was illuminated by a commercial halogen work lamp (Britek Halo Flood Twin 1200S) with two 500-watt bulbs with a stated color temperature of 3150 K. The light source was not calibrated and the geometry was not recorded.

The target is a commercial "Macbeth ColorChecker" ( https://en.wikipedia.org/wiki/ColorChecker ) made by X-Rite and purchased sometime around 2010, I'm guessing. These charts have gone through a variety of manufacturers (they're now made by Calibrite) and I don't know how their colors may have varied; we made no attempt to measure this one.
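For anyone who wants to use the frame for a rough characterization, one common approach (not something we have done; extracting the patch values and choosing chart reference values are left to the reader) is to fit a 3x3 color correction matrix to the 24 patches by least squares:

```python
import numpy as np

def fit_ccm(measured, reference):
    """Fit a 3x3 matrix M so that measured @ M ~= reference.

    measured: (24, 3) mean linear RGB of each chart patch from the image.
    reference: (24, 3) known linear RGB of the patches, adapted to the
    illuminant (here nominally 3150 K halogen, uncalibrated).
    """
    M, *_ = np.linalg.lstsq(measured, reference, rcond=None)
    return M

def apply_ccm(image_linear, M):
    """Apply M to an (h, w, 3) linear-light image."""
    h, w, _ = image_linear.shape
    out = image_linear.reshape(-1, 3) @ M
    return np.clip(out, 0.0, 1.0).reshape(h, w, 3)
```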

Attached Image


