Mars rover image wishlist
pgrindrod
post May 19 2016, 08:35 AM
Post #1


I couldn’t find this anywhere else, so hopefully this is the right place for this request…

I’m aware of many excellent examples of how people here, and elsewhere, have used images from spacecraft, and in particular from rovers and landers.

But if there was a hypothetical rover on Mars in the near future, taking stereo images on a daily basis, and you could help define how those images are released, what would be on the dream wish list for those images, assuming a similar policy to MER and MSL?

So things like: what image format and processing would be best, the preferred type of website/database serving those images up, what metadata is essential/would be good, and anything else that would be useful.

I have also been trying to collate examples of how these images are used, but it would be good to be directed to any cases that people found to be particularly inventive/impressive/useable. Of course, it might not be possible to predict all the ways that images might be used in the future, but having a list of best practices for sharing these images would be useful.

Many thanks for any input people might have.
 
JohnVV
post May 19 2016, 09:29 PM
Post #2


Well, the mission scientists need first access, but:

16-bit depth images (or the 12-bit raw)
the current and predicted NAIF SPICE kernels!
-- important for pointing and location data; just look at the threads on Celestial Matters about the Jovian and Pluto/Charon mutual events
( http://forum.celestialmatters.org/ )
http://forum.celestialmatters.org/viewtopi...?f=11&t=819
http://forum.celestialmatters.org/viewtopi...?f=11&t=823

As for using the SPICE data, just look at the CICLOPS site; they are using it for the upcoming Cassini images:
http://www.ciclops.org/news/looking_ahead.php?js=1
(and the Titan posts HERE)
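As a rough illustration of what those kernels enable, here is a minimal SpiceyPy sketch; the metakernel, body, and frame names are placeholders rather than actual mission files:

CODE
# Minimal sketch: rover position and camera pointing from SPICE kernels via SpiceyPy.
# The metakernel, body, and frame names below are placeholders, not real mission files.
import spiceypy as spice

spice.furnsh("rover_mission.tm")                   # metakernel listing LSK/SCLK/SPK/CK/FK files

et = spice.str2et("2016 MAY 20 12:00:00 UTC")      # convert UTC to ephemeris time

# Rover position relative to Mars centre in the IAU_MARS body-fixed frame
pos, light_time = spice.spkpos("ROVER", et, "IAU_MARS", "NONE", "MARS")

# Rotation matrix from a (hypothetical) camera frame to IAU_MARS, i.e. the pointing
cam_to_mars = spice.pxform("ROVER_MASTCAM_LEFT", "IAU_MARS", et)

print(pos, cam_to_mars)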
 
scalbers
post May 20 2016, 07:26 PM
Post #3


I would vote for all-sky images or, better yet, 360-degree spherical imagery. This might call for a wide-angle camera of some type, since we do not get sufficient coverage by stitching the typical camera images.


--------------------
Steve [ my home page and planetary maps page ]
 
Herobrine
post May 20 2016, 08:41 PM
Post #4


@pgrindrod Are you asking for our dream list for daily release images, or for PDS data products?

I'll second the SPICE kernels. I've generally found the CAHV(OR(E)) camera model data invaluable, so definitely that, along with the reference frame information. If there's any way we can magically reduce the rate of frame drift to zero, that would be nice.
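For anyone wondering what the CAHV vectors buy you, here is a minimal sketch of the basic linear CAHV projection, ignoring the OR/E distortion terms; the vector values are made up, and the real ones come with the image metadata:

CODE
# Sketch of the linear CAHV camera model (no OR/E distortion terms).
# C = camera centre, A = unit optical axis, H/V = horizontal/vertical image vectors.
# The values below are made up; real ones come from the image's CAHV(OR(E)) metadata.
import numpy as np

C = np.array([0.8, 0.6, -1.9])       # camera centre (m), made-up
A = np.array([0.0, 1.0, 0.0])        # optical-axis unit vector, made-up
H = np.array([1200.0, 800.0, 0.0])   # horizontal image vector, made-up
V = np.array([0.0, 600.0, -1200.0])  # vertical image vector, made-up

def cahv_project(p):
    """Project a 3D point p (same frame as C) to image (sample, line) coordinates."""
    d = p - C
    along_axis = np.dot(d, A)        # distance of the point along the optical axis
    sample = np.dot(d, H) / along_axis
    line = np.dot(d, V) / along_axis
    return sample, line

print(cahv_project(np.array([5.0, 20.0, -2.5])))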
One thing that I've often been uncertain about is exactly what processing a given data product has been through. That information should be derivable from the LBL data and the product type, combined with the SIS, but when working with imagery I've sometimes felt less than fully confident that I correctly understood which processing steps had already occurred and with what values.
One thing I've been particularly unclear about is shutter correction. With these electronic shutters you get readout smear, which can apparently best be corrected by taking a zero-second exposure immediately before or after the image is acquired and subtracting it from the acquired image (or something like that) before storing the data. I've read that the active rovers have the ability to do this, but it's never been clear to me whether, in practice, they ever actually do. There are flags for it, but I can't recall ever seeing one set to TRUE; it's entirely possible I've just missed it. Assuming they don't normally do it, I'm sure someone has worked out that it rarely makes a significant difference, but this is my dream list, so I'd have on-board shutter correction performed by default.
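Conceptually, the correction being wished for here is just a frame subtraction, something like the sketch below (array sizes and DN ranges are made up):

CODE
# Conceptual sketch of shutter (readout smear) correction: subtract a zero-second
# exposure taken immediately before/after the real one. Sizes and DN ranges are made up.
import numpy as np

rng = np.random.default_rng(0)
exposure = rng.integers(0, 4096, (1200, 1600)).astype(np.int32)  # stand-in for the real frame
zero_sec = rng.integers(0, 200, (1200, 1600)).astype(np.int32)   # stand-in zero-exposure readout

corrected = np.clip(exposure - zero_sec, 0, 4095).astype(np.uint16)  # stay in the 12-bit range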

QUOTE (scalbers @ May 20 2016, 02:26 PM) *
I would vote for all-sky images or, better yet, 360-degree spherical imagery. This might call for a wide-angle camera of some type, since we do not get sufficient coverage by stitching the typical camera images.

Could kill two birds with one stone....

Try not to think too hard about the decision to put sponsor decals on the side of a Mars rover
 
mcaplinger
post May 20 2016, 10:31 PM
Post #5


QUOTE (Herobrine @ May 20 2016, 12:41 PM) *
One thing I've been particularly unclear about is shutter correction...

Can be done for the engineering cameras, does not exist for Mastcam/MAHLI/MARDI because the amount of smear is vastly smaller with their interline sensors than for the frame transfer sensors of the engineering cameras.

From the MSL camera SIS http://pds-imaging.jpl.nasa.gov/data/msl/M..._SIS_latest.PDF :
QUOTE
Keyword SHUTTER_CORRECTION_MODE_ID
Specifies whether shutter subtraction will be performed.

• Eng. Cameras
0 = “NONE”
1 = “CONDITIONAL”
2 = “ALWAYS”
• MMM Cameras “N/A”
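For anyone who wants to check how a given product was flagged, the keyword can be read from the detached PDS3 label with a label parser; here is a minimal sketch using the third-party pvl package (the filename is a placeholder, and since the keyword may sit inside a group rather than at the top level, the sketch simply searches the whole label):

CODE
# Sketch: look up a shutter-correction keyword anywhere in a parsed PDS3 label,
# using the third-party pvl package. The filename below is a placeholder.
import pvl

label = pvl.load("some_navcam_edr.LBL")   # placeholder filename

def find_keyword(node, key):
    """Depth-first search for a keyword anywhere in the parsed label."""
    for k, v in node.items():
        if k == key:
            return v
        if hasattr(v, "items"):
            found = find_keyword(v, key)
            if found is not None:
                return found
    return None

print(find_keyword(label, "SHUTTER_CORRECTION_MODE_ID"))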



--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
 
pgrindrod
post May 23 2016, 09:22 AM
Post #6


Thanks for all the comments, they're really useful.

I should say that I was thinking about the daily release images. The PDS (or another archive) will have its own standards and deliverables; this idea relates only to the daily images, with the aim of maximising what people can do with them, given the right information, before the archived data become available.
 
Herobrine
post May 23 2016, 02:49 PM
Post #7


QUOTE (mcaplinger @ May 20 2016, 06:31 PM) *
Can be done for the engineering cameras, does not exist for Mastcam/MAHLI/MARDI because the amount of smear is vastly smaller with their interline sensors than for the frame transfer sensors of the engineering cameras.

That explains a lot. Thanks! :)
I'd note that, with a Bayer pattern, even a little smear can make good debayering challenging. Once red pixels start really saturating here and there, it becomes a real pain to generate a good green channel.
Then again, my primary experience with that issue was trying to process the descent frames, and the zero-second exposures I wished for above wouldn't have worked for them anyway, because of the motion.
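A rough sketch of the kind of quick-look demosaic step involved; the Bayer pattern constant and the filename are assumptions for illustration, and the flight cameras' real CFA layout is documented in the SIS:

CODE
# Rough demosaic sketch with OpenCV. The Bayer pattern constant and filename are
# assumptions for illustration; the flight cameras' real CFA layout is in the SIS.
import cv2
import numpy as np

raw = cv2.imread("bayer_raw.png", cv2.IMREAD_UNCHANGED)   # placeholder single-channel CFA frame
rgb = cv2.cvtColor(raw, cv2.COLOR_BayerRG2BGR)            # simple bilinear demosaic

# Saturated red CFA sites poison the interpolated green/blue around them,
# so it helps to at least know where they are.
full_scale = 255 if raw.dtype == np.uint8 else 4095
print("saturated CFA sites:", int((raw >= full_scale).sum()))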

That makes me think of another wishlist item, though for PDS, not for the dailies.
I don't fully understand what goes into producing the ground calibration files, but the flat IMGs' data doesn't take the same format as the EDRs', so at some point prior to publication there must have been a set of raw sensor data that was processed to produce the flat IMG. It looked like the data in the flat IMG had undergone some sort of desmear, but I couldn't find information on how exactly the file was produced. I remember finding statements along the lines of 'we imaged a uniform sphere at such-and-such', but not which steps turned the raw sensor data into the published IMG.

I believed I had a way to perform a better desmear, but if the published flat IMG had already had a desmear applied to the source data used to create it, then I couldn't really test my theory unless I managed to un-desmear the flat file, which I actually tried to do for a while before giving up on the whole thing. It's more than a little possible I'd chased a school of red herrings far into troubled waters, but at any rate, for future missions, it would be nice to have the raw data that was used to produce the flat IMGs and some explanation of the processing that was done to produce it. (I'm so needy, I know.)
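For context, applying such a flat is conceptually just a normalised division, along these lines; the filenames, dimensions, and bias level are all made up for illustration:

CODE
# Sketch of flat-field correction: divide by the flat normalised to its mean,
# after removing a bias/dark level. Filenames, dimensions, and bias are made up.
import numpy as np

shape = (1200, 1648)                                                         # made-up frame size
raw  = np.fromfile("raw_frame.dat",  dtype=np.uint16).reshape(shape).astype(np.float64)   # placeholder
flat = np.fromfile("flat_field.dat", dtype=np.uint16).reshape(shape).astype(np.float64)   # placeholder

bias = 120.0                                        # made-up bias/dark level
flat_norm = (flat - bias) / np.mean(flat - bias)
corrected = (raw - bias) / np.clip(flat_norm, 1e-3, None)   # clip to avoid divide-by-zero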

QUOTE (pgrindrod @ May 23 2016, 05:22 AM) *
I should say that I was thinking about the daily release images. The PDS (or another archive) will have its own standards and deliverables; this idea relates only to the daily images, with the aim of maximising what people can do with them, given the right information, before the archived data become available.

In that case, two things come to mind.
1. LBLs for the dailies would be very helpful. I could have sworn there was a time when I could see LBLs somewhere for the dailies... Looking into it a little, it seems I can for some cameras, just not the Malin cameras. Using the JSON API, I can still get timestamps, CAHV(OR(E)) data, and the vector and quaternion from the site frame to the rover nav frame for most of those, which is a lot better than nothing (a rough sketch of pulling that feed follows this list). If they threw solar elevation and azimuth in there, it would save me a lot of time. The JSON API started doing this thing a while back where it gives you 403s (Forbidden) if you go back more than a few dozen sols, which is unfortunate; it would be nice if it at least covered the range that hasn't made it into the PDS yet, though frankly I'm just thankful it exists at all.
2. PNGs (I can always dream). I guess what my dream list for dailies boils down to is... as close to what's going to be in the PDS as they're willing to give us. :P
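The fetch sketched above would look roughly like this; the endpoint URL and JSON field names are illustrative assumptions, not the actual API layout:

CODE
# Sketch of polling a per-sol raw-image JSON feed. The URL and field names are
# illustrative assumptions, not the actual API layout.
import requests

sol = 1350
url = "http://example.nasa.gov/raw-images/images_sol{}.json".format(sol)   # placeholder endpoint

resp = requests.get(url, timeout=30)
resp.raise_for_status()                  # the 403s for older sols would show up here

for item in resp.json().get("images", []):
    # hypothetical fields: UTC time, camera id, and the image URL
    print(item.get("utc"), item.get("camera"), item.get("url"))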

I guess a third thing that comes to mind would be the most recently measured optical depth, which would be helpful for certain applications. Publishing that with the dailies would probably rob some people of some papers, though, so I won't hold my breath for it; I don't think manual respiration is possible while you're dreaming, anyway. ;)
 
mcaplinger
post May 23 2016, 04:52 PM
Post #8


QUOTE (Herobrine @ May 23 2016, 06:49 AM) *
for future missions, it would be nice to have the raw data that was used to produce the flat IMGs and some explanation of the processing that was done to produce it.

Have you read all of https://www.researchgate.net/publication/27...ibration_status including the supplemental data file?

Maybe I should start a thread about my dream list of what I want from the amateur community. :rolleyes:


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
 
Herobrine
post May 23 2016, 07:19 PM
Post #9


QUOTE (mcaplinger @ May 23 2016, 12:52 PM) *
Have you read all of https://www.researchgate.net/publication/27...ibration_status including the supplemental data file?

Maybe I should start a thread about my dream list of what I want from the amateur community. :rolleyes:

If it's not in the PDS, then I probably haven't read it. I can't rely on finding the information I need in papers published on third-party sites; most times I've tried, the information was behind a paywall, so I don't usually bother. I haven't looked at what you've linked, but if it has the information I mentioned, then that's precisely the information I wish were included in the PDS releases. Information about processing is included for the EDRs and RDRs; why it wouldn't be included for the calibration files is beyond me. :rolleyes:
 
algorithm
post May 23 2016, 07:21 PM
Post #10


"Maybe I should start a thread about my dream list of what I want from the amateur community."

Possibly one of the best responses from an expert I have read. :lol:
 
Herobrine
post May 23 2016, 08:20 PM
Post #11


QUOTE (mcaplinger @ May 23 2016, 12:52 PM) *
Have you read all of https://www.researchgate.net/publication/27...ibration_status including the supplemental data file?

Now that I've looked at your link...
To actually answer your question: at the time I was dealing with the calibration data, no, for several reasons. The most insurmountable one is that the paper you linked didn't exist when I was doing my amateur research. The original version of that paper, the one with an error in a key equation, wasn't published until three years into the surface mission, and the corrected version wasn't published until 2015-10-05, four days after I'd given up trying to work with the files without help and posted about it here, on 2015-10-01. I'm glad to see that it actually is in the PDS now, starting at the end of last year; the first PDS volume to include that paper was MSLMHL_0010, and the nine volumes that preceded it didn't have that information at all.
Another insurmountable reason was that I was working with the MARDI flat IMG, which is not covered by the MAHLI paper.
To further answer your question: yes, I did read the paper you linked later, after I'd given up on MARDI and was trying to do things with Mastcam EDRs. The paper existed by then, and when I came across it I hoped that it might explain how the other MSSS cameras' flat files were processed, but it doesn't; it only covers MAHLI.

Edit: Removed erroneous statement about the location of the raw 8-bit flat file source data for MAHLI.
 
mcaplinger
post May 23 2016, 09:30 PM
Post #12


QUOTE (Herobrine @ May 23 2016, 12:20 PM) *
I hoped that it might explain how the other MSSS cameras' flat files were processed, but it doesn't; it only covers MAHLI.

All of the MSSS cameras are electrically identical from the detector back and all of the ground calibration was done in basically the same way, so it's a safe assumption that the processing was as similar as it could have been.

Mods: you might consider moving this part of the discussion to the MSL cameras thread, as the intent of this thread was "hypothetical near-future rovers".


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
 
mhoward
post May 24 2016, 08:36 PM
Post #13


I’m coming from the point of view of apps, e.g. Midnight Planets.

Basically what I’m looking for is the LBL files. If the same information can be put into JSON format, great. It needs to be easily and efficiently accessible to a bot - i.e., it should be just simple web pages, accessed through well-defined URLs.

When I say accessible to a 'bot', I mean a single server program, not lots of client programs. I can't imagine any scenario in which I would release an app that directly accesses data I don't control. What I do is download everything, image data and metadata, wherever I can get it, to my server, re-process and re-package what I need, and host it myself. In addition to hopefully being more polite, this is just basic sanity for me: app updates are hell, and I can't risk being put into a bad situation by a change in a server that isn't mine. (This is not a hypothetical concern; such changes have happened, and I have dealt with them on the server side.) So what I need is just the data and metadata. There doesn't need to be any thought put into formatting the data for a hypothetical large-scale client program; from my point of view, that effort can be counterproductive. I could probably think of real examples of metadata being excluded, images being over-compressed, thumbnail sizes that are useless to me, arbitrary pagination of data, etc., which may have been intended to be helpful, but weren't, at least not to me. Just the straight data, as much of it as possible, please.

I have no problem with compressed JPGs, as long as they're not too compressed (which has happened). What I need, and have never had as far as I know, is the information about how the image was stretched (mapped to 256 grays, or whatever was done to it), so that the process can be undone programmatically rather than through guesswork. Exposure would also come into play, unless we were talking about calibrated images, which we probably wouldn't be. It's been a while since I've looked at this stuff, so take this comment with a grain of salt.

One nice thing about JPG files is that some metadata can be included with them. It would NOT make sense to include the whole LBL file within the JPG, but it could carry, for example, the stretching parameters used to map the image to 256 grays. I do exactly that in Midnight Planets for MER-A and MER-B: my server program downloads the calibrated images from the PDS, converts them to high-quality JPGs with embedded EXIF information, including the stretch used, and re-hosts them for the client program. The result in the client app is, in most cases, nearly indistinguishable from displaying the source PDS images, but uses a fraction of the storage space and bandwidth.
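Conceptually, the stretch and its inverse look something like the sketch below, assuming a simple linear min/max mapping, which is itself an assumption about how the products are made:

CODE
# Sketch of a linear stretch to 8 bits and its inverse, assuming a simple min/max
# stretch (an assumption) whose parameters are published or embedded (e.g. in EXIF).
import numpy as np

def stretch_to_8bit(dn, dn_min, dn_max):
    """Map raw DNs linearly onto 0..255."""
    scaled = (dn.astype(np.float64) - dn_min) / (dn_max - dn_min)
    return np.clip(np.round(scaled * 255.0), 0, 255).astype(np.uint8)

def unstretch(gray8, dn_min, dn_max):
    """Invert the mapping; quantisation to 256 grays is the only information lost."""
    return gray8.astype(np.float64) / 255.0 * (dn_max - dn_min) + dn_min

dn = np.array([[120, 2300], [4095, 600]], dtype=np.uint16)   # made-up 12-bit DNs
g = stretch_to_8bit(dn, 100, 4095)
print(g)
print(unstretch(g, 100, 4095))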

I hope to use the CAHVOR data more in the future. Particularly with the camera specs I've heard for a hypothetical Mars rover, it seems like it would be even more necessary for what I'd like to do.

I would prefer to use the image metadata directly, not try to extract pieces of it that have been processed into SPICE data. In fact, hypothetically, if the metadata was only available through SPICE, I would probably not spend the time and effort to extract it, with the experience of MSL behind me. I would just wait for the data to hit the PDS. Which would be a shame; I like live data.

Basic information about the cameras, especially the image filename format, needs to be available. I spent an absurd amount of time trying to ‘guess’ this information for MSL. I will probably never have the time to do that again, even if I thought that was fun. Which it wasn’t.

Another basic point about the filename formats: even for raw web-released products, the filename format should match what will end up in the PDS. Dealing with multiple filename formats for the same MSL image products has been ugly, on this end.

I hope this is helpful in some way.

- Mike