Traversing the Clay-Bearing Unit Along the Base of VRR, Site 73-, sol 2297-, 22 Jan 2019-
PaulH51
post Sep 3 2019, 11:07 PM
Post #346


Senior Member
****

Group: Members
Posts: 1741
Joined: 30-January 13
From: Penang, Malaysia.
Member No.: 6853



I have roughly reconstructed the Bayer filter on what I believe to be the fin of bedrock at 'Joppa Shore' (2472MR0131130010704300C00) using GIMP; please excuse the green compression artefacts.
Joppa Shore is one of the sol 2472 Mastcam multispectral observations, described in the mission update as "a unique fin of bedrock sticking out at 'Joppa Shore'".
I'd missed it because of the Bayer encoding, but Elisabetta Bonora tweeted a rather nice anaglyph of the target today.
I'm seeking a little help. I used to use Joe's page for camera pointing information, but as most here will already know, that page is no longer being updated following some recent changes in the way the mission stores its images.
I've found the sol in a JSON image manifest that points to the page for sol 2472, which provides a host of information on the R-Mastcam as well as the rover.
I'm hoping the pointing is encoded somewhere in the data for the R-Mastcam image. Can someone point me either to any pointing info that's in the JSON, or to a document that will help me decode the JSON data?
I've been down the NAIF rabbit hole in the past, so I hope that's not where I need to go... TIA
In the meantime... enjoy Joppa Shore
Attached Image
atomoid
post Sep 5 2019, 06:28 PM
Post #347


Member
***

Group: Members
Posts: 776
Joined: 15-March 05
From: Santa Cruz, CA
Member No.: 196



Great image! I haven't attempted such ambitious ends myself, but it seems "camera_origin" and "camera_vector", along with "rover_attitude", may help.
Firefox has a nice JSON renderer apparently built into it that can help mere humans browse such files.
This very dense document seems relevant, particularly section 3.3.
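For example, pulling those fields out of a single image record might look like the sketch below. The field names follow the ones mentioned above, but the record layout and the "(a,b,c)" string format of the values are assumptions for illustration, not the confirmed manifest schema:

```python
import json

# One image record, shaped the way the fields are described above.
# The layout and the "(a,b,c)" string encoding of the values are
# assumptions about the real manifest, used here only for illustration.
record_json = """
{
  "camera_vector": "(0.71,0.12,-0.20)",
  "camera_origin": "(0.80,0.56,-1.91)",
  "rover_attitude": "(0.99,0.01,0.02,-0.10)"
}
"""

def parse_tuple(text):
    """Turn a '(a,b,c)'-style string into a tuple of floats."""
    return tuple(float(part) for part in text.strip("() ").split(","))

record = json.loads(record_json)
camera_vector = parse_tuple(record["camera_vector"])
rover_attitude = parse_tuple(record["rover_attitude"])
```

From there the parsed vectors can be fed into whatever pointing math comes next.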
mcaplinger
post Sep 5 2019, 11:36 PM
Post #348


Senior Member
****

Group: Members
Posts: 1883
Joined: 13-September 05
Member No.: 497



QUOTE (PaulH51 @ Sep 3 2019, 03:07 PM) *
I'm seeking a little help, as I used to use Joe's page for camera pointing information...

What kind of pointing information are you looking for?


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
PaulH51
post Sep 6 2019, 04:17 AM
Post #349





QUOTE (mcaplinger @ Sep 6 2019, 07:36 AM) *
What kind of pointing information are you looking for?

I'll check the link from atomoid, but I'm looking for the compass bearing of the camera when the image was acquired and, if possible, the elevation of the camera. I used to get this from Joe's page; it was very useful for finding targets in the NavCam stereo pairs and for working out distance and approximate scale using AlgorimancerPG. I eventually found this target in the NavCams, though it proved a little difficult because the target was somewhat underexposed in the NavCam images.
mcaplinger
post Sep 6 2019, 09:42 PM
Post #350





QUOTE (atomoid @ Sep 5 2019, 10:28 AM) *
I haven't attempted such ambitious ends myself, but it seems "camera_origin" and "camera_vector" along with "rover_attitude" may help.

This is a highly informed guess, but I have never done this specifically myself, so I may be mistaken about the details.

camera_origin is the position of the camera in the rover frame (the so-called RNAV frame, X pointing forward, Z pointing down); it varies only slightly as a function of pointing and can probably be ignored. camera_vector is the camera pointing vector in the rover frame.

To get an actual compass direction you have to account for how the rover is oriented. That's given in rover_attitude, a quaternion that transforms from RNAV to the local level (LL) frame, where the X axis points north and the Z axis points at the center of Mars.

rover_xyz is the position of the rover in the "site frame", which is zeroed every once in a while when dead-reckoning errors have built up beyond some threshold (or something like that; I don't really understand the details, but which site is current is also in the JSON).

In SPICE it would take a few function calls to turn rover_attitude into a rotation matrix (q2m) and transform camera_vector into LL (mxv), and then it's just some trig to go from the vector to az/el (or SPICE reclat and some arithmetic.)

CODE
import spiceypy as spice

def azel(camera_vector, rover_attitude):
    # rover_attitude is the unit quaternion taking RNAV to local level (LL)
    m = spice.q2m(rover_attitude)    # quaternion -> rotation matrix
    v = spice.mxv(m, camera_vector)  # camera vector expressed in LL
    rad, az, el = spice.reclat(v)    # rectangular -> latitudinal coords
    el = -el                         # LL +Z points down, so flip elevation
    return az, el
I'm less sure what frame the CAHVOR ( https://agupubs.onlinelibrary.wiley.com/doi...29/2003JE002199 ) parameters are in (C looks like just a copy of camera_origin), but you don't need those just to get the pointing; they describe the camera field of view, distortion, etc., and can be safely ignored unless you're doing detailed photogrammetry.
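Since the original poster would rather avoid installing the NAIF/SPICE toolkit, the same calculation can be sketched in plain Python. The quaternion-to-matrix formula below follows the layout of SPICE's q2m (scalar-first quaternion), but the sign conventions are a best guess and should be sanity-checked against a target with a known bearing:

```python
import math

def q2m(q):
    """Rotation matrix from a scalar-first quaternion (q0, q1, q2, q3),
    laid out the same way as SPICE's q2m."""
    q0, q1, q2, q3 = q
    return [
        [1 - 2*(q2*q2 + q3*q3), 2*(q1*q2 + q0*q3),     2*(q1*q3 - q0*q2)],
        [2*(q1*q2 - q0*q3),     1 - 2*(q1*q1 + q3*q3), 2*(q2*q3 + q0*q1)],
        [2*(q1*q3 + q0*q2),     2*(q2*q3 - q0*q1),     1 - 2*(q1*q1 + q2*q2)],
    ]

def mxv(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def azel(camera_vector, rover_attitude):
    """Azimuth (degrees east of north) and elevation (degrees above the
    horizon) of camera_vector after rotating it into the local-level
    frame (X north, Y east, Z toward the centre of Mars)."""
    v = mxv(q2m(rover_attitude), camera_vector)
    r = math.sqrt(v[0]**2 + v[1]**2 + v[2]**2)
    az = math.degrees(math.atan2(v[1], v[0])) % 360.0
    el = -math.degrees(math.asin(v[2] / r))  # LL +Z points down, so negate
    return az, el
```

camera_origin is deliberately ignored here; as noted above, it varies only slightly and barely affects the direction of distant targets.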



atomoid
post Sep 6 2019, 10:54 PM
Post #351





Sounds like a great start!
I hadn't recently looked at Joe's curiosityrover page, which up until 7/10/2019 so nicely converted all those numbers into digestible bearing/elevation terms and mapped them out. He also references SPICE on the about page but doesn't go into much detail. Unless I overlooked it, I didn't see anything useful to extract from the page's JavaScript resources.
On that note, here's to wishing for an open-source collaboration to create a web interface that integrates the best of midnightplanets, curiosityrover, and other sites into a unified browsing system, just in time for the 2020 rover (well, actually, why not make it extensible to process data from any mission?).
PaulH51
post Sep 7 2019, 02:00 AM
Post #352





QUOTE (atomoid @ Sep 7 2019, 06:54 AM) *
Sounds like a great start!
On that note here's to wishing for an opensource collaboration to create a web interface that integrates all the best of midnightplanets, curiosityrover and other sites into a unified browsing system just in time for the 2020 rover (well actually why not make it extendable to process data from any mission).

I've read section 3.3 of the document in the first reply; there's some good stuff in there, but sadly a lot of it goes over my head. I did look at SPICE and NAIF many years ago, including some of the specifications that talked about sites and how the rover resets its bearings whenever it drifts out of specification, but making any sense of it looks to be beyond my limited skill set.

I also hope something can be done for 2020, but I'd wager the JPL raw servers will likely be a copy of the new MSL front end, which has fewer features than the one it replaced.

I know there is an online distance and measuring tool in the PDS that does a splendid job on the stereo pair images from Curiosity; hopefully something like that will be provided for 2020. I guess it would be asking too much for that tool to work on the raw data that's not yet in the PDS.

I'm not sure what info will be provided with raw Mastcam-Z frames. I hope we get the zoom count and focus count, similar to how we get Curiosity's MAHLI focus-motor counts; that way users may be able to work out what zoom was selected, and also get a rough idea of distance for the in-focus parts of the image.

I'm not sure if AlgorimancerPG will be updated for the new cameras on 2020, but I hope it is, or at least that someone creates a similar utility, as I find it very useful.

I'm not sure at this time what impact the wider angle 2020 cameras will have on some of the simpler stitching tools for creating mosaics. I'm sure certain members will have no major issues with the tools they use.

I guess it's just a case of wait and see...

Thanks for pointing me in the right direction guys, really appreciate it...
PaulH51
post Sep 9 2019, 11:17 AM
Post #353





From Curiosity, post-conjunction, sol 2520
Attached Image
fredk
post Sep 12 2019, 08:17 PM
Post #354


Senior Member
****

Group: Members
Posts: 3940
Joined: 17-January 05
Member No.: 152



QUOTE (atomoid @ Sep 6 2019, 11:54 PM) *
On that note here's to wishing for an opensource collaboration to create a web interface that integrates all the best of midnightplanets, curiosityrover and other sites into a unified browsing system just in time for the 2020 rover (well actually why not make it extendable to process data from any mission).

Fabulous idea. I'd hope I could chip in somehow. Automating the (crude) deBayering of Bayered images would be simple.
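As a sketch of just how simple the crude approach can be: a half-resolution demosaic that collapses each 2x2 Bayer cell into one RGB pixel, written with NumPy. The RGGB cell layout is an assumption and should be checked against the actual sensor before trusting the colours:

```python
import numpy as np

def crude_debayer(raw, pattern="RGGB"):
    """Half-resolution demosaic: each 2x2 Bayer cell becomes one RGB pixel.
    The two green samples in the cell are averaged. RGGB is an assumed
    layout; verify it against the real sensor before relying on it."""
    if pattern != "RGGB":
        raise NotImplementedError("only RGGB is sketched here")
    r  = raw[0::2, 0::2].astype(np.float32)   # red sites
    g1 = raw[0::2, 1::2].astype(np.float32)   # green sites, even rows
    g2 = raw[1::2, 0::2].astype(np.float32)   # green sites, odd rows
    b  = raw[1::2, 1::2].astype(np.float32)   # blue sites
    rgb = np.stack([r, (g1 + g2) / 2.0, b], axis=-1)
    return np.clip(rgb, 0, 255).astype(np.uint8)
```

A proper demosaic would interpolate up to full resolution, but for quick browsing of raw frames something this crude is often enough.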
PaulH51
post Sep 16 2019, 04:14 AM
Post #355





Looking at the sol 2527 thumbnails, it looks like we had a drill attempt at 'Glen Etive 2' after a drill press on sol 2525. Hopefully we'll get the full frames soon so we can see if it was successful smile.gif

EDIT: Looks good to me smile.gif GIF using sol 2526 and 2527 HazCam frames for a before and after
Attached Image
PaulH51
post Sep 17 2019, 05:44 AM
Post #356





Does anyone know why we haven't had any ChemCam 'ENHANCED Data Products' since conjunction, only the normal ChemCam images?
