Juno perijoves 2 and 3, October 19 and December 11, 2016
Post #1 | Oct 26 2016, 04:44 PM | Junior Member (Group: Members, Posts: 23, Joined: 13-October 13, Member No.: 7013)
A lot has happened and it seemed like a good time to start a new post. We will be staying in 53 day orbits until the project has a full understanding of the risks that may or may not be associated with reducing the orbit period to 14 days per our previous plan.
Post #2 | Feb 3 2020, 11:48 PM | IMG to PNG GOD (Group: Moderator, Posts: 2254, Joined: 19-February 04, From: Near fire and ice, Member No.: 38)
There is also clearly some misalignment far from the limb that can be seen by 'blinking' the red/green/blue channels rapidly in high-contrast areas. The misalignment should be smaller. One important issue is that the x (sample) coordinate of the optical axis is *not* at the center of the framelets (i.e. not at x=824). Depending on how you are creating these images this may matter, but I'm not sure it's enough to cause all of the misalignment (in particular not the misalignment near the center of your images).
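A minimal sketch of the point about the optical axis: the IK-style iterative radial undistortion should be applied relative to the optical-axis pixel, not the framelet center. The constants below are illustrative placeholders, not authoritative values — take cx, cy, k1, k2 from the current JunoCam instrument kernel.

```python
# Sketch: IK-style radial undistortion measured from the optical axis.
# CX, CY, K1, K2 are illustrative placeholders -- use the values from
# the current JunoCam IK, not these.
CX, CY = 814.21, 78.14        # optical-axis pixel; note CX != 824
K1, K2 = -5.96e-08, 2.74e-14  # radial distortion coefficients

def undistort(px, py, iters=5):
    """Map a distorted framelet pixel to ideal (undistorted) coordinates,
    working relative to the optical axis rather than the framelet center."""
    cam_x, cam_y = px - CX, py - CY
    xd, yd = cam_x, cam_y
    for _ in range(iters):    # fixed-point iteration, as in the IK code
        r2 = xd * xd + yd * yd
        dr = 1.0 + K1 * r2 + K2 * r2 * r2
        xd, yd = cam_x / dr, cam_y / dr
    return xd, yd
```

Using the geometric center (824) instead of the true optical-axis sample shifts every undistorted coordinate, which is largest far from the axis — consistent with misalignment growing away from the image center.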
Post #3 | Feb 4 2020, 10:58 PM | Member (Group: Members, Posts: 425, Joined: 18-September 17, Member No.: 8250)
QUOTE ... 'blinking' the red/green/blue channels rapidly in high contrast areas.

Doh. I should already have been doing this. I've just been looking at blue/red fringing in high-contrast areas. Björn, are you using the standard camera model, or have you developed your own?
Post #4 | Feb 7 2020, 01:01 AM | IMG to PNG GOD (Group: Moderator, Posts: 2254, Joined: 19-February 04, From: Near fire and ice, Member No.: 38)
QUOTE Björn, are you using the standard camera model, or have you developed your own?

This depends on how you define "...using the standard camera model". I'm using software written by myself for the geometric processing (reprojecting the framelets to a simple cylindrical and/or polar map etc.). However, some of the code is directly based on information/code from the IK kernel, in particular the distort/undistort code, the location of R/G/B on the CCD, the FOV etc. I'm also using the SPICE toolkit. It works wonderfully now, especially after lots of improvements I made in November and December 2019 (what was supposed to be a minor improvement in early November triggered a flood of new ideas for improving the software, resulting in faster processing, improved/proper flatfielding, a new (?) function for removing limb darkening, better photometric parameters, an empirical model of the skylight illumination near the terminator, easier limb fits etc.).

The only issue I'm working on now is the value I need to add to START_TIME. Using a fairly large sample of images it has become absolutely clear that in my case the average value I need to use is lower than the correct value (0.068). The average value I use is close to 0.040 but it varies and is sometimes close to 0.068. This means that something is wrong. This has no visual effect though and is in that sense not a serious problem (in particular there is negligible misalignment between the R/G/B channels or adjacent framelets from the same color channel). I can think of at least three plausible reasons for this (in fact all of them might contribute to the error) and now that I'm almost finished with the PJ24 images (at least for the time being) I plan to take a detailed look at this issue.
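For context, the framelet timing being discussed is usually assembled as sketched below: the label START_TIME plus a fixed bias, plus a per-framelet multiple of the interframe delay (also with a small bias). The bias values shown are the ones quoted later in this thread from juno_junocam_v03.ti; verify them against your own kernel.

```python
# Sketch: framelet start time per the usual JunoCam IK convention.
# The bias values below are the ones quoted in this thread from
# juno_junocam_v03.ti; check your own kernel before relying on them.
START_TIME_BIAS = 0.06188        # s, added to the label START_TIME
INTERFRAME_DELAY_BIAS = 0.001    # s, added to the label INTERFRAME_DELAY

def framelet_time(start_time_et, interframe_delay, framelet_index):
    """Ephemeris time at which the given framelet starts exposing."""
    return (start_time_et + START_TIME_BIAS
            + framelet_index * (interframe_delay + INTERFRAME_DELAY_BIAS))
```

An error in the assumed bias shifts all framelets of an image together, which is why a wrong value can leave R/G/B alignment intact while still being measurable against the limb.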
Post #5 | Feb 7 2020, 06:55 AM | Member (Group: Members, Posts: 425, Joined: 18-September 17, Member No.: 8250)
QUOTE ... some of the code is directly based on information/code from the IK kernel, in particular the distort/undistort code,

That is what I was referring to. I and (I believe) Gerald have our own distort/undistort code, Kevin (I believe) is using the standard code via ISIS, but I didn't know if you had your own or used the standard code.

QUOTE ... in faster processing, improved/proper flatfielding, a new (?) function for removing limb darkening, better photometric parameters, an empirical model of the skylight illumination near the terminator, easier limb fits etc.).

Awesome. Wish I had time to do a proper illumination removal model. When you combine multiple images, is there residual brightness variation that needs to be adjusted to eliminate boundaries between images?

QUOTE The only issue I'm working on now is the value I need to add to START_TIME. Using a fairly large sample of images it has become absolutely clear that in my case the average value I need to use is lower than the correct value (0.068). The average value I use is close to 0.040 but it varies and is sometimes close to 0.068.

Note: the START_TIME_BIAS in juno_junocam_v03.ti is 0.06188, not 0.068. My start times (based on limb fit) range from 0.03 to 0.08.
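On the seam question above: one common way to suppress boundaries between combined images (not necessarily what anyone in this thread actually does) is to solve a per-image multiplicative gain from the brightness ratio in each overlap region. A rough sketch, with illustrative function and parameter names:

```python
import numpy as np

# Sketch: chain per-image gains from the median brightness ratio in the
# overlap of consecutive images on a common map grid. Illustrative only.
def overlap_gains(images, masks):
    """images: list of 2D float arrays; masks: matching validity masks."""
    gains = np.ones(len(images))
    for i in range(1, len(images)):
        both = masks[i] & masks[i - 1]          # overlap region
        if both.any():
            # Scale image i so its overlap matches its (already scaled)
            # predecessor; the median resists outliers at the seam.
            gains[i] = np.median(images[i - 1][both] * gains[i - 1]
                                 / np.maximum(images[i][both], 1e-9))
    return gains
```

A multiplicative gain only removes exposure- or photometry-scale differences; residual gradients from imperfect illumination removal would still need blending or a spatially varying correction.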
Post #6 | Feb 7 2020, 11:25 AM | Senior Member (Group: Members, Posts: 2346, Joined: 7-December 12, Member No.: 6780)
QUOTE That is what I was referring to. I and (I believe) Gerald have our own distort/undistort code...

Yes, that's so. I've implemented almost everything from scratch, down to pixel operations in RAM or basic 3D libraries. The only essential thing I didn't implement for JunoCam images is the determination of the s/c trajectory from images. Instead, I'm using the NAIF/SPICE spy.exe utility to save trajectory data to text files. And for movie productions or image file conversions I usually make use of ffmpeg. (I've worked on reading some of the image file formats professionally on a binary level for long enough, so that stuff is less interesting from my point of view.)

[Spoiler alert: Don't follow the links below if you are ambitious to find your own solutions independently.]

During EPSC2018, I explained one of several possible approaches to in-flight geometrical camera calibration on the basis of Marble Movie images: PDF version PPT version (with animations)

In contrast to some of the JunoCam IK documentation, I'm working with the CCD pixel layout. One advantage is the easy adjustment to the changed position of the readout region for the methane-band images, shifted by 90 pixels. But otherwise, the more recent IKs (instrument kernels) are looking good. My assumptions about camera pointing are imperfect, and I'm doing some manual limb fitting for each image. Any significant reduction of manual work would only be achieved if automated pointing data were better than what manual adjustment can do. So, further refinement of automated pointing currently isn't my top priority.

And here are links to an EPSC2017 talk about polynomial illumination adjustment over the cosines of the incidence and emission angles: PDF version PPT version

There is still some room for further improvements. I might prepare such a solution for EPSC2020, if time allows. Note also that Jupiter's appearance is changing, and so are best-fit illumination models.
But my current focus is on understanding the dynamics of the atmosphere. I presume that this will also be required for an accurate understanding of light scattering and the corresponding models. So there is significant effort in software development and test runs, and less time for discussions. Some of the results may be worth releasing in formal journal papers, the preparation of which takes time, too. As usual, I'm interested in an in-depth understanding of all theoretical, methodological and technical details, as seamlessly as possible. And learning means inventing, or re-inventing if partial solutions already exist.

Regarding the limb fit: note that the opacity of the hazes, and hence the apparent limb, varies with latitude. This applies to Jupiter's equipotential surface wrt a spheroid, too. (AFAIK, full detail of the latter isn't publicly accessible yet.)
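A polynomial illumination adjustment over cos(incidence) and cos(emission), as mentioned above, could look roughly like this least-squares sketch. The degree and basis here are my own illustrative choices, not the EPSC2017 method itself:

```python
import numpy as np

# Sketch: fit a low-order polynomial in mu0 = cos(incidence) and
# mu = cos(emission) to observed brightness, then divide it out to
# flatten the illumination. Degree/basis are illustrative choices.
def _basis(mu0, mu, degree):
    return np.stack([mu0**i * mu**j
                     for i in range(degree + 1)
                     for j in range(degree + 1 - i)], axis=-1)

def fit_illumination(mu0, mu, brightness, degree=2):
    """Least-squares polynomial coefficients over (mu0, mu)."""
    coeffs, *_ = np.linalg.lstsq(_basis(mu0, mu, degree),
                                 brightness, rcond=None)
    return coeffs

def evaluate(mu0, mu, coeffs, degree=2):
    """Model brightness at the given geometry."""
    return _basis(mu0, mu, degree) @ coeffs
```

Dividing the observed brightness by `evaluate(...)` gives an illumination-flattened image; as the post notes, the best-fit coefficients drift as Jupiter's appearance changes, so they should be refit per epoch.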