Beginner level projection
Brian Swift
post Nov 10 2019, 08:19 AM
Post #16

Having a non-cartographic technique that renders, rather than discards, pixels that don't intercept Jupiter makes it possible to create images like this one from PJ8_109, showing both a moon (Europa?) and Jupiter.
Attached Image

Brian Swift
post Nov 10 2019, 08:41 AM
Post #17

QUOTE (mcaplinger @ Nov 7 2019, 03:24 PM) *
Our measurements show that the timing for that image was off by 3 pixels or 3*3.2 milliseconds.

I'm curious: what is your process for estimating the error? I'm just doing Jupiter limb fits.
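For scale, the figure quoted above (3 pixels, or 3 × 3.2 ms) implies a direct conversion between a limb-fit pixel offset and a start-time correction. A minimal sketch, treating 3.2 ms per pixel of spin as an assumption lifted from that quote rather than an official camera constant:

```python
# Convert a limb-fit pixel offset into a start-time correction.
# The 3.2 ms/pixel figure is taken from the quoted post above; treat it
# as an assumption, not an official JunoCam constant.
MS_PER_PIXEL = 3.2

def pixel_offset_to_dt(pixels):
    """Timing correction in seconds for a limb shift of `pixels` pixels."""
    return pixels * MS_PER_PIXEL / 1000.0

print(pixel_offset_to_dt(3))  # the 3-pixel case from the quote, ~9.6 ms
```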
mcaplinger
post Nov 10 2019, 03:15 PM
Post #18

QUOTE (Brian Swift @ Nov 10 2019, 12:19 AM) *
Having a non-cartographic technique that renders rather than discards pixels that don't intercept Jupiter supports creation of images like this showing both moon (Europa?) and Jupiter from PJ8_109.

In theory (I've never tried it) it would be possible to remove the camera distortions and stitch the framelets together accounting for spacecraft spin without calling sincpt at all or doing any intercept calculations.
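That suggestion can be sketched with plain NumPy: undo the spin accumulated between framelets so every view vector lands in framelet 0's frame, with no intercept calculations at all. The 2 rpm spin rate and the interframe time below are illustrative assumptions; real code would take both from the kernels and the image metadata.

```python
import numpy as np

# Toy despin, assuming rotation about +z at Juno's nominal ~2 rpm and an
# assumed interframe time of ~0.37 s (both illustrative, not kernel values).
SPIN_RATE = 2.0 * 2.0 * np.pi / 60.0  # rad/s for a 2 rpm spinner
FRAME_DT = 0.371                      # seconds between framelets (assumed)

def despin(v, framelet_index):
    """Rotate a view vector of framelet N back into framelet 0's frame."""
    a = -SPIN_RATE * FRAME_DT * framelet_index  # spin angle to undo
    c, s = np.cos(a), np.sin(a)
    rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return rz @ v

# A direction seen one framelet later, after the spacecraft has rotated by
# one step, maps back onto the original direction:
v0 = np.array([1.0, 0.0, 0.0])
a = SPIN_RATE * FRAME_DT
spun = np.array([np.cos(a), np.sin(a), 0.0])  # v0 as seen one framelet later
print(np.allclose(despin(spun, 1), v0))  # prints: True
```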


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
adamg
post Nov 10 2019, 05:15 PM
Post #19

Absolutely, Brian: that's exactly the behaviour I'm looking for; the planet edge looks far more realistic than in sphere-mapped images. That's why I picked the test image I did: it has that moon poking out too. Normally I would just have used your script, but I think I've become too fascinated with the way SPICE can partition work between groups not to spend a bit more time with it.
mcaplinger
post Nov 10 2019, 07:52 PM
Post #20

QUOTE (adamg @ Nov 6 2019, 03:50 PM) *
rotationMatrix = spice.pxform('JUNO_JUNOCAM','IAU_JUPITER',et-jupiterLt)

BTW, this isn't the right way to do this. The right way is to use pxfrm2, which takes two different epochs. See the example in the comments for pxfrm2. That may be the source of your error.

I have to confess that my processing chain uses pxform and doesn't correct for the speed of light at all. When it was originally written for low Mars orbiters, this made very little difference. For Juno it makes a little more difference near perijove, and progressively more and more towards apojove.
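A back-of-envelope illustration of the difference (plain NumPy, no SPICE calls, round numbers assumed): during the one-way light time, Jupiter rotates, so evaluating the IAU_JUPITER orientation at et instead of et - lt displaces surface features, slightly near perijove and much more at longer ranges.

```python
import numpy as np

# Back-of-envelope check of the pxform vs pxfrm2 difference. The radius
# and spin period are round published values; the ranges are assumptions.
C_KM_S = 299792.458
R_JUP = 71492.0                          # Jupiter equatorial radius, km
OMEGA = 2.0 * np.pi / (9.925 * 3600.0)   # spin rate, rad/s (~9.925 h period)

def equator_shift_km(range_km):
    """Equatorial surface displacement if the light-time epoch is ignored."""
    lt = range_km / C_KM_S        # one-way light time, s
    return OMEGA * lt * R_JUP     # rotation during that interval, km

print(equator_shift_km(75000.0))  # near perijove: a few km
print(equator_shift_km(8.0e6))    # towards apojove: hundreds of km
```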


adamg
post Nov 12 2019, 05:59 PM
Post #21

Thanks mcaplinger. I've changed it to the following, with the terrible frac in there to line the sphere thing up with the visible edge; I think you can tell what I'm up to from the plot now.

[jupiterPos, jupiterLt] = spice.spkpos('JUPITER', et0, 'IAU_JUPITER', 'NONE', 'JUNO')
pos = frac*direction*jupiterDistance/np.linalg.norm(direction)
rotationMatrix = spice.pxfrm2('JUNO_JUNOCAM','IAU_JUPITER',et, et-jupiterLt*frac)
pos = spice.mxv(rotationMatrix, pos)
pos -= jupiterPos

I've reasoned that the first pos assignment puts the point where it should be in the camera frame, the rotation does all the light-time magic, and the subtraction translates the resulting vector to Jupiter. It seems to work, but it would be comforting if someone could say whether this is nonsense that worked by chance. My use of the word "magic" gives an idea of my proficiency in these things.

I switched to the juno_sc_rec_170831_170902_v01.bc kernel like you said, and I had to add 10 ms to get the edge to line up, which agrees with what you said earlier.

Brian, I see your code uses limbpt to find the limb; is that the easiest way, do you think?

Many thanks


Adam
Attached thumbnail(s)
Attached Image
 
adamg
post Nov 12 2019, 10:52 PM
Post #22

I tried the limbpt function (below), though I don't really understand the half-plane thing, hence stuffing in the same z vector the example code uses. It kind of lines up: I've added the points it returns in red, and the view is from the underside. I'm assuming the difference comes from the non-roundness of the planet.

limbRet = spice.limbpt('TANGENT/ELLIPSOID','JUPITER',et0,'IAU_JUPITER','LT+S','CENTER','JUNO',[0.0,0.0,0.1],0.01,100,1e-4,1e-6,100)
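As a sanity check on any limb fit, the sphere approximation gives the limb's angular size directly: from range d, a sphere of radius R subtends a limb at angular radius asin(R/d) about the center direction. A toy example with illustrative numbers (no substitute for limbpt, which handles the ellipsoid and aberration corrections):

```python
import math

# Sphere-approximation limb geometry; the radius is Jupiter's published
# equatorial value, the range is an illustrative number.
def limb_angular_radius_deg(R_km, d_km):
    """Angular radius of a sphere's limb as seen from range d_km."""
    return math.degrees(math.asin(R_km / d_km))

print(limb_angular_radius_deg(71492.0, 700000.0))  # about 5.9 degrees
```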

Attached thumbnail(s)
Attached Image
 
mcaplinger
post Nov 12 2019, 11:09 PM
Post #23

QUOTE (adamg @ Nov 12 2019, 02:52 PM) *
I'm assuming that the difference is from the non-roundness of the planet

limbpt is supposed to use the spheroid, so this doesn't seem like a good explanation. In any case I use edlimb, not limbpt (warning: speed of light not properly handled):
CODE
to_targ = spkezr(targ, t, "iau_"+targ, "LT+S", "juno")[0][0:3]
e = edlimb(radii[0], radii[1], radii[2], vminus(to_targ))
c = pxform("iau_"+targ, "juno_junocam", t)
org = vadd(e.center, to_targ)
for th in range(-1800, 1800):
    p = vadd(org, vadd(vscl(math.cos(math.radians(th/10.0)), e.semi_major),
                       vscl(math.sin(math.radians(th/10.0)), e.semi_minor)))
    x, y = junocamlib.vector2xy(mxv(c, p), band)


Brian Swift
post Nov 12 2019, 11:44 PM
Post #24

QUOTE (adamg @ Nov 12 2019, 09:59 AM) *
Brian, I see your code uses limbpt to find the limb, is that the easiest way do you think?

I don't recall being aware of edlimb when I developed the code that uses limbpt. edlimb looks like a simpler interface and is probably much faster than limbpt. However, I notice the edlimb example applies a light-time correction relative to Jupiter's center. In limbpt, specifying the aberration correction locus corloc="ELLIPSOID LIMB" applies the light-time correction to each limb point. Since the distance to the limb can be as low as 22,000 km, there can be a difference in light times of 151 ms, which is equivalent to about 47 pixels.
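The arithmetic behind those figures can be checked directly. The ~67,300 km range to Jupiter's center below is an illustrative value consistent with the quoted numbers, not something read from a kernel:

```python
# Light-time difference between Jupiter's center and a nearby limb point,
# and the equivalent smear in pixels at ~3.2 ms per pixel of spin.
# center_km is an assumed perijove range; limb_km is from the post above.
C_KM_S = 299792.458
center_km = 67300.0
limb_km = 22000.0

dlt_ms = (center_km - limb_km) / C_KM_S * 1000.0  # light-time difference, ms
pixels = dlt_ms / 3.2                             # spin smear in pixels
print(round(dlt_ms), round(pixels))  # prints: 151 47
```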

When everything is working correctly (and you have a good estimate for the image start time), you should be able to identify SPICE limb pixels on both the moon and Jupiter. (Stay aware that SPICE limb pixels on Jupiter may be slightly within the JunoCam visible limb.)

When the "Limb Marking" code in the Juno3D pipeline is enabled, it marks pixels along
the limb in the raw data prior to processing, which can be handy for debugging and
start time jitter analysis.

adamg
post Nov 14 2019, 11:34 PM
Post #25

Update: I think my issue is that the transform doesn't do the light-time stuff I was assuming, so the first branch uses the apparent position and the second doesn't. I've tried shifting everything by the difference between the corrected and uncorrected limb to make it equivalent to the apparent position, but no luck yet :-/
---
I've been looking at this pretty carefully and I'm not seeing my error. When I map this back to the Juno location it comes out evenly spaced (so I'm mapping back to the same spot), but there's a gap: the result from one branch is being mapped slightly off the other. If someone could give me a hint, that would be great. I used the LT method thinking it would match method-wise, but I tried a few and it makes very little difference.

Thanks for any help.


CODE
[jupiterPos, jupiterLt] = spice.spkpos('JUPITER', et0, 'IAU_JUPITER', 'NONE', 'JUNO')
frac = 0.63
jupiterDistance = np.linalg.norm(jupiterPos)

# This fragment has the pixel direction vector already calculated as "v"
with spice.no_found_check():
    [point, trgepc, srfvec, found] = spice.sincpt(
        'Ellipsoid', 'JUPITER', et,
        'IAU_JUPITER', 'LT', 'JUNO',
        'JUNO_JUNOCAM', v)
if found:
    cloudInd = (thisSlice - sliceLow)*1648*128 + y*1648 + x
    pointCloud[cloudInd, :] = point
else:
    direction = np.array(v)
    pos = frac*direction*jupiterDistance/np.linalg.norm(direction)
    rotationMatrix = spice.pxfrm2('JUNO_JUNOCAM', 'IAU_JUPITER', et, et - jupiterLt*frac)
    pos = spice.mxv(rotationMatrix, pos)
    pos -= jupiterPos
    cloudInd = (thisSlice - sliceLow)*1648*128 + y*1648 + x
    pointCloud[cloudInd, :] = pos

Attached thumbnail(s)
Attached Image
 
adamg
post Today, 12:57 AM
Post #26

Took a while. It wasn't light travel time, because the error that introduces is too small. It was that I had hand-fitted the limb, which gave a timing error, which in turn gave an angle offset from the spacecraft spin; doing a limb search got the timing to line up. I scaled everything by its dot product with the Jupiter vector so it ended up on a plane. I dropped the extra dimension by taking the first row as one basis vector, taking a column as another, making the pair orthonormal, and then projecting everything onto those vectors. That gave the first picture, and I did a KD nearest-neighbour search to rasterize it (second image). I rather like the projection to a plane, as I feel it makes it more like a real single camera; at least it will behave just like a pinhole camera.

The projection to an imaginary Juno-centered sphere is nice because it doesn't need any SPICE ray projections, so it takes no time, and as I'm unprojecting it from the same point of view, the Jupiter mapping step isn't actually doing anything (other than some super-serious finding of errors!). I feel the sphere projection -> plane -> dimension reduction can easily be collapsed into a single stage, so you wouldn't actually have to do the imaginary-sphere thing (though honestly it's not that bad as it is). That should make it much faster than what I have, so a super simple pipeline could use all the hard work that went into the SPICE kernels to make nicely stitched pictures. I'm sure the nearest-neighbour search can be improved on too, because the data has structure so you always know which direction will get you closer; does anyone know of an implementation of such a thing?
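The plane-projection-plus-dimension-reduction step described above can be sketched as follows. This is a hypothetical reconstruction, not the actual script: the function name and sample data are made up, and the 2-D basis is built by Gram-Schmidt from the Jupiter direction (assumed not parallel to the x-axis seed vector, with all points in front of the camera):

```python
import numpy as np

# Sketch: scale each point onto the plane perpendicular to the Jupiter
# direction (pinhole-camera style), build an orthonormal 2-D basis in that
# plane by Gram-Schmidt, and project onto it.
def project_to_plane(points, jup_dir):
    """Map 3-D view vectors onto 2-D coordinates on a plane at unit distance."""
    n = jup_dir / np.linalg.norm(jup_dir)
    # Scale each point so its component along n is 1.
    scaled = points / (points @ n)[:, None]
    # Gram-Schmidt: remove the n-component from a seed vector, normalize.
    seed = np.array([1.0, 0.0, 0.0])
    u = seed - (seed @ n) * n
    u /= np.linalg.norm(u)
    v = np.cross(n, u)  # completes the orthonormal basis in the plane
    return np.stack([scaled @ u, scaled @ v], axis=1)

pts = np.array([[0.1, 0.0, 1.0], [0.0, 0.2, 1.0]])
uv = project_to_plane(pts, np.array([0.0, 0.0, 1.0]))
print(uv)  # rows are (u, v) image coordinates: [[0.1, 0.0], [0.0, 0.2]]
```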
Attached thumbnail(s)
Attached Image

Attached Image
 
