Beginner level projection
Brian Swift
post Nov 10 2019, 08:19 AM
Post #16


Member
***

Group: Members
Posts: 403
Joined: 18-September 17
Member No.: 8250



Having a non-cartographic technique that renders rather than discards pixels that don't intercept Jupiter supports creation of images like this showing both moon (Europa?) and Jupiter from PJ8_109.

Attached Image

Brian Swift
post Nov 10 2019, 08:41 AM
Post #17





QUOTE (mcaplinger @ Nov 7 2019, 03:24 PM) *
Our measurements show that the timing for that image was off by 3 pixels or 3*3.2 milliseconds.

I'm curious, what is your process to estimate the error? I'm just doing Jupiter limb fits.
mcaplinger
post Nov 10 2019, 03:15 PM
Post #18


Senior Member
****

Group: Members
Posts: 2507
Joined: 13-September 05
Member No.: 497



QUOTE (Brian Swift @ Nov 10 2019, 12:19 AM) *
Having a non-cartographic technique that renders rather than discards pixels that don't intercept Jupiter supports creation of images like this showing both moon (Europa?) and Jupiter from PJ8_109.

In theory (I've never tried it) it would be possible to remove the camera distortions and stitch the framelets together accounting for spacecraft spin without calling sincpt at all or doing any intercept calculations.
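To make that spin-only idea concrete, here is a toy sketch (mine, not from any JunoCam pipeline): express every framelet's pixel direction in the camera frame of a single reference epoch by rotating it about the spin axis through the angle the spacecraft turned between exposures. It assumes an idealized fixed spin axis and Juno's roughly 2 rpm spin rate, and does no intercept calculations at all.

```python
import numpy as np

def rodrigues(v, axis, angle):
    """Rotate vector v about the unit vector axis by angle (radians)."""
    axis = axis / np.linalg.norm(axis)
    return (v * np.cos(angle)
            + np.cross(axis, v) * np.sin(angle)
            + axis * np.dot(axis, v) * (1.0 - np.cos(angle)))

SPIN_RATE = 2.0 * 2.0 * np.pi / 60.0  # ~2 rpm in rad/s (assumed, idealized)

def framelet_dir_at_epoch(pixel_dir, t_framelet, t_ref, spin_axis):
    """Express a framelet pixel direction in the camera frame at t_ref,
    accounting only for spacecraft spin between the two epochs."""
    return rodrigues(pixel_dir, spin_axis, SPIN_RATE * (t_framelet - t_ref))

# Example: a framelet exposed 0.37 s after the reference epoch
axis = np.array([0.0, 0.0, 1.0])
d = np.array([0.1, 0.0, 1.0])
d_ref = framelet_dir_at_epoch(d, 0.37, 0.0, axis)
# Rotating over the opposite interval recovers the original direction
d_back = framelet_dir_at_epoch(d_ref, 0.0, 0.37, axis)
```

Real code would take the rotation from the CK kernels (e.g. a pxfrm2 call between the two epochs) rather than an idealized constant spin.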


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
adamg
post Nov 10 2019, 05:15 PM
Post #19


Junior Member
**

Group: Members
Posts: 31
Joined: 31-October 19
Member No.: 8699



Absolutely Brian, that's exactly the behaviour I'm looking for; the planet's edge looks far more realistic than in sphere-mapped images. That's why I picked the test image I did: it had that moon poking out too. Normally I would have just used your script, but I think I've become too fascinated with the way SPICE can partition work between groups to not spend a bit more time with it.
mcaplinger
post Nov 10 2019, 07:52 PM
Post #20





QUOTE (adamg @ Nov 6 2019, 03:50 PM) *
rotationMatrix = spice.pxform('JUNO_JUNOCAM','IAU_JUPITER',et-jupiterLt)

BTW, this isn't the right way to do this. The right way is to use pxfrm2, which takes two different epochs. See the example in the comments for pxfrm2. That may be the source of your error.

I have to confess that my processing chain uses pxform and doesn't correct for the speed of light at all. When it was originally written for low Mars orbiters, this made very little difference. For Juno it makes a little more difference near perijove, and progressively more and more towards apojove.
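Demonstrating pxfrm2's two-epoch behaviour for real needs SPICE kernels, but a toy analog with idealized frames spinning about +Z shows the distinction: a pxfrm2-style transform evaluates each frame's orientation at its own epoch and goes through the inertial frame, while a pxform-style transform evaluates both frames at one epoch. Everything below is hypothetical, not JunoCam geometry.

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix taking a frame spun by theta about +Z to inertial."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def frame_to_inertial(omega, t):
    """Orientation of a frame spinning at rate omega, evaluated at time t."""
    return rot_z(omega * t)

def pxfrm2_toy(omega_a, et_from, omega_b, et_to):
    """Rotation taking vectors in frame A at et_from to frame B at et_to,
    via the inertial frame: mimics SPICE pxfrm2's two-epoch behaviour."""
    return frame_to_inertial(omega_b, et_to).T @ frame_to_inertial(omega_a, et_from)

def pxform_toy(omega_a, omega_b, et):
    """Single-epoch transform: both frames evaluated at et, like pxform."""
    return pxfrm2_toy(omega_a, et, omega_b, et)
```

With equal epochs the two agree; with a light-time offset on the target-frame epoch they differ, which is exactly the error being discussed.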


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
adamg
post Nov 12 2019, 05:59 PM
Post #21





Thanks mcaplinger, I've changed it to the following, with the terrible frac in there to line the sphere thing up with the visible edge. I think you can tell what I'm up to from the plot now.

CODE
[jupiterPos, jupiterLt] = spice.spkpos('JUPITER', et0, 'IAU_JUPITER', 'NONE', 'JUNO')
pos = frac*direction*jupiterDistance/np.linalg.norm(direction)
rotationMatrix = spice.pxfrm2('JUNO_JUNOCAM', 'IAU_JUPITER', et, et - jupiterLt*frac)
pos = spice.mxv(rotationMatrix, pos)
pos -= jupiterPos

I've reasoned that the first pos assignment gets the point where it should be in the camera frame, the rotation does all the light-time magic, and the subtraction translates the resulting vector to Jupiter. It seems to work, but it would be comforting if someone could say whether this is nonsense that worked by chance. My use of the word "magic" gives an idea of my proficiency in these things.

I switched to the juno_sc_rec_170831_170902_v01.bc kernel like you said, and I had to add 10 ms to get the edge to line up, which agrees with what you said earlier.

Brian, I see your code uses limbpt to find the limb; do you think that's the easiest way?

Many thanks


Adam
Attached thumbnail(s)
Attached Image
 
adamg
post Nov 12 2019, 10:52 PM
Post #22





I tried the limbpt function (below), not that I really understand the half-plane thing, hence stuffing in the same z vector that the example code has. It kind of lines up; I added the points it returns in red, and the view is from the underside. I'm assuming that the difference is from the non-roundness of the planet.

CODE
limbRet = spice.limbpt('TANGENT/ELLIPSOID', 'JUPITER', et0, 'IAU_JUPITER', 'LT+S',
                       'CENTER', 'JUNO', [0.0, 0.0, 0.1], 0.01, 100, 1e-4, 1e-6, 100)

Attached thumbnail(s)
Attached Image
 
mcaplinger
post Nov 12 2019, 11:09 PM
Post #23





QUOTE (adamg @ Nov 12 2019, 02:52 PM) *
I'm assuming that the difference is from the non-roundness of the planet

limbpt is supposed to use the spheroid, so this doesn't seem like a good explanation, but I use edlimb, not limbpt (warning: speed of light not properly handled):
CODE
to_targ = spkezr(targ, t, "iau_"+targ, "LT+S", "juno")[0][0:3]
e = edlimb(radii[0], radii[1], radii[2], vminus(to_targ))
c = pxform("iau_"+targ, "juno_junocam", t)
org = vadd(e.center, to_targ)
for th in range(-1800, 1800):
    p = vadd(org, vadd(vscl(math.cos(math.radians(th/10.0)), e.semi_major),
                       vscl(math.sin(math.radians(th/10.0)), e.semi_minor)))
    x, y = junocamlib.vector2xy(mxv(c, p), band)


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
Brian Swift
post Nov 12 2019, 11:44 PM
Post #24





QUOTE (adamg @ Nov 12 2019, 09:59 AM) *
Brian, I see your code uses limbpt to find the limb, is that the easiest way do you think?

I don't recall being aware of edlimb when I developed the code that uses limbpt. edlimb looks like a simpler interface, and is probably much faster than limbpt. However, I notice the edlimb example applies its light-time correction relative to Jupiter's center. In limbpt, specifying the aberration correction locus corloc="ELLIPSOID LIMB" applies the light-time correction to each limb point. Since the distance to the limb can be as low as 22000 km, there could be a difference in light times of 151 ms, which is equivalent to about 47 pixels.
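The unit conversion in that estimate can be checked directly. The 151 ms difference and the 22000 km minimum limb distance are the figures quoted above, and 3.2 ms per line comes from the timing discussion earlier in the thread:

```python
C_KM_S = 299792.458    # speed of light, km/s
LINE_TIME_S = 3.2e-3   # per-line timing quoted earlier in the thread, s
lt_diff_s = 0.151      # light-time difference quoted above, s

pixels = lt_diff_s / LINE_TIME_S     # ~47, matching "about 47 pixels"
lt_limb_s = 22000.0 / C_KM_S         # for scale: one-way light time over 22000 km, ~73 ms
```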

When everything is working correctly (and you have a good estimate for the image start time), you should be able to identify SPICE limb pixels on both the moon and Jupiter (staying aware that SPICE limb pixels on Jupiter may be slightly inside the JunoCam visible limb).

When the "Limb Marking" code in the Juno3D pipeline is enabled, it marks pixels along
the limb in the raw data prior to processing, which can be handy for debugging and
start time jitter analysis.

adamg
post Nov 14 2019, 11:34 PM
Post #25





Update: I think my issue is that the transform doesn't do the light-time stuff, which I was assuming it did. So the first branch has the apparent position and the second doesn't. I've tried shifting everything by the difference between the corrected and uncorrected limbs to make it equivalent to the apparent position, but no luck yet :-/
---
I've been looking at this pretty carefully and I'm not seeing my error. When I map this back to the Juno location it comes out evenly spaced (so I'm mapping back to the same spot), but there's a gap, so the result from one branch is being mapped a little off from the other. If someone could give me a hint, that would be great. I used the LT method thinking it would match method-wise, but I tried a few and it makes very little difference.

Thanks for any help.


CODE
[jupiterPos, jupiterLt] = spice.spkpos('JUPITER', et0, 'IAU_JUPITER', 'NONE', 'JUNO')
frac = 0.63
jupiterDistance = np.linalg.norm(jupiterPos)

# This fragment has the pixel direction vectors already calculated as "direction"
with spice.no_found_check():
    [point, trgepc, srfvec, found] = spice.sincpt(
        'Ellipsoid', 'JUPITER', et,
        'IAU_JUPITER', 'LT', 'JUNO',
        'JUNO_JUNOCAM', v)
if found:
    cloudInd = (thisSlice - sliceLow)*1648*128 + y*1648 + x
    pointCloud[cloudInd, :] = point
else:
    direction = np.array(v)
    pos = frac*direction*jupiterDistance/np.linalg.norm(direction)
    rotationMatrix = spice.pxfrm2('JUNO_JUNOCAM', 'IAU_JUPITER', et, et - jupiterLt*frac)
    pos = spice.mxv(rotationMatrix, pos)
    pos -= jupiterPos
    cloudInd = (thisSlice - sliceLow)*1648*128 + y*1648 + x
    pointCloud[cloudInd, :] = pos

Attached thumbnail(s)
Attached Image
 
adamg
post Nov 22 2019, 12:57 AM
Post #26





Took a while. It wasn't light travel time, because the error that introduces is too small. It was that I had hand-fitted the limb, which gave a timing error, which gave an angle offset from the spacecraft spin; doing a limb search got the timing to line up. I scaled everything by its dot product with the Jupiter vector so it ended up on a plane. I dropped the extra dimension by taking the first row as one basis vector, taking a column as another vector, making them orthonormal, then projecting everything onto those vectors. That gave the first picture, and I did a KD-tree nearest-neighbour search to rasterize it (second image). I kind of like the projection to a plane, as I feel it makes it more like a real single camera; at least it'll behave just like a pinhole camera.

The projection to an imaginary Juno-centered sphere is nice because it doesn't need any SPICE ray projections, so it takes no time, and since I'm unprojecting from the same point of view, the Jupiter mapping bit isn't actually doing anything (other than some super serious finding of errors!). I feel the sphere projection -> plane -> dimension reduction can easily be made into a single stage, so you wouldn't actually have to do the imaginary sphere thing (though honestly it's not that bad as it is). That should make it way faster than what I have, so a super simple pipeline could use all the hard work that went into the SPICE kernels to make nicely stitched pictures. I'm sure the nearest-neighbour search can be improved on too, because the data has structure, so you always know which direction will get you closer; anyone know of an implementation of such a thing?
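For what it's worth, the scale-onto-a-plane and basis-reduction steps described above fit in a few lines of numpy. This is a sketch of the idea only (the names and the Gram-Schmidt-style basis construction are mine, not the pipeline's): scale each point so its dot product with the plane normal is constant, then express the flattened points in an orthonormal in-plane basis, which behaves like a pinhole camera whose image plane is normal to the Jupiter direction.

```python
import numpy as np

def project_to_plane(points, normal, dist, u_hint):
    """Scale each point onto the plane {p : p.n = dist}, then express it in
    2D coordinates using an orthonormal basis of that plane."""
    n = normal / np.linalg.norm(normal)
    u = u_hint - np.dot(u_hint, n) * n       # Gram-Schmidt: make the hint in-plane
    e1 = u / np.linalg.norm(u)
    e2 = np.cross(n, e1)                     # second in-plane axis, orthonormal to e1
    scale = dist / (points @ n)              # pinhole-style per-point scale onto the plane
    flat = points * scale[:, None]
    return np.column_stack((flat @ e1, flat @ e2))

# Example with axis-aligned numbers: plane z = 1, x axis as the basis hint
P = np.array([[2.0, 0.0, 2.0], [0.0, 3.0, 3.0]])
xy = project_to_plane(P, np.array([0.0, 0.0, 1.0]), 1.0, np.array([1.0, 0.0, 0.0]))
```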
Attached thumbnail(s)
Attached Image

Attached Image
 
scalbers
post Nov 22 2019, 08:07 PM
Post #27


Senior Member
****

Group: Members
Posts: 1624
Joined: 5-March 05
From: Boulder, CO
Member No.: 184



Good to see - I would guess this also makes it easier to show the atmosphere if one is zoomed in near the limb.

QUOTE (Brian Swift @ Nov 10 2019, 08:19 AM) *
Having a non-cartographic technique that renders rather than discards pixels that don't intercept Jupiter supports creation of images like this showing both moon (Europa?) and Jupiter from PJ8_109.

Attached Image


--------------------
Steve [ my home page and planetary maps page ]
adamg
post Nov 24 2019, 01:34 AM
Post #28





Tried doing color; it seems to work. The KD tree was way too slow, so I just idiot-histogrammed it. I probably need a hit-count pass to clean it up a bit, but it'll do for now. Thanks for the help.
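A histogram-style rasterizer like the one described can be done in one pass with np.histogram2d, accumulating a value sum and the hit count in two parallel histograms; this is a guess at the approach, not the actual code:

```python
import numpy as np

def rasterize(xy, values, shape, extent):
    """Average point samples onto a regular grid: one histogram accumulates
    values, a second counts hits per bin, and their ratio is the image."""
    (x0, x1), (y0, y1) = extent
    bins = [np.linspace(x0, x1, shape[0] + 1),
            np.linspace(y0, y1, shape[1] + 1)]
    total, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=bins, weights=values)
    count, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=bins)
    with np.errstate(invalid='ignore'):
        img = total / count              # NaN where no samples landed
    return img

# Three samples on a 2x2 grid: two share a bin and get averaged
pts = np.array([[0.25, 0.25], [0.25, 0.25], [0.75, 0.75]])
vals = np.array([1.0, 3.0, 5.0])
img = rasterize(pts, vals, (2, 2), ((0.0, 1.0), (0.0, 1.0)))
```

The hit-count pass mentioned above falls out for free here: `count` is exactly that, and thresholding it would mask sparsely sampled bins.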
Attached thumbnail(s)
Attached Image
Attached Image
 
adamg
post Nov 25 2019, 09:43 PM
Post #29





Anyone have any good advice on scaling the RGB to give the closest thing to true color? I tried these two, [0.487, 0.408, 0.172] and [0.444, 0.341, 0.155], from Brian's flow, and I get a strong orange hue. I also tried the coefficients [0.51, 0.63, 1.0] from Gerald's slides, and it comes out looking pretty blue. I tried a few and felt like [1, 1, 0.95] seemed close, but I've only seen artistically processed Junocam images, which look so different from pictures of Jupiter that I simply don't know any more!

Many Thanks
Attached thumbnail(s)
Attached Image

 
Bjorn Jonsson
post Nov 25 2019, 11:01 PM
Post #30


IMG to PNG GOD
****

Group: Moderator
Posts: 2250
Joined: 19-February 04
From: Near fire and ice
Member No.: 38



I usually multiply by [1, 1.12, 2.3] after decompanding. I'm surprised that [1, 1, 0.95] seemed close - are you decompanding the images before applying these?
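For anyone following along: JunoCam raws are companded onboard to 8 bits with an approximately square-root encoding, so the channel weights must be applied to the decompanded (linear) values. A rough sketch of the order of operations; the real pipeline uses the published inverse lookup table, and the plain squaring and normalization here are my approximations:

```python
import numpy as np

def decompand(raw8):
    """Approximate decompanding: the onboard companding is roughly
    square-root, so squaring the normalized 8-bit values recovers a
    roughly linear scale. A stand-in for the real lookup table."""
    return (raw8.astype(np.float64) / 255.0) ** 2

def balance(rgb, weights=(1.0, 1.12, 2.3)):
    """Apply per-channel weights (Bjorn's values) AFTER decompanding."""
    return decompand(rgb) * np.asarray(weights)

out = balance(np.array([[255, 255, 255]]))  # white input picks up the weights directly
```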
