Beginner level projection
mcaplinger
post Aug 3 2020, 03:24 PM
Post #46





QUOTE (Bill Dudney @ Aug 3 2020, 06:01 AM) *
// I tried two approaches to finding the pointing vector,
// bilinear interpolate over the pointing vectors in the kernel


This is certainly not the right thing to be doing. I know you tried it both ways and the second way is the correct approach.

Post the code by which you actually go from pixel x,y to Junocam frame pointing vector unless it's identical to the code given in the kernel. You want to be going in the correct direction (distorted to undistorted).

It would also be diagnostic if you computed the Io-to-Juno vector and looked at the angle between the pointing vector and that vector (in the same frame) to see how close they are. It's easy for sincpt to miss with the smallest of errors, and Io is obviously very small.
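That angle check is what SPICE's vsep_c computes; a stand-alone equivalent in plain Python (assuming both vectors are already expressed in the same frame) is:

```python
import math

def angle_between(u, v):
    """Angular separation of two 3-vectors in radians (what SPICE's vsep_c
    computes), assuming both are expressed in the same reference frame."""
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    c = sum(a * b for a, b in zip(u, v)) / (nu * nv)
    return math.acos(max(-1.0, min(1.0, c)))  # clamp against roundoff
```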


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
Bill Dudney
post Aug 3 2020, 06:42 PM
Post #47





QUOTE (mcaplinger @ Aug 3 2020, 03:24 PM) *
This is certainly not the right thing to be doing. I know you tried it both ways and the second way is the correct approach.

Post the code by which you actually go from pixel x,y to Junocam frame pointing vector unless it's identical to the code given in the kernel. You want to be going in the correct direction (distorted to undistorted).

It would also be diagnostic if you computed the Io-to-Juno vector and looked at the angle between the pointing vector and that vector (in the same frame) to see how close they are. It's easy for sincpt to miss with the smallest of errors, and Io is obviously very small.


Thanks Mike,

Sorry to ask you to translate from Swift.

Here is my code:

CODE
let pointing = normalize(pointingVectorFor(pixelCoord: SIMD2<Double>(x: Double(154.0), y: Double(124.0))))


CODE
// distortionModel is my struct keeping track of all the camera model parameters.
//         INS-61503_DISTORTION_K1         =   -5.9624209455667325e-08
//         INS-61503_DISTORTION_K2         =    2.7381910042256151e-14
//         INS-61503_DISTORTION_X           =  814.21
//         INS-61503_DISTORTION_Y           = -151.52
    public func pointingVectorFor(pixelCoord: SIMD2<Double>) -> SIMD3<Double> {
        /*
         and the following takes an XY coordinate in Junocam framelet
         coordinates and produces a vector in the JUNO_JUNOCAM reference
         frame (of arbitrary length).

            cam[0] = x-cx
            cam[1] = y-cy
            cam = undistort(cam)
            v = [cam[0], cam[1], fl]

         */
        let undistorted = undistort(pixelCoord: pixelCoord - distortionModel.offset)
        
        return SIMD3<Double>(x: undistorted.x, y: undistorted.y, z: distortionModel.fl)
    }


CODE
    func distort(pixelCoord: SIMD2<Double>) -> SIMD2<Double> {
        /*
         def distort(c):
           xd, yd = c[0], c[1]
           r2 = (xd**2+yd**2)
           dr = 1+k1*r2+k2*r2*r2
           xd *= dr
           yd *= dr
           return [xd, yd]
         */
        let r2 = length_squared(pixelCoord - distortionModel.offset)
        let dr = 1.0 + distortionModel.k1 * r2 + distortionModel.k2 * r2 * r2
        return pixelCoord * dr
    }
    
    func undistort(pixelCoord: SIMD2<Double>) -> SIMD2<Double> {
        /*
         def undistort(c):
           xd, yd = c[0], c[1]
           for i in range(5): # fixed number of iterations for simplicity
             r2 = (xd**2+yd**2)
             dr = 1+k1*r2+k2*r2*r2
             xd = c[0]/dr
             yd = c[1]/dr
           return [xd, yd]
         */
        var current: SIMD2<Double> = pixelCoord
        for _ in 0..<5 {
            let r2 = length_squared(current - distortionModel.offset)
            let dr = 1.0 + distortionModel.k1 * r2 + distortionModel.k2 * r2 * r2
            current = pixelCoord / dr
        }
        return current
    }
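
For reference, the kernel's Python model is runnable stand-alone with the K1/K2 constants quoted in the comment above. The structural point worth noting: there is no cx/cy subtraction inside distort or undistort themselves; the caller centers the coordinate on the optical axis exactly once.

```python
# The kernel's distortion model, stand-alone, with the INS-61503 constants
# quoted in the comment above. No cx/cy subtraction happens inside these
# routines: the caller centers the coordinate on the optical axis exactly
# once before undistorting (and re-adds the offset after distorting).
K1 = -5.9624209455667325e-08
K2 = 2.7381910042256151e-14

def distort(c):
    xd, yd = c[0], c[1]
    r2 = xd * xd + yd * yd
    dr = 1 + K1 * r2 + K2 * r2 * r2
    return [xd * dr, yd * dr]

def undistort(c):
    # fixed-point iteration: find u such that distort(u) == c
    xd, yd = c[0], c[1]
    for _ in range(5):
        r2 = xd * xd + yd * yd
        dr = 1 + K1 * r2 + K2 * r2 * r2
        xd = c[0] / dr
        yd = c[1] / dr
    return [xd, yd]

# round trip at a representative off-axis point
c = [-790.71, 152.02]
u = undistort(c)
rt = distort(u)   # recovers c to well under a millipixel
```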


On the pointing vs position...

CODE
spkezr_c("JUNO", 6.26092e+08, "IAU_IO", "LT+S", "IO")

//spkezr_c returned state vector = {383394, -72976.8, 2221.98, -21.9864, -1.4944, -53.9273}, light time 1.30185
// position normalized:
// (0.98234650459544559, -0.18698377487311099, 0.0056932278465209587)


CODE
let pointing = normalize(pointingVectorFor(pixelCoord: SIMD2<Double>(x: Double(154), y: Double(124))))
// -0.39825207416281955, 0.1661992570141925, 0.90209372705553315
dot(positionNormalized, pointing)
// -0.41716227233230763
// not correct...


Thanks again!
mcaplinger
post Aug 3 2020, 08:26 PM
Post #48





QUOTE (Bill Dudney @ Aug 3 2020, 10:42 AM) *
On the pointing vs position...
...// not correct...

You didn't transform the Io vector into the Junocam frame.

I suggest we take this offline since I doubt it's of much interest to the forum. You should have my email because we went over a lot of this back in 2016. Though actually, maybe someone else will have more insight into what's going on with your code.


Bjorn Jonsson
post Aug 3 2020, 09:31 PM
Post #49





I haven't looked at the code in any detail so I don't know if this is of relevance here but my single biggest lesson with SPICE is probably: Understand the SPICE reference frames and use them a lot (of course there are many more things you need to know). Knowing how to use them results in shorter, simpler code among other things (and almost certainly fewer bugs).

That said, I often find discussion like the posts above interesting.
Bill Dudney
post Aug 3 2020, 10:02 PM
Post #50





QUOTE (Bjorn Jonsson @ Aug 3 2020, 09:31 PM) *
I haven't looked at the code in any detail so I don't know if this is of relevance here but my single biggest lesson with SPICE is probably: Understand the SPICE reference frames and use them a lot (of course there are many more things you need to know). Knowing how to use them results in shorter, simpler code among other things (and almost certainly fewer bugs).

That said, I often find discussion like the posts above interesting.

Thanks Bjorn,

I'm glad I could provide an interesting conversation :) The crazy thing about this whole endeavor has been how much there is to learn. I have some experience in 3D programming and lots of general programming experience. My degree is in aero engineering, so I'm no stranger to this kind of stuff, but I'm still clueless despite many, many hours reading, studying, and playing around with SPICE.

I love it, it's so much fun to build things like this!

And thanks for all you've done with the Juno data. It's one of the many inspirations for my wanting to keep going.

TTFN,

-bd
Bill Dudney
post Aug 3 2020, 10:09 PM
Post #51





QUOTE (mcaplinger @ Aug 3 2020, 08:26 PM) *
You didn't transform the Io vector into the Junocam frame.

I suggest we take this offline since I doubt it's of much interest to the forum. You should have my email because we went over a lot of this back in 2016. Though actually, maybe someone else will have more insight into what's going on with your code.


Thanks for the response!

I have read and re-read the email exchange from 2016. Thank you so much for taking the time to help me out again! This project has been on a shelf for quite a while and I'm just starting to get back to the point that I understand most of what I'm trying to do.

I did that pointing-vs-position code in a hurry over lunch, so that was a clear "in a hurry" mistake.

I take from the conversation thus far that I'm on the right track and just making boneheaded mistakes like this one. If that's true, I'll keep checking and double-checking my code. If the original code path is off in the weeds, I'd love to hear about that as well.

I'll do the transform and repost the results when I get a chance to get back to it this evening.

I'm happy to take this offline, but in case there are others out there trying to do stuff with this data, it might be meaningful to have it written up where Google can find it.

Thanks again,

-bd
Bill Dudney
post Aug 4 2020, 02:29 PM
Post #52





QUOTE (mcaplinger @ Aug 3 2020, 08:26 PM) *
You didn't transform the Io vector into the Junocam frame.

I suggest we take this offline since I doubt it's of much interest to the forum. You should have my email because we went over a lot of this back in 2016. Though actually, maybe someone else will have more insight into what's going on with your code.


I have a misunderstanding or some assumption that is off somewhere. I really appreciate you taking the time to help me find and correct it.

I expect these two calls to be equal and opposite:
CODE
spkezr_c("JUNO", 6.26092e+08, "IAU_IO", "LT+S", "IO")
    spkezr_c returned state vector = {383394, -72976.8, 2221.98, -21.9864, -1.4944, -53.9273}, light time = 1.30185
    length(383394, -72976.8, 2221.98) = 390231.12745828484

spkezr_c("IO", 6.26092e+08, "IAU_IO", "LT+S", "JUNO")
    spkezr_c returned state vector = {-383346, 72946.4, -2221.68, 21.9648, 1.50007, 53.9199}, light time = 1.30167
    length(-383346, 72946.4, -2221.68) = 390231.12745828496

    position delta = (47.994598865217995, -30.373592960997485, 0.29506775923073292)


Distances are more or less the same and the direction is more or less the same, but the position delta is more than I expect. I temper my expectations with the fact that the relative difference is on the order of 1e-4, which is pretty small, but that still seems like a lot with double precision. Then again, with all the floating-point math going on behind that simple-looking facade, maybe it's remarkable that the difference is that small.
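
For scale, the size of that residual can be checked directly from the delta and range quoted above:

```python
import math

# Values quoted above: the JUNO->IO and IO->JUNO positions disagree by this
# delta (km) at a range of ~390231 km
delta = (47.994598865217995, -30.373592960997485, 0.29506775923073292)
range_km = 390231.12745828484

delta_mag = math.sqrt(sum(d * d for d in delta))   # ~57 km
rel = delta_mag / range_km                          # ~1.5e-4 of the range
```

So the mismatch is roughly 57 km, about 1.5e-4 of the range: far too large to be floating-point roundoff, which suggests a physical explanation rather than a numerical one.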

I'd also expect this:

CODE
spkezr_c("IO", 6.26092e+08, "JUNO_JUNOCAM", "LT+S", "JUNO")
    spkezr_c returned state vector = {-158406, 67136.3, 350258, -86.7244, -73116.8, 13951.5}, light time = 1.30167


to have the same magnitude, but it's off by ~50 km. That's also relatively small (on the order of 1e-4 of the range), so maybe I don't misunderstand and these are "the same".

So then back to the original correction:

CODE
pointing = (-0.39825207416281955, 0.1661992570141925, 0.90209372705553315)

position = (-0.40592858453022934, 0.17204230427314171, 0.89756527885255921)

pointing dot position = 0.99994321157209098


which is pretty close.
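
For scale, that dot product can be converted to an angle and then to an approximate pixel offset using the nominal IFOV (pixel size over focal length, from the JunoCam kernel constants; this ignores distortion, so it's only a rough figure):

```python
import math

# Dot product computed above between the normalized pointing vector and the
# normalized Io position vector, both in the JUNO_JUNOCAM frame
dot = 0.99994321157209098

angle_rad = math.acos(dot)
angle_deg = math.degrees(angle_rad)   # ~0.61 degrees

# Nominal JunoCam IFOV from the kernel constants: 0.0074 mm pixels behind a
# 10.95637 mm focal length (approximate; distortion is ignored)
ifov = 0.0074 / 10.95637              # radians per pixel
offset_px = angle_rad / ifov          # ~16 pixels
```

So "pretty close" here still corresponds to a pointing error of roughly 16 pixels, which is plenty to make sincpt miss a target as small as Io.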

So maybe it's just that Io is very small and I should not expect sincpt to find good intersection points.

Thanks again for your time!
mcaplinger
post Aug 4 2020, 03:01 PM
Post #53





QUOTE (Bill Dudney @ Aug 4 2020, 06:29 AM) *
I expect these two calls to be equal and opposite:
spkezr_c("JUNO", 6.26092e+08, "IAU_IO", "LT+S", "IO")
spkezr_c("IO", 6.26092e+08, "IAU_IO", "LT+S", "JUNO")

"LT+S" corrections depend on observer velocity so these are not equivalent. If you used "NONE" or "LT" I think you'd find the answers were exactly the same.

Note that if you're using six significant digits for time you have about 1000 seconds of slop at the current epoch, but I assume you're just eliding for presentation.


Bill Dudney
post Aug 4 2020, 06:25 PM
Post #54





QUOTE (mcaplinger @ Aug 4 2020, 03:01 PM) *
"LT+S" corrections depend on observer velocity so these are not equivalent. If you used "NONE" or "LT" I think you'd find the answers were exactly the same.

Note that if you're using six significant digits for time you have about 1000 seconds of slop at the current epoch, but I assume you're just eliding for presentation.


Ah of course. I'll try again with NONE or LT and see what's up.

Yeah, those time values are both doubles.

So back to my original question: should I expect to see sincpt_c hitting Io if I take the pointing vector as calculated?

I don't understand the discrepancy between what sincpt_c reports and what fovtrg_c is reporting. As a test I iterated through all the pixels and did an intercept test with sincpt_c and got zero hits on frame 29/red. I do get hits on 30/red, 31/green, and 32/blue, but they are off quite a bit compared to the data.

Thanks again.
mcaplinger
post Aug 4 2020, 07:17 PM
Post #55





QUOTE (Bill Dudney @ Aug 4 2020, 10:25 AM) *
So back to my original question, should I expect to see sincpt_c hitting Io?

If Io subtends more than a couple of pixels, and you iterate over all pixels, you should obviously expect to get at least one hit*. This is of course a very stressing case. For debugging it makes more sense to overlay pixel hits on Jupiter on an image of Jupiter since it's much harder to miss. 1-2 pixel errors aren't unexpected once the start times have been corrected, larger ones would be unusual.

On the FOV pyramids, note that the I kernel says "They are provided for visualization purposes, do not represent the geometry of the actual FOV sides, and are not intended for quantitative analysis."

*Assuming you have the camera model correct. If you've got a bug associated with the focal length/pixel scale, then clearly you could still miss.


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
Bill Dudney
post Aug 5 2020, 02:16 AM
Post #56





QUOTE (mcaplinger @ Aug 4 2020, 07:17 PM) *
If Io subtends more than a couple of pixels, and you iterate over all pixels, you should obviously expect to get at least one hit*. This is of course a very stressing case. For debugging it makes more sense to overlay pixel hits on Jupiter on an image of Jupiter since it's much harder to miss. 1-2 pixel errors aren't unexpected once the start times have been corrected, larger ones would be unusual.

On the FOV pyramids, note that the I kernel says "They are provided for visualization purposes, do not represent the geometry of the actual FOV sides, and are not intended for quantitative analysis."

*Assuming you have the camera model correct. If you've got a bug associated with the focal length/pixel scale, then clearly you could still miss.


I suspect I've got a bug. The hard part has been finding it :)

I saw that note. I think it means "don't use the pointing vectors for doing geometry". I only use them in my bilinear interpolation test as a sanity check, to see whether my code is way off or not. The bilinear interpolation should at least put a rough bound across the frustum. I wouldn't expect that approach to be accurate, but I'd expect that if fovtrg says the body is in the frustum, the bilinear interpolation should give at least a hit or two.

I assume it does not mean that I can't reproduce a 3D model from the image and the geometry data in the kernel files, especially since I see what was done in stuff like this post. I'm sure I'm doing something boneheaded and just need to figure out what it is.

When I put the pixels referenced in the kernel's comments through my code I get results that are close to the values listed in the kernel, but not exact.

Here is my pointing vector to pixel code:

CODE
public func pixeleCoordFor(pointingVector: SIMD3<Double>) -> SIMD2<Double> {
    /*
     given a vector v in the JUNO_JUNOCAM reference frame, the following
     computes an XY coordinate in Junocam framelet coordinates with 0,0
     in the upper left:
       alpha = v[2]/fl
       cam = [v[0]/alpha, v[1]/alpha]
       cam = distort(cam)
       x = cam[0]+cx
       y = cam[1]+cy
     */
     //  INS-61504_FOCAL_LENGTH           = ( 10.95637 )
     //  INS-61504_PIXEL_SIZE             = (  0.0074  )
    let alpha = pointingVector.z / distortionModel.fl //   focalLength / pixelSize => 10.95637 / 0.0074
    let cam = SIMD2<Double>(x: pointingVector.x / alpha, y: pointingVector.y / alpha)
    let distorted = distort(pixelCoord: cam)
    return SIMD2<Double>(x: distorted.x + distortionModel.cx, y: distorted.y + distortionModel.cy)
}


And my pixel to pointing vector:

CODE
public func pointingVectorFor(pixelCoord: SIMD2<Double>) -> SIMD3<Double> {
    /*
      and the following takes an XY coordinate in Junocam framelet
      coordinates and produces a vector in the JUNO_JUNOCAM reference
      frame (of arbitrary length).

        cam[0] = x-cx
        cam[1] = y-cy
        cam = undistort(cam)
        v = [cam[0], cam[1], fl]
     */
    let undistorted = undistort(pixelCoord: pixelCoord - distortionModel.offset)
        
    return SIMD3<Double>(x: undistorted.x, y: undistorted.y, z: distortionModel.fl)
}


I run the following test to ensure that the values match. When I do the top left value of {23.5, 0.5} I get a vector that's close to the one in the kernel but not exactly the same.

Not sure if that means I have a bug in my camera model code or if there is some numerical difference between my code and the code used to create the values in that kernel.

CODE
// red.bounds[0] comes from the kernel (-0.47949605975108789, 0.092186759952144759, 0.872688449546977)
let topLeft = red.pointingVectorFor(pixelCoord: SIMD2<Double>(x: 23.5, y: 0.5))
// topLeft = (-0.45867789369502981, 0.088184307014605154, 0.88421610358093161)
XCTAssertEqual(dot(normalize(topLeft), normalize(red.bounds[0])), 1.0, accuracy: 0.0005) // assert the dot product is within 0.0005 of 1
// the acos of the dot product is 0.024131529862876068 rad, about 1.4 degrees; not very different, but off by more than I'd expect


If I put the pointing vector in the kernel through my pointing vector to pixel routine I get (-33, 11) which is clearly not correct. But I don't see what's wrong with the code.

But my pointing vector/pixel coord conversion code seems to be more or less symmetrical.

CODE
(lldb) p normalize(Instrument.instrumentFor(id: .red).pointingVectorFor(pixelCoord: SIMD2<Double>(x: 23.5, y: 0.5)))
(SIMD3<Double>) $R22 = (-0.45867789369502981, 0.088184307014605154, 0.88421610358093161)
(lldb) p Instrument.instrumentFor(id: .red).pixeleCoordFor(pointingVector: $R22)
(SIMD2<Double>) $R24 = (23.509524706956768, 0.4981688027828568)


I have uploaded the test image that I've generated from the raw IMG file, as well as an image I made by coloring every pixel that sincpt_c says is an intersection. I pulled both into Pixelmator (like Photoshop) and overlaid them with transparency. That shows how far off the intersections are from the captured image. I can upload the full framelet image too if it helps see Jupiter in the frame. The intersection is about 14 pixels to the left and 4 pixels down from the captured image. This is from frame 30, the red filter.

Attached Image


I know it's not easy to take time away from your real job to help me, and I very much appreciate it!
mcaplinger
post Aug 5 2020, 05:55 PM
Post #57





QUOTE (Bill Dudney @ Aug 4 2020, 06:16 PM) *
When I do the top left value of {23.5, 0.5} I get a vector that's close to the one in the kernel but not exactly the same.

When I do that I produce an answer that matches the kernel value to numerical precision.

Here's my code. There's not much to it.

Are you sure you translated undistort correctly? I'm suspicious.

CODE

def undistort(c):
    xd, yd = c[0], c[1]
    for i in range(5):
        r2 = (xd**2+yd**2)
        dr = 1+k1*r2+k2*r2*r2
        xd = c[0]/dr
        yd = c[1]/dr
    return [xd, yd]

def xy2vector(x, y, band):
    cam = [x-cx, y-cy[band]]
    cam = undistort(cam)
    v = (cam[0], cam[1], fl)
    return v

>>> spice.vhat(xy2vector(23.5, 0.5, RED))
(-0.47949601447452606, 0.09218674877062065, 0.8726884756052113)




Bill Dudney
post Aug 6 2020, 01:59 AM
Post #58





QUOTE (mcaplinger @ Aug 5 2020, 06:55 PM) *
When I do that I produce an answer that matches the kernel value to numerical precision.

Here's my code. There's not much to it.

Are you sure you translated undistort correctly? I'm suspicious.

CODE

def undistort(c):
    xd, yd = c[0], c[1]
    for i in range(5):
        r2 = (xd**2+yd**2)
        dr = 1+k1*r2+k2*r2*r2
        xd = c[0]/dr
        yd = c[1]/dr
    return [xd, yd]

def xy2vector(x, y, band):
    cam = [x-cx, y-cy[band]]
    cam = undistort(cam)
    v = (cam[0], cam[1], fl)
    return v

>>> spice.vhat(xy2vector(23.5, 0.5, RED))
(-0.47949601447452606, 0.09218674877062065, 0.8726884756052113)


Like I said, I'm a bonehead and likely doing something silly: I was double-applying the x/y offset from the camera model. The hits on Io in frame 29 for the red framelet are now off by 1 pixel up and 3 pixels to the right. I'd guess that's pretty good for tiny little Io. I might try some anti-aliasing (8x as many rays, 4 in each direction, average the results) and see if I can get things looking closer :). The frame 32 blue framelet is only off by one pixel to the right. Great progress, thanks again for helping me get over this hump!
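
For anyone following along, here is a minimal Python sketch of the corrected round trip, with the cx/cy offset applied exactly once in each direction. The K1/K2/cx/cy values are the INS-61503 constants quoted earlier in the thread and the focal length comes from the INS-61504 lines; whether all of these belong to the same band is an assumption here, and the offset-applied-once structure is the point:

```python
# K1/K2/cx/cy are the INS-61503 constants quoted earlier in the thread;
# fl is the INS-61504 focal length (10.95637 mm) over the 0.0074 mm pixel
# size. Whether these all belong to the same band is an assumption here.
K1 = -5.9624209455667325e-08
K2 = 2.7381910042256151e-14
CX, CY = 814.21, -151.52
FL = 10.95637 / 0.0074           # focal length in pixels

def _dr(xd, yd):
    r2 = xd * xd + yd * yd
    return 1 + K1 * r2 + K2 * r2 * r2

def vector_for(x, y):
    xd, yd = x - CX, y - CY       # cx/cy subtracted here, once
    for _ in range(5):            # fixed-point undistort
        dr = _dr(xd, yd)
        xd, yd = (x - CX) / dr, (y - CY) / dr
    return (xd, yd, FL)

def pixel_for(v):
    alpha = v[2] / FL
    xd, yd = v[0] / alpha, v[1] / alpha
    dr = _dr(xd, yd)
    return (xd * dr + CX, yd * dr + CY)   # cx/cy re-added here, once

px = pixel_for(vector_for(23.5, 0.5))     # round-trips to (23.5, 0.5)
```

The round trip recovers (23.5, 0.5) to well under a pixel, which is the symmetry the lldb session in the earlier post was probing.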

Thanks again for all your help, I definitely owe you a beverage of your choice smile.gif
Bill Dudney
post Nov 5 2020, 12:36 PM
Post #59





Hi All,

Finally got to the point that I have something cool to show. Still plenty more to do but I'm stoked at where I am. It's interactive, projected onto a sphere, and will display any image from any data set from the PDS archive. The rainbow in the back is the normal to help me be sure I'm putting the data in roughly the correct spot.

Couldn't have done it without your help. Thanks again!
Attached thumbnail(s)
Attached Image
 
Sean
post Nov 5 2020, 06:24 PM
Post #60





This is very cool. Beautiful work!


