Beginner level projection
Nov 6 2019, 11:50 PM
Post #1
Junior Member Group: Members Posts: 31 Joined: 31-October 19 Member No.: 8699
Hi,
I'm trying to do the mapping using just SPICE for the Junocam images rather than going through ISIS. It might seem like I'm making life difficult for myself, but I want to do the bare minimum of processing so the result looks as close as possible to a non-spinny-push-frame image. I'm making all the beginner mistakes but can't seem to figure out what they are, so I was hoping someone might give me a hint.

First I want to figure out how far away Jupiter is, because I want anything that didn't hit Jupiter to be mapped onto a sphere at Jupiter's distance. That way any lens effects get mapped in space near where they originated, which should hopefully make the edges of the planet seem less processed:

CODE
[jupiterPos, jupiterLt] = spice.spkpos('JUPITER', et0, 'JUNO_SPACECRAFT', 'NONE', 'JUNO')
jupiterDistance = np.linalg.norm(jupiterPos)

Then I go through all the undistorted Junocam pixel vectors and figure out the Jupiter intercept points in the IAU_JUPITER frame, so now I have a point cloud of all the planet-mapped pixels:

CODE
[point, trgepc, srfvec, found] = spice.sincpt('Ellipsoid', 'JUPITER', et, 'IAU_JUPITER',
                                              'LT+S', 'JUNO_SPACECRAFT', 'JUNO_JUNOCAM', v)

If it didn't find an intersection, I figure out where that invisible sphere is by extending the pixel vector out by the separation and moving it into the IAU_JUPITER frame:

CODE
direction = np.array(v)
pos = direction * jupiterDistance / np.linalg.norm(direction)
rotationMatrix = spice.pxform('JUNO_JUNOCAM', 'IAU_JUPITER', et - jupiterLt)
pos = spice.mxv(rotationMatrix, pos)

It works terribly! I seem to have a timing error in the frame offsets that is suspiciously close to the light time between Jupiter and Juno, and the sphere that catches all the sincpt misses is miles off. I've been staring at the code for a while without finding my mistake, so I was hoping someone might point out where I've gone wrong.

Many thanks
Adam
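The miss-handling step above can be prototyped without SPICE. This is a minimal numpy sketch, assuming a placeholder rotation matrix standing in for the output of `spice.pxform`; the function name is mine, not from the post:

```python
import numpy as np

def map_miss_to_sphere(v, jupiter_distance, rotation_matrix):
    """Extend a camera-frame pixel vector v out to the spacecraft-to-Jupiter
    distance, then rotate it into the target frame (e.g. IAU_JUPITER).
    rotation_matrix stands in for spice.pxform('JUNO_JUNOCAM', 'IAU_JUPITER', et)."""
    direction = np.asarray(v, dtype=float)
    pos = direction * jupiter_distance / np.linalg.norm(direction)
    return rotation_matrix @ pos

# Toy usage: a 90-degree rotation about z as a placeholder for pxform's matrix,
# and a made-up distance of 75,000 km.
rot = np.array([[0.0, -1.0, 0.0],
                [1.0,  0.0, 0.0],
                [0.0,  0.0, 1.0]])
pos = map_miss_to_sphere([0.1, 0.2, 1.0], 75000.0, rot)
# The mapped point always lies on the sphere of radius jupiter_distance,
# since the rotation preserves the vector's length.
```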
Aug 3 2020, 02:01 PM
Post #2
Newbie Group: Members Posts: 15 Joined: 1-August 20 Member No.: 8847
Hello All,
I wasn't sure whether I should post here or make a new topic; I'm happy to do that if it fits better with this forum.

I see a discrepancy between what fovtrg_c tells me and what I get from sincpt_c. I'm sure I'm doing something wrong, but I've been hacking around for a week or so (a nights-and-weekends project) and haven't made much progress, so I figured I'd ask the experts. My goal here is to understand the data; eventually I want to turn my code into an image processing pipeline. When I powered through and got to a 3D image, it looked OK but very blurry, and the color did not align well. So I decided to step back to the beginning and try to ensure everything is doing what I expect. That's when I ran across this image that has Io in it. Using it as a test vehicle has brought the miss happening in my code into sharp focus, so I figured it would make a great subject for getting my code straightened out before moving back to 3D image creation. I've listed my code below with some of the results in the comments. Any pointers to what I'm doing wrong would be most welcome. Thanks!

I load and parse these files:
CODE
JNCE_2019307_23C00028_V01.LBL
JNCE_2019307_23C00028_V01.IMG
furnsh_c("JNO_SCLKSCET.00102.tsc")
furnsh_c("juno_junocam_v03.ti")
furnsh_c("juno_struct_v04.bsp")
furnsh_c("juno_v12.tf")
furnsh_c("naif0012.tls")
furnsh_c("pck00010.tpc")
furnsh_c("trimmed_jup310.bsp")
furnsh_c("juno_sc_rec_191102_191104_v01.bc")
furnsh_c("spk_rec_191010_191201_191210.bsp")

CODE
// startTime (from LBL) - 2019-11-03T22:26:16.510
// frame 29
// from the juno_junocam_v03.ti kernel
// INS-61500_START_TIME_BIAS  = 0.06188
// INS-61500_INTERFRAME_DELTA = 0.001
// from the LBL file
// INTERFRAME_DELAY = 0.371 <s>
// 2019-11-03T22:26:16.510 + INS-61500_START_TIME_BIAS + (0.371 + 0.001) * frameNumber
// so frame time is:
// 2019-11-03T22:26:27.35988
str2et_c("2019 Nov 03 22:26:27.360", &frameTime)
// str2et_c returned 6.26092e+08
fovtrg_c("JUNO_JUNOCAM_RED", "IO", "ELLIPSOID", "IAU_IO", "LT+S", "JUNO", 6.26092e+08)
// fovtrg_c returned true
// Examining the image, in the red part of frame 29 I see Io is present at
// pixel 154, 124
// and several others, but that one is roughly in the 'center' of the visible Io.

I'm happy to upload the image I'm referring to. My code takes the IMG data from the PDS file (mentioned above) and turns that into a PNG file.
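The frame-time computation above is plain arithmetic and can be checked in isolation. A sketch using only the standard library, with the constants taken directly from the post (datetime is used only for display, not for SPICE time conversion):

```python
from datetime import datetime, timedelta

START_TIME_BIAS  = 0.06188   # INS-61500_START_TIME_BIAS, seconds
INTERFRAME_DELTA = 0.001     # INS-61500_INTERFRAME_DELTA, seconds
INTERFRAME_DELAY = 0.371     # INTERFRAME_DELAY from the LBL file, seconds
frame = 29

# Offset of frame 29 from the image start time.
offset = START_TIME_BIAS + (INTERFRAME_DELAY + INTERFRAME_DELTA) * frame

start = datetime(2019, 11, 3, 22, 26, 16, 510000)   # 2019-11-03T22:26:16.510
frame_time = start + timedelta(seconds=offset)
print(round(offset, 5))        # 10.84988
print(frame_time.isoformat())  # 2019-11-03T22:26:27.359880
```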
CODE
// I tried two approaches to finding the pointing vector.
//
// 1) Bilinear interpolation over the pointing vectors in the kernel:
// INS-61503_FOV_BOUNDARY_CORNERS = (
//     -0.47949606,  0.09218676,  0.87268845
//     -0.47518685,  0.16768048,  0.86375964
//      0.48724863,  0.16654879,  0.85723408
//      0.49166330,  0.09156385,  0.86595800
// )
// u 0.081468  ((154 - 23) / 1608)
// v 0.96875   (124 / 128)
bilinearInterpolate(0.081468, 0.96875, [
    {-0.479496, 0.0921868, 0.872688},
    {-0.475187, 0.16768,   0.86376},
    { 0.487249, 0.166549,  0.857234},
    { 0.491663, 0.0915639, 0.865958}
])
// pointing = {-0.396892, 0.16523, 0.863507}
sincpt_c("Ellipsoid", "IO", 6.26092e+08, "IAU_IO", "CN+S", "JUNO", "JUNO_JUNOCAM",
         {-0.39689193, 0.1652304, 0.86350652})
// sincpt_c returned false

// 2) I also implemented the distort/undistort camera model:
// pixel = {154, 124}
// pointing = {-0.398252, 0.166199, 0.902094}
sincpt_c("Ellipsoid", "IO", 6.26092e+08, "IAU_IO", "CN+S", "JUNO", "JUNO_JUNOCAM",
         {-0.39825207, 0.16619926, 0.90209373})
// sincpt_c returned false

// I also spent some time adding an arbitrary time offset (up to 0.1 seconds, in
// 0.025 increments) and I do end up with intersections, but they are several
// pixels to the left of the captured image.
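For reference, the first (interpolation) approach can be reproduced in a few lines of plain Python. This sketch assumes the corners are ordered as a loop (u=0,v=0), (u=0,v=1), (u=1,v=1), (u=1,v=0), which is the ordering that reproduces the pointing vector quoted in the post; the function name is mine:

```python
def bilinear_interpolate(u, v, corners):
    """corners: four 3-vectors ordered (u0v0, u0v1, u1v1, u1v0)."""
    def lerp(a, b, t):
        return a + t * (b - a)
    # Interpolate along the u=0 and u=1 edges at parameter v, then between them at u.
    left  = [lerp(a, b, v) for a, b in zip(corners[0], corners[1])]
    right = [lerp(a, b, v) for a, b in zip(corners[3], corners[2])]
    return [lerp(a, b, u) for a, b in zip(left, right)]

# Full-precision INS-61503_FOV_BOUNDARY_CORNERS from the kernel comment above.
corners = [(-0.47949606, 0.09218676, 0.87268845),
           (-0.47518685, 0.16768048, 0.86375964),
           ( 0.48724863, 0.16654879, 0.85723408),
           ( 0.49166330, 0.09156385, 0.86595800)]
p = bilinear_interpolate(0.081468, 0.96875, corners)
# p is approximately (-0.396892, 0.16523, 0.863507), matching the post.
```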
Aug 3 2020, 03:24 PM
Post #3
Senior Member Group: Members Posts: 2517 Joined: 13-September 05 Member No.: 497
QUOTE
// I tried two approaches to finding the pointing vector,
// bilinear interpolate over the pointing vectors in the kernel

This is certainly not the right thing to be doing. I know you tried it both ways, and the second way is the correct approach. Post the code by which you actually go from pixel x,y to a Junocam-frame pointing vector, unless it's identical to the code given in the kernel. You want to be going in the correct direction (distorted to undistorted).

It would also be diagnostic if you computed the Io-to-Juno vector and looked at the angle between the pointing vector and that vector (in the same frame) to see how close they are. It's easy for sincpt to miss with the smallest of errors, and Io is obviously very small.

--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
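The suggested diagnostic is a one-liner. A minimal sketch, assuming both vectors have already been expressed in the same frame (the function name is illustrative, not from the thread):

```python
import math

def separation_angle(a, b):
    """Angle in radians between two 3-vectors; the clamp guards against
    round-off pushing the cosine just outside [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

# Toy check: Io's angular radius from a few hundred thousand km is only a
# fraction of a degree, so even a small pointing error guarantees a sincpt miss.
err_deg = math.degrees(separation_angle([0.0, 0.0, 1.0], [0.01, 0.0, 1.0]))
# err_deg is roughly 0.57 degrees for a 0.01 rad-ish offset
```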
Aug 3 2020, 06:42 PM
Post #4
Newbie Group: Members Posts: 15 Joined: 1-August 20 Member No.: 8847
QUOTE
This is certainly not the right thing to be doing. I know you tried it both ways and the second way is the correct approach. Post the code by which you actually go from pixel x,y to Junocam frame pointing vector unless it's identical to the code given in the kernel. You want to be going in the correct direction (distorted to undistorted). It would also be diagnostic if you computed the Io-to-Juno vector and looked at the angle between the pointing vector and that vector (in the same frame) to see how close they are. It's easy for sincpt to miss with the smallest of errors, and Io is obviously very small.

Thanks Mike,

Sorry to ask you to translate from Swift. Here is my code:

CODE
let pointing = normalize(pointingVectorFor(pixelCoord: SIMD2<Double>(x: Double(154.0), y: Double(124.0))))

CODE
// distortionModel is my struct keeping track of all the camera model parameters.
// INS-61503_DISTORTION_K1 = -5.9624209455667325e-08
// INS-61503_DISTORTION_K2 = 2.7381910042256151e-14
// INS-61503_DISTORTION_X  = 814.21
// INS-61503_DISTORTION_Y  = -151.52
public func pointingVectorFor(pixelCoord: SIMD2<Double>) -> SIMD3<Double> {
    /* and the following takes an XY coordinate in Junocam framelet
       coordinates and produces a vector in the JUNO_JUNOCAM reference
       frame (of arbitrary length).

       cam[0] = x-cx
       cam[1] = y-cy
       cam = undistort(cam)
       v = [cam[0], cam[1], fl]
    */
    let undistorted = undistort(pixelCoord: pixelCoord - distortionModel.offset)
    return SIMD3<Double>(x: undistorted.x, y: undistorted.y, z: distortionModel.fl)
}

CODE
func distort(pixelCoord: SIMD2<Double>) -> SIMD2<Double> {
    /* def distort(c):
           xd, yd = c[0], c[1]
           r2 = (xd**2+yd**2)
           dr = 1+k1*r2+k2*r2*r2
           xd *= dr
           yd *= dr
           return [xd, yd]
    */
    let r2 = length_squared(pixelCoord - distortionModel.offset)
    let dr = 1.0 + distortionModel.k1 * r2 + distortionModel.k2 * r2 * r2
    return pixelCoord * dr
}

func undistort(pixelCoord: SIMD2<Double>) -> SIMD2<Double> {
    /* def undistort(c):
           xd, yd = c[0], c[1]
           for i in range(5):  # fixed number of iterations for simplicity
               r2 = (xd**2+yd**2)
               dr = 1+k1*r2+k2*r2*r2
               xd = c[0]/dr
               yd = c[1]/dr
           return [xd, yd]
    */
    var current: SIMD2<Double> = pixelCoord
    for _ in 0..<5 {
        let r2 = length_squared(current - distortionModel.offset)
        let dr = 1.0 + distortionModel.k1 * r2 + distortionModel.k2 * r2 * r2
        current = pixelCoord / dr
    }
    return current
}

On the pointing vs position...

CODE
spkezr_c("JUNO", 6.26092e+08, "IAU_IO", "LT+S", "IO")
// spkezr_c returned state vector = {383394, -72976.8, 2221.98, -21.9864, -1.4944, -53.9273},
// light time 1.30185
// position normalized:
// (0.98234650459544559, -0.18698377487311099, 0.0056932278465209587)

CODE
let pointing = normalize(pointingVectorFor(pixelCoord: SIMD2<Double>(x: Double(154), y: Double(124))))
// -0.39825207416281955, 0.1661992570141925, 0.90209372705553315
dot(positionNormalized, pointing)
// -0.41716227233230763
// not correct...

Thanks again!
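For comparison, the kernel's reference Python (quoted in the Swift comments above) can be exercised stand-alone as a sanity check on the distortion parameters. Note that in the reference, r2 is recomputed directly from the running estimate of the already-centered coordinates, with no further offset subtraction inside the loop. The k1/k2/cx/cy values are the ones quoted in the post:

```python
K1 = -5.9624209455667325e-08   # INS-61503_DISTORTION_K1
K2 = 2.7381910042256151e-14    # INS-61503_DISTORTION_K2

def distort(c):
    # Forward model: centered-undistorted -> centered-distorted coordinates.
    xd, yd = c
    r2 = xd * xd + yd * yd
    dr = 1 + K1 * r2 + K2 * r2 * r2
    return [xd * dr, yd * dr]

def undistort(c):
    # Fixed-point inverse, as in the kernel: 5 iterations, r2 from the
    # current estimate of the centered coordinates.
    xd, yd = c
    for _ in range(5):
        r2 = xd * xd + yd * yd
        dr = 1 + K1 * r2 + K2 * r2 * r2
        xd = c[0] / dr
        yd = c[1] / dr
    return [xd, yd]

# Round trip on pixel (154, 124), centered with cx = 814.21, cy = -151.52:
centered = [154 - 814.21, 124 - (-151.52)]
back = distort(undistort(centered))
# back should match centered to well under a thousandth of a pixel,
# confirming the iteration has converged for coordinates of this magnitude.
```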
OPINIONS AND MODERATION Opinions expressed on UnmannedSpaceflight.com are those of the individual posters and do not necessarily reflect the opinions of UnmannedSpaceflight.com or The Planetary Society. The all-volunteer UnmannedSpaceflight.com moderation team is wholly independent of The Planetary Society. The Planetary Society has no influence over decisions made by the UnmannedSpaceflight.com moderators. |