
Beginner level projection
adamg
post Nov 6 2019, 11:50 PM
Post #1


Junior Member
**

Group: Members
Posts: 31
Joined: 31-October 19
Member No.: 8699



Hi,

I'm trying to do the mapping using SPICE alone for the Junocam images rather than going through ISIS. It might seem like I'm making life difficult for myself, but I want to do the bare minimum amount of processing so the result looks as close to a non-spinny-push-frame image as possible. I'm making all the beginner mistakes but I can't seem to figure out what they are, so I was hoping someone might give me a hint?

First I want to figure out how far away Jupiter is, because I want anything that didn't hit Jupiter to get mapped onto a sphere at Jupiter's distance. That way any lens effects get mapped in space near where they originated, which should hopefully make the edges of the planet look less processed:

[jupiterPos, jupiterLt] = spice.spkpos('JUPITER', et0, 'JUNO_SPACECRAFT', 'NONE', 'JUNO')
jupiterDistance = np.linalg.norm(jupiterPos)

Then I go through all the undistorted Junocam pixel vectors and compute Jupiter intercept points in the IAU_JUPITER frame, so I end up with a point cloud of all the planet-mapped pixels:

[point, trgepc, srfvec, found ] = spice.sincpt(
'Ellipsoid', 'JUPITER', et,
'IAU_JUPITER', 'LT+S', 'JUNO_SPACECRAFT',
'JUNO_JUNOCAM', v)

If it didn't find an intersection, then I figure out where that invisible sphere is by extending the pixel vector out to the Jupiter distance and moving it into the IAU_JUPITER frame:

direction = np.array(v)
pos = direction*jupiterDistance/np.linalg.norm(direction)
rotationMatrix = spice.pxform('JUNO_JUNOCAM','IAU_JUPITER',et-jupiterLt)
pos = spice.mxv(rotationMatrix, pos)

It works terribly! I seem to have a timing error in the frame offsets that is suspiciously close to the light time between Jupiter and Juno, and the sphere that catches all the sincpt misses is completely miles off. I've been staring at the code for a while without finding my mistake, so I was hoping someone might point out where I've gone wrong.

Many thanks


Adam
Bill Dudney
post Aug 3 2020, 02:01 PM
Post #2


Newbie
*

Group: Members
Posts: 15
Joined: 1-August 20
Member No.: 8847



Hello All,

I wasn't sure if I should post here or make a new topic. I'm happy to do that if it fits better with this forum.

I see a discrepancy between what fovtrg_c tells me and what I get from sincpt_c. I'm sure I'm doing something wrong, but I've been hacking around for a week or so (a nights-and-weekends project) and I've not made much progress. So I figured I'd ask the experts.

My goal here is to understand the data; eventually I want to turn my code into an image processing pipeline. When I powered through and got to a 3D image it looked OK, but it was very blurry and the color did not align well. So I decided to step back to the beginning and try to ensure everything is doing what I expect. That's when I ran across this image that has Io in it. Using it as a test vehicle has brought the miss in my code into sharp focus, so I figured it would make a great subject for straightening out my code before moving back to 3D image creation.

I've listed my code below with some of the results in the comments. Any pointers to what I'm doing wrong would be most welcome.

Thanks!

I load and parse these files.
JNCE_2019307_23C00028_V01.LBL
JNCE_2019307_23C00028_V01.IMG

furnsh_c("JNO_SCLKSCET.00102.tsc")
furnsh_c("juno_junocam_v03.ti")
furnsh_c("juno_struct_v04.bsp")
furnsh_c("juno_v12.tf")
furnsh_c("naif0012.tls")
furnsh_c("pck00010.tpc")
furnsh_c("trimmed_jup310.bsp")
furnsh_c("juno_sc_rec_191102_191104_v01.bc")
furnsh_c("spk_rec_191010_191201_191210.bsp")

// startTime (from LBL) - 2019-11-03T22:26:16.510
// frame 29
// from the juno_junocam_v03.ti kernel
// INS-61500_START_TIME_BIAS = 0.06188
// INS-61500_INTERFRAME_DELTA = 0.001
// from the LBL file
// INTERFRAME_DELAY = 0.371 <s>
// 2019-11-03T22:26:16.510 + INS-61500_START_TIME_BIAS + (0.371 + 0.001) * frameNumber
// so frame time is:
// 2019-11-03T22:26:27.35988
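As a quick check of the frame-time arithmetic in the comments above, here is a sketch using plain `datetime`; the bias and delay constants are the values quoted from the kernel and LBL:

```python
from datetime import datetime, timedelta

start = datetime.fromisoformat("2019-11-03T22:26:16.510")  # startTime from the LBL
start_time_bias = 0.06188    # INS-61500_START_TIME_BIAS, seconds
interframe_delta = 0.001     # INS-61500_INTERFRAME_DELTA, seconds
interframe_delay = 0.371     # INTERFRAME_DELAY from the LBL, seconds
frame = 29

# frame time = startTime + bias + (delay + delta) * frameNumber
t = start + timedelta(seconds=start_time_bias
                      + (interframe_delay + interframe_delta) * frame)
print(t.isoformat())  # 2019-11-03T22:26:27.359880
```

which reproduces the 22:26:27.35988 value above (UTC arithmetic only; converting to ephemeris time still needs str2et_c and the leap-second kernel).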

str2et_c("2019 Nov 03 22:26:27.360", &frameTime)
str2et_c returned 6.26092e+08
fovtrg_c("JUNO_JUNOCAM_RED", "IO", "ELLIPSOID", "IAU_IO", "LT+S", "JUNO", 6.26092e+08)
fovtrg_c returned true

// Examining the image from the red part of frame 29 I see Io is present at
// pixel 154, 124
// and several others, but that one is roughly in the 'center' of the visible Io.
// I'm happy to upload the image I'm referring to. My code takes the IMG data from the PDS files (mentioned above) and turns it into a PNG file.

// I tried two approaches to finding the pointing vector,
// bilinear interpolate over the pointing vectors in the kernel
// INS-61503_FOV_BOUNDARY_CORNERS = (
// -0.47949606, 0.09218676, 0.87268845
// -0.47518685, 0.16768048, 0.86375964
// 0.48724863, 0.16654879, 0.85723408
// 0.49166330, 0.09156385, 0.86595800
// )
// u 0.081468 ((154 - 23) / 1608)
// v 0.96875 (124 / 128)
bilinearInterpolate(0.081468, 0.96875, [
{-0.479496, 0.0921868, 0.872688},
{-0.475187, 0.16768, 0.86376},
{0.487249, 0.166549, 0.857234},
{0.491663, 0.0915639, 0.865958}
])

pointing = {-0.396892, 0.16523, 0.863507}
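For reference, the bilinear interpolation step above can be written out in a few lines of plain Python (corner vectors in the order listed in the kernel; note the kernel warns these corners are for visualization only, so this is a sanity bound rather than a camera model):

```python
def bilerp(u, v, corners):
    # corners in kernel order: first and last span the v=0 edge,
    # the middle two span the v=1 edge
    c0, c1, c2, c3 = corners
    lo = [a + u * (b - a) for a, b in zip(c0, c3)]  # v = 0 edge
    hi = [a + u * (b - a) for a, b in zip(c1, c2)]  # v = 1 edge
    return [a + v * (b - a) for a, b in zip(lo, hi)]

corners = [
    (-0.47949606, 0.09218676, 0.87268845),
    (-0.47518685, 0.16768048, 0.86375964),
    ( 0.48724863, 0.16654879, 0.85723408),
    ( 0.49166330, 0.09156385, 0.86595800),
]
u = (154 - 23) / 1608
v = 124 / 128
pointing = bilerp(u, v, corners)
print(pointing)  # close to (-0.396892, 0.165230, 0.863507)
```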

sincpt_c("Ellipsoid", "IO", 6.26092e+08, "IAU_IO", "CN+S", "JUNO", "JUNO_JUNOCAM", {-0.39689193, 0.1652304, 0.86350652})
sincpt_c returned false

// I also implemented the distord/undistort camera model
// pixel = {154, 124}
// pointing = {-0.398252, 0.166199, 0.902094}
sincpt_c("Ellipsoid", "IO", 6.26092e+08, "IAU_IO", "CN+S", "JUNO", "JUNO_JUNOCAM", {-0.39825207, 0.16619926, 0.90209373})
sincpt_c returned false

// I also spent some time adding an arbitrary time offset (up to 0.1 seconds, by a 0.025 increment)
// and I do end up with intersections but they are several pixels to the left of the captured image.
mcaplinger
post Aug 3 2020, 03:24 PM
Post #3


Senior Member
****

Group: Members
Posts: 2511
Joined: 13-September 05
Member No.: 497



QUOTE (Bill Dudney @ Aug 3 2020, 06:01 AM) *
// I tried two approaches to finding the pointing vector,
// bilinear interpolate over the pointing vectors in the kernel


This is certainly not the right thing to be doing. I know you tried it both ways and the second way is the correct approach.

Post the code by which you actually go from pixel x,y to Junocam frame pointing vector unless it's identical to the code given in the kernel. You want to be going in the correct direction (distorted to undistorted).

It would also be diagnostic if you computed the Io-to-Juno vector and looked at the angle between the pointing vector and that vector (in the same frame) to see how close they are. It's easy for sincpt to miss with the smallest of errors, and Io is obviously very small.


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
Bill Dudney
post Aug 3 2020, 06:42 PM
Post #4





QUOTE (mcaplinger @ Aug 3 2020, 03:24 PM) *
This is certainly not the right thing to be doing. I know you tried it both ways and the second way is the correct approach.

Post the code by which you actually go from pixel x,y to Junocam frame pointing vector unless it's identical to the code given in the kernel. You want to be going in the correct direction (distorted to undistorted).

It would also be diagnostic if you computed the Io-to-Juno vector and looked at the angle between the pointing vector and that vector (in the same frame) to see how close they are. It's easy for sincpt to miss with the smallest of errors, and Io is obviously very small.


Thanks Mike,

Sorry to ask you to translate from Swift.

Here is my code:

CODE
let pointing = normalize(pointingVectorFor(pixelCoord: SIMD2<Double>(x: Double(154.0), y: Double(124.0))))


CODE
// distortionModel is my struct keeping track of all the camera model parameters.
//         INS-61503_DISTORTION_K1         =   -5.9624209455667325e-08
//         INS-61503_DISTORTION_K2         =    2.7381910042256151e-14
//         INS-61503_DISTORTION_X           =  814.21
//         INS-61503_DISTORTION_Y           = -151.52
    public func pointingVectorFor(pixelCoord: SIMD2<Double>) -> SIMD3<Double> {
        /*
         and the following takes an XY coordinate in Junocam framelet
         coordinates and produces a vector in the JUNO_JUNOCAM reference
         frame (of arbitrary length).

            cam[0] = x-cx
            cam[1] = y-cy
            cam = undistort(cam)
            v = [cam[0], cam[1], fl]

         */
        let undistorted = undistort(pixelCoord: pixelCoord - distortionModel.offset)
        
        return SIMD3<Double>(x: undistorted.x, y: undistorted.y, z: distortionModel.fl)
    }


CODE
    func distort(pixelCoord: SIMD2<Double>) -> SIMD2<Double> {
        /*
         def distort(c):
           xd, yd = c[0], c[1]
           r2 = (xd**2+yd**2)
           dr = 1+k1*r2+k2*r2*r2
           xd *= dr
           yd *= dr
           return [xd, yd]
         */
        let r2 = length_squared(pixelCoord - distortionModel.offset)
        let dr = 1.0 + distortionModel.k1 * r2 + distortionModel.k2 * r2 * r2
        return pixelCoord * dr
    }
    
    func undistort(pixelCoord: SIMD2<Double>) -> SIMD2<Double> {
        /*
         def undistort(c):
           xd, yd = c[0], c[1]
           for i in range(5): # fixed number of iterations for simplicity
             r2 = (xd**2+yd**2)
             dr = 1+k1*r2+k2*r2*r2
             xd = c[0]/dr
             yd = c[1]/dr
           return [xd, yd]
         */
        var current: SIMD2<Double> = pixelCoord
        for _ in 0..<5 {
            let r2 = length_squared(current - distortionModel.offset)
            let dr = 1.0 + distortionModel.k1 * r2 + distortionModel.k2 * r2 * r2
            current = pixelCoord / dr
        }
        return current
    }
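For cross-checking a translation like the Swift above, here is a direct Python rendering of the kernel's pseudocode (red-band INS-61503 constants as quoted earlier; focal length in pixels is FOCAL_LENGTH / PIXEL_SIZE). Note the center offset is subtracted once, before undistort, and added once, after distort:

```python
k1 = -5.9624209455667325e-08   # INS-61503_DISTORTION_K1
k2 = 2.7381910042256151e-14    # INS-61503_DISTORTION_K2
cx, cy = 814.21, -151.52       # INS-61503_DISTORTION_X / _Y
fl = 10.95637 / 0.0074         # focal length in pixels

def distort(c):
    xd, yd = c
    r2 = xd * xd + yd * yd
    dr = 1 + k1 * r2 + k2 * r2 * r2
    return [xd * dr, yd * dr]

def undistort(c):
    xd, yd = c
    for _ in range(5):               # fixed-point iteration, as in the kernel
        r2 = xd * xd + yd * yd
        dr = 1 + k1 * r2 + k2 * r2 * r2
        xd, yd = c[0] / dr, c[1] / dr
    return [xd, yd]

def xy2vector(x, y):
    cam = undistort([x - cx, y - cy])
    return [cam[0], cam[1], fl]

def vector2xy(v):
    alpha = v[2] / fl
    cam = distort([v[0] / alpha, v[1] / alpha])
    return [cam[0] + cx, cam[1] + cy]

v = xy2vector(23.5, 0.5)
n = sum(c * c for c in v) ** 0.5
print([c / n for c in v])   # close to the kernel's red FOV corner
print(vector2xy(v))         # round trip, close to [23.5, 0.5]
```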


On the pointing vs position...

CODE
spkezr_c("JUNO", 6.26092e+08, "IAU_IO", "LT+S", "IO")

//spkezr_c returned state vector = {383394, -72976.8, 2221.98, -21.9864, -1.4944, -53.9273}, light time 1.30185
// position normalized:
// (0.98234650459544559, -0.18698377487311099, 0.0056932278465209587)


CODE
let pointing = normalize(pointingVectorFor(pixelCoord: SIMD2<Double>(x: Double(154), y: Double(124))))
// -0.39825207416281955, 0.1661992570141925, 0.90209372705553315
dot(positionNormalized, pointing)
// -0.41716227233230763
// not correct...


Thanks again!
mcaplinger
post Aug 3 2020, 08:26 PM
Post #5





QUOTE (Bill Dudney @ Aug 3 2020, 10:42 AM) *
On the pointing vs position...
...// not correct...

You didn't transform the Io vector into the Junocam frame.

I suggest we take this offline since I doubt it's of much interest to the forum. You should have my email because we went over a lot of this back in 2016. Though actually, maybe someone else will have more insight into what's going on with your code.


Bill Dudney
post Aug 4 2020, 02:29 PM
Post #6





QUOTE (mcaplinger @ Aug 3 2020, 08:26 PM) *
You didn't transform the Io vector into the Junocam frame.

I suggest we take this offline since I doubt it's of much interest to the forum. You should have my email because we went over a lot of this back in 2016. Though actually, maybe someone else will have more insight into what's going on with your code.


I have a misunderstanding or some assumption that is off somewhere. I really appreciate you taking the time to help me find and correct it.

I expect these two calls to be equal and opposite:
CODE
spkezr_c("JUNO", 6.26092e+08, "IAU_IO", "LT+S", "IO")
    spkezr_c returned state vector = {383394, -72976.8, 2221.98, -21.9864, -1.4944, -53.9273}, time = 1.30185
    length(383394, -72976.8, 2221.98) 390231.12745828484

spkezr_c("IO", 6.26092e+08, "IAU_IO", "LT+S", "JUNO")
    spkezr_c returned state vector = {-383346, 72946.4, -2221.68, 21.9648, 1.50007, 53.9199}, time = 1.30167
    length(-383346, 72946.4, -2221.68) 390231.12745828496

    position delta = (47.994598865217995, -30.373592960997485, 0.29506775923073292)


Distances are more or less the same and the direction is more or less the same, but the position delta is larger than I expected. I temper my expectations with the fact that the difference is <1e-4, which is pretty small, though it still seems like a lot for double precision. Then again, with all the floating point math going on behind that simple-looking facade, maybe it's remarkable that the diff is that small.

I'd also expect this:

CODE
spkezr_c("IO", 6.26092e+08, "JUNO_JUNOCAM", "LT+S", "JUNO")
    spkezr_c returned state vector = {-158406, 67136.3, 350258, -86.7244, -73116.8, 13951.5}, time = 1.30167


to have the same magnitude, but it's off by ~50km. That's also pretty small (<1e-4) in the big picture. So maybe I don't misunderstand and these are 'the same'.

So then back to the original correction:

CODE
pointing = (-0.39825207416281955, 0.1661992570141925, 0.90209372705553315)

position = (-0.40592858453022934, 0.17204230427314171, 0.89756527885255921)

pointing dot position = 0.99994321157209098


which is pretty close.
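For a sense of scale, that dot product can be converted into an approximate pixel miss on the detector (assuming the kernel's 10.95637 mm focal length and 7.4 µm pixel size):

```python
import math

dot = 0.99994321157209098      # pointing · position from above
angle = math.acos(dot)         # angular separation, radians
fl_px = 10.95637 / 0.0074      # focal length in pixels
miss_px = angle * fl_px
print(round(angle, 6), round(miss_px, 1))  # ~0.010657 rad, ~15.8 px
```

so even a dot product of ~0.99994 corresponds to a double-digit pixel offset at Junocam's focal length.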

So maybe it's just that Io is very small and I should not expect sincpt_c to find good intersection points.

Thanks again for your time!
mcaplinger
post Aug 4 2020, 03:01 PM
Post #7





QUOTE (Bill Dudney @ Aug 4 2020, 06:29 AM) *
I expect these two calls to be equal and opposite:
spkezr_c("JUNO", 6.26092e+08, "IAU_IO", "LT+S", "IO")
spkezr_c("IO", 6.26092e+08, "IAU_IO", "LT+S", "JUNO")

"LT+S" corrections depend on observer velocity so these are not equivalent. If you used "NONE" or "LT" I think you'd find the answers were exactly the same.

Note that if you're using six significant digits for time you have about 1000 seconds of slop at the current epoch, but I assume you're just eliding for presentation.


Bill Dudney
post Aug 4 2020, 06:25 PM
Post #8





QUOTE (mcaplinger @ Aug 4 2020, 03:01 PM) *
"LT+S" corrections depend on observer velocity so these are not equivalent. If you used "NONE" or "LT" I think you'd find the answers were exactly the same.

Note that if you're using six significant digits for time you have about 1000 seconds of slop at the current epoch, but I assume you're just eliding for presentation.


Ah of course. I'll try again with NONE or LT and see what's up.

Yeah, those time values are both doubles.

So back to my original question: should I expect to see sincpt_c hitting Io if I take the pointing vector as calculated?

I don't understand the discrepancy between what sincpt_c reports and what fovtrg_c is reporting. As a test I iterated through all the pixels and did an intercept test with sincpt_c and got zero hits on frame 29 red. I do get hits on 30 red, 31 green and 32 blue, but they are off quite a bit compared to the data.

Thanks again.
mcaplinger
post Aug 4 2020, 07:17 PM
Post #9





QUOTE (Bill Dudney @ Aug 4 2020, 10:25 AM) *
So back to my original question, should I expect to see sincpt_c hitting Io?

If Io subtends more than a couple of pixels, and you iterate over all pixels, you should obviously expect to get at least one hit*. This is of course a very stressing case. For debugging it makes more sense to overlay pixel hits on Jupiter on an image of Jupiter since it's much harder to miss. 1-2 pixel errors aren't unexpected once the start times have been corrected, larger ones would be unusual.

On the FOV pyramids, note that the I kernel says "They are provided for visualization purposes, do not represent the geometry of the actual FOV sides, and are not intended for quantitative analysis."

*Assuming you have the camera model correct. If you've got a bug associated with the focal length/pixel scale, then clearly you could still miss.


Bill Dudney
post Aug 5 2020, 02:16 AM
Post #10





QUOTE (mcaplinger @ Aug 4 2020, 07:17 PM) *
If Io subtends more than a couple of pixels, and you iterate over all pixels, you should obviously expect to get at least one hit*. This is of course a very stressing case. For debugging it makes more sense to overlay pixel hits on Jupiter on an image of Jupiter since it's much harder to miss. 1-2 pixel errors aren't unexpected once the start times have been corrected, larger ones would be unusual.

On the FOV pyramids, note that the I kernel says "They are provided for visualization purposes, do not represent the geometry of the actual FOV sides, and are not intended for quantitative analysis."

*Assuming you have the camera model correct. If you've got a bug associated with the focal length/pixel scale, then clearly you could still miss.


I suspect I've got a bug. The hard part has been finding it.

I saw that note. I take it to mean 'don't use the pointing vectors for doing geometry'. I only use them in my bilinear interpolation test as a sanity check, to see whether my code is way off or not. The bilinear interpolation should at least put a bound more or less across the frustum. I wouldn't expect that approach to be accurate, but I'd expect that if fovtrg says the body is in the frustum, the bilinear interpolation should give at least a hit or two.

I assume it does not mean that I can't reproduce a 3D model from the image and the geometry data in the kernel files, especially since I see what was done in posts like this one. I'm sure I'm doing something boneheaded and just need to figure out what it is.

When I put the pixels referenced in the kernel's comments through my code I get results that are close to the values listed in the kernel, but not exact.

Here is my pointing vector to pixel code:

CODE
public func pixeleCoordFor(pointingVector: SIMD3<Double>) -> SIMD2<Double> {
    /*
     given a vector v in the JUNO_JUNOCAM reference frame, the following
     computes an XY coordinate in Junocam framelet coordinates with 0,0
     in the upper left:
       alpha = v[2]/fl
       cam = [v[0]/alpha, v[1]/alpha]
       cam = distort(cam)
       x = cam[0]+cx
       y = cam[1]+cy
     */
     //  INS-61504_FOCAL_LENGTH           = ( 10.95637 )
     //  INS-61504_PIXEL_SIZE             = (  0.0074  )
    let alpha = pointingVector.z / distortionModel.fl //   focalLength / pixelSize => 10.95637 / 0.0074
    let cam = SIMD2<Double>(x: pointingVector.x / alpha, y: pointingVector.y / alpha)
    let distorted = distort(pixelCoord: cam)
    return SIMD2<Double>(x: distorted.x + distortionModel.cx, y: distorted.y + distortionModel.cy)
}


And my pixel to pointing vector:

CODE
public func pointingVectorFor(pixelCoord: SIMD2<Double>) -> SIMD3<Double> {
    /*
      and the following takes an XY coordinate in Junocam framelet
      coordinates and produces a vector in the JUNO_JUNOCAM reference
      frame (of arbitrary length).

        cam[0] = x-cx
        cam[1] = y-cy
        cam = undistort(cam)
        v = [cam[0], cam[1], fl]
     */
    let undistorted = undistort(pixelCoord: pixelCoord - distortionModel.offset)
        
    return SIMD3<Double>(x: undistorted.x, y: undistorted.y, z: distortionModel.fl)
}


I run the following test to ensure that the values match. When I do the top left value of {23.5, 0.5} I get a vector that's close to the one in the kernel but not exactly the same.

Not sure if that means I have a bug in my camera model code or if there is some numerical difference between my code and the code used to create the values in that kernel.

CODE
// red.bounds[0] comes from the kernel (-0.47949605975108789, 0.092186759952144759, 0.872688449546977)
let topLeft = red.pointingVectorFor(pixelCoord: SIMD2<Double>(x: 23.5, y: 0.5))
// topLeft = (-0.45867789369502981, 0.088184307014605154, 0.88421610358093161)
XCTAssertEqual(dot(normalize(topLeft), normalize(red.bounds[0])), 1.0, accuracy: 0.0005) // assert the dot product is less than 0.0005 different from 1
// the acos of the dot product is 0.024131529862876068 rad, or about 1.4 degrees; not very different, but off by more than I'd expect


If I put the pointing vector in the kernel through my pointing vector to pixel routine I get (-33, 11) which is clearly not correct. But I don't see what's wrong with the code.

But my pointing vector/pixel coord conversion code seems to be more or less symmetrical.

CODE
(lldb) p normalize(Instrument.instrumentFor(id: .red).pointingVectorFor(pixelCoord: SIMD2<Double>(x: 23.5, y: 0.5)))
(SIMD3<Double>) $R22 = (-0.45867789369502981, 0.088184307014605154, 0.88421610358093161)
(lldb) p Instrument.instrumentFor(id: .red).pixeleCoordFor(pointingVector: $R22)
(SIMD2<Double>) $R24 = (23.509524706956768, 0.4981688027828568)


I have uploaded the test image that I've generated from the raw IMG file, as well as an image I made by coloring every pixel that sincpt_c says is an intersection. I pulled both into Pixelmator (like Photoshop) and overlaid them with transparency, which shows how far off the intersections are from the captured image. I can upload the full framelet image too if it helps to see Jupiter in the frame. The intersection is about 14 pixels to the left and 4 pixels down from the captured image. This is from frame 30, the red filter.

Attached Image


I know it's not easy to take time away from your real job to help me, and I very much appreciate it!
mcaplinger
post Aug 5 2020, 05:55 PM
Post #11





QUOTE (Bill Dudney @ Aug 4 2020, 06:16 PM) *
When I do the top left value of {23.5, 0.5} I get a vector that's close to the one in the kernel but not exactly the same.

When I do that I produce an answer that matches the kernel value to numerical precision.

Here's my code. There's not much to it.

Are you sure you translated undistort correctly? I'm suspicious.

CODE

def undistort(c):
    xd, yd = c[0], c[1]
    for i in range(5):
        r2 = (xd**2+yd**2)
        dr = 1+k1*r2+k2*r2*r2
        xd = c[0]/dr
        yd = c[1]/dr
    return [xd, yd]

def xy2vector(x, y, band):
    cam = [x-cx, y-cy[band]]
    cam = undistort(cam)
    v = (cam[0], cam[1], fl)
    return v

>>> spice.vhat(xy2vector(23.5, 0.5, RED))
(-0.47949601447452606, 0.09218674877062065, 0.8726884756052113)




Bill Dudney
post Aug 6 2020, 01:59 AM
Post #12





QUOTE (mcaplinger @ Aug 5 2020, 06:55 PM) *
When I do that I produce an answer that matches the kernel value to numerical precision.

Here's my code. There's not much to it.

Are you sure you translated undistort correctly? I'm suspicious.

CODE

def undistort(c):
    xd, yd = c[0], c[1]
    for i in range(5):
        r2 = (xd**2+yd**2)
        dr = 1+k1*r2+k2*r2*r2
        xd = c[0]/dr
        yd = c[1]/dr
    return [xd, yd]

def xy2vector(x, y, band):
    cam = [x-cx, y-cy[band]]
    cam = undistort(cam)
    v = (cam[0], cam[1], fl)
    return v

>>> spice.vhat(xy2vector(23.5, 0.5, RED))
(-0.47949601447452606, 0.09218674877062065, 0.8726884756052113)


Like I said, I'm a bonehead and likely doing something silly: I was double-applying the x/y offset from the camera model. The hits on Io in frame 29 for the red framelet are now off by 1 pixel up and 3 pixels to the right. I'd guess that's pretty good for tiny little Io. I might try some anti-aliasing (8x as many rays, 4 in each direction, average the results) and see if I can get things looking closer. The frame 32 blue framelet is only off by one pixel to the right. Great progress, thanks for helping me get over this hump!

Thanks again for all your help, I definitely owe you a beverage of your choice!
Bill Dudney
post Nov 5 2020, 12:36 PM
Post #13





Hi All,

Finally got to the point that I have something cool to show. Still plenty more to do but I'm stoked at where I am. It's interactive, projected onto a sphere, and will display any image from any data set from the PDS archive. The rainbow in the back is the normal to help me be sure I'm putting the data in roughly the correct spot.

Couldn't have done it without your help. Thanks again!
Attached thumbnail(s)
Attached Image
 
Bill Dudney
post Jan 25 2021, 10:01 PM
Post #14





QUOTE (Bill Dudney @ Nov 5 2020, 12:36 PM) *
Hi All,

Finally got to the point that I have something cool to show. Still plenty more to do but I'm stoked at where I am. It's interactive, projected onto a sphere, and will display any image from any data set from the PDS archive. The rainbow in the back is the normal to help me be sure I'm putting the data in roughly the correct spot.

Couldn't have done it without your help. Thanks again!

I finally had time to come back to this and get further along. Thought I'd share the latest UI. This is JNCE_2019202_21C00010_V01 from data set 12. Plenty more to do but good progress.

Attached Image


