Voyager camera pointing information
Brian Burns
post Jul 29 2016, 09:32 PM
Post #1


Sorry if this has been discussed elsewhere, but I can't seem to find much information about it - why is Voyager's pointing so haphazard, as in this video of all the RAW Jupiter images? https://www.youtube.com/watch?v=bf5QJ8iFxUs

I came across this link which says that pointing information for the images exists - http://pds-rings.seti.org/voyager/ck.html, but have also read on this forum that it's not very accurate. Would the information be useful in automatically aligning composite images and mosaics, or is it too coarse? Would it at least be useful in getting general alignments that could be refined by hand?

And does anyone know why the cameras could not be pointed more accurately, or why accurate information could not be returned with the images? I assume it was some technical limitation, but just curious what it might be.
mcaplinger
post Jul 29 2016, 09:45 PM
Post #2


QUOTE (Brian Burns @ Jul 29 2016, 01:32 PM) *
And does anyone know why the cameras could not be pointed more accurately, or why accurate information could not be returned with the images?

Voyager used an articulated scan platform with fairly coarse position feedback. It also used coarse sun and star sensors (the star sensors locked on to one bright star, no scanning of any kind) for attitude information, and for Voyager 1, one of the star sensors failed just after the Jupiter encounter -- https://oce.jpl.nasa.gov/mib/VOY-1.pdf

Remember that Voyager was designed in the early 70s, before solid-state imaging even existed. Modern spacecraft use much better star cameras for attitude information and usually don't have scan platforms (for better or worse), and even then the pointing is typically not accurate enough to perfectly line up colors.

That said, I have no idea how accurate any particular Voyager C kernel might be. Those derived from fitting the limb in the image (so-called "C-smithing") could be very accurate.


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.
Bjorn Jonsson
post Jul 29 2016, 11:22 PM
Post #3


The north azimuth that can be determined from the C kernels is accurate enough (this applies to all of the Voyager C kernels), which makes them highly useful. The pointing itself (i.e. the location of the sub-spacecraft point in the image) is typically off by ~100 pixels and needs to be corrected. The C-smithed kernels are an exception (they are much more accurate), but they are only available for the Voyager 1 Saturn images.
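As a rough sanity check on that ~100 pixel figure, here is a minimal sketch (not from this post) that converts the offset into an angle; the 0.424-degree narrow-angle field of view across the 800-pixel frame is an assumed value, not something taken from the thread.

CODE
# Convert a pointing offset in narrow-angle pixels to an angle.
# The ~0.424 deg NAC field of view over the 800-pixel frame is an assumption.
import math

FOV_DEG = 0.424
PIXELS = 800

deg_per_px = FOV_DEG / PIXELS
mrad_per_px = math.radians(deg_per_px) * 1000.0

offset_px = 100  # typical pointing error mentioned above
print("%d px ~ %.3f deg ~ %.2f mrad" %
      (offset_px, offset_px * deg_per_px, offset_px * mrad_per_px))
# -> 100 px ~ 0.053 deg ~ 0.93 mrad
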
Brian Burns
post Jul 30 2016, 02:01 AM
Post #4


Okay, thank you both for the information - it all makes more sense now.

I'm just starting to get stabilization working for the movies - it handles most simple cases - and I was planning to try to crowd-source the rest (assuming such a crowd could be found - I figured some people on Reddit might be interested). But it sounds like once the C-smithed kernels are available that wouldn't be necessary - it would be possible to generate stable movies and composites somewhat automatically.

Of course, generating accurate C kernels would be where all the hard work is...

Well, I'll post movies as they get more stable, as much as can be automated anyway - I can't stop now...
Brian Burns
post Aug 3 2016, 08:58 PM
Post #5


The links to the downloads on the C kernels page weren't working so I wrote to Mark Showalter - he is working on C-smithing the Cassini kernels right now, and plans to C-smith the Voyager kernels once the Cassini data is done, which might be another year or so.

So regarding the accuracy of the C kernels - for the kernels obtained from the SEDR data:

QUOTE
Long experience with the SEDR file has revealed its pointing information to be
accurate to typically ~ 100 narrow-angle pixels, which is equivalent to ~ 0.05
degrees or ~ 1 milliradian.

This C kernel was generated by the PDS Rings Node. We hope to generate improved
C kernels (continuous and C-smithed) at a later date.

http://pds-rings.seti.org/voyager/ck/vg1_j...e1_iss_sedr.txt


and for the C-smithed kernels (just Voyager 1 at Saturn at the moment), obtained from the SEDR data with some corrections:

QUOTE
The pointing information was "C-smithed" using two additional sources of data:

(1) Mark Showalter's personal collection of pointing corrections for 834
Voyager 1 images of Saturn and the rings.

(2) A collection of SEDR updates obtained from JPL/MIPL for 259 images
of Saturn's satellites. These updates date back to the early 1990s.

In general, pointing information should be accurate to the level of a few tens
of narrow-angle pixels, i.e., ~ 0.01 degrees or ~0.2 milliradians. However, some
errors may be larger, particularly around the closest approach period when the
scan platform was moving rapidly.

http://pds-rings.seti.org/voyager/ck/V1SAT...SS_CSMITHED.txt


So at the moment they wouldn't be accurate enough to reconstruct the movies, but maybe having centered and corrected images would be helpful in constructing more accurate C kernels.

I'm still working on setting up a usable system - at the moment it's a bit laborious to correct the mis-centered images. Ideally it would be tied into an image editor where you could outline the target with a circle and have it write a record to the centering corrections file. And then, for aligning channels for the closeups, it could start with the existing C kernels and run some image alignment code, then let the user do any fine-tuning in an editor - similarly for the mosaics. There are thousands of images, so it would need to be pretty simple and easy to do.
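For the "run some image alignment code" step, something along these lines is what I have in mind - just a sketch, assuming OpenCV's phase correlation and two roughly pre-centered grayscale frames; the sign convention on the returned shift would need checking, and none of this is existing pipeline code.

CODE
# Sketch: estimate the (x, y) shift between two channel frames by phase
# correlation, then shift one onto the other. Assumes OpenCV (cv2) and two
# roughly pre-centered grayscale frames of the same size.
import cv2
import numpy as np

def align_channel(reference, channel):
    ref = np.float32(reference)
    chan = np.float32(channel)
    # translation of 'chan' relative to 'ref' (sign convention worth checking)
    (dx, dy), response = cv2.phaseCorrelate(ref, chan)
    # affine matrix that undoes the measured shift
    m = np.float32([[1, 0, -dx], [0, 1, -dy]])
    aligned = cv2.warpAffine(channel, m, (channel.shape[1], channel.shape[0]))
    return aligned, (dx, dy), response

# usage: green_aligned, shift, conf = align_channel(orange_frame, green_frame)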

JohnVV
post Aug 3 2016, 11:51 PM
Post #6


Have you read through my post
"Voyager Images and Isis3, applications and methods"?

http://www.unmannedspaceflight.com/index.php?showtopic=8198

Use your preferred way of handling the truncated lines.

Adding a grid and then measuring the offset works well; you can then offset the non-gridded image by the x & y amount.

For non-gas-giants, a control network can be used, BUT this is easier IF the images are already close to accurate.

QUOTE
I came across this link which says that pointing information for the images exists - http://pds-rings.seti.org/voyager/ck.html, but have also read on this forum that it's not very accurate. Would the information be useful in automatically aligning composite images and mosaics, or is it too coarse? Would it at least be useful in getting general alignments that could be refined by hand?

"vger" predates PDS, the ISIS 3 (and 1 and 2) tools, and parts of VICAR.

These are reconstructed kernels built from reworked archive tapes and log books; the fact that they are THIS close is a feat in itself.

So, 100% automated? Probably not.
Brian Burns
post Aug 4 2016, 03:12 AM
Post #7


Thanks for the link - I haven't tried ISIS yet, but I'd like to someday.

At the moment the goal is to make fairly rough movies, so motion of the target between images isn't accounted for, nor is geometric distortion, but they look okay(ish) for the task. Here's the Voyager 1 Jupiter rotation movie in color, still kind of glitchy (v0.41): http://imgur.com/MgNRzyE

Maybe someday a later version could get into using ISIS to make more accurate versions, but that's kind of over the horizon at the moment.

But yeah, whatever can't be automated will have to be done by hand - I'll try to reduce that as much as possible though. :)
Ian R
post Aug 4 2016, 05:30 PM
Post #8


Without correction for geometric distortion, the Jupiter approach movies will look as though they were filmed underwater (I speak here from bitter experience).

I've already completed this particular video using a much more refined version of the method outlined in this clip from 2010:

https://www.youtube.com/watch?v=0XjW0vZZZXw

Here's a snippet of the final movie, which I can't yet release for various reasons (but look out for it soon):

https://youtu.be/ZLSD0_-3LTM?VQ=HD1080


Brian Burns
post Aug 4 2016, 07:20 PM
Post #9


QUOTE (Ian R @ Aug 4 2016, 11:30 AM) *
Without correction for geometric distortion, the Jupiter approach movies will look as though they were filmed underwater (I speak here from bitter experience).


Yeah, I'd noticed some of the frames are like that - the edges of the planet get stretched a bit, and the ultra-bright images are too large. I didn't want to use the geometrically corrected images though, at least for now, because I wanted to see what the 800x800 movies were like. But at some point I might switch to using the 1000x1000 images - I guess they would make nicer composites and mosaics also.

QUOTE
I've already completed this particular video using a much more refined version of the method outlined in this clip from 2010:

https://www.youtube.com/watch?v=0XjW0vZZZXw


That cloud warping technique is really cool.

QUOTE
Here's a snippet of the final movie, which I can't yet release for various reasons (but look out for it soon):

https://youtu.be/ZLSD0_-3LTM?VQ=HD1080


Nice - it's all cleaned up and stable! Looking forward to seeing the whole thing.

I'd love to see a complete Voyager movie of that quality, someday. I guess it would require a lot of manual cleanup - 70k+ images - might take a while...
Bjorn Jonsson
post Aug 5 2016, 12:01 AM
Post #10


QUOTE (Brian Burns @ Aug 4 2016, 07:20 PM) *
I didn't want to use the geometrically corrected images though, at least for now, because I wanted to see what the 800x800 movies were like. But at some point I might switch to using the 1000x1000 images - I guess they would make nicer composites and mosaics also.


Using the geometrically corrected images (the *_GEOMED.IMG files) makes a big difference, especially when doing mosaics. One problem with them, though, is that even though they have been flatfielded and cleaned up, there are a few subtle blemishes and dark horizontal lines left that need to be removed by using an 'extra' flatfield if you have a low-contrast scene like Saturn or some of the Jupiter closeups.

Because of this I now usually use the *_CALIB.IMG files, flatfield these using 'extra' flatfields I generated, and then use information from the *_GEOMA.TAB files to warp them. The result is an image that is geometrically identical to the *_GEOMED.IMG files but without the (small) 'residual' blemishes that are present in the *_GEOMED.IMG files (flatfielding the *_GEOMED.IMG files does not work well since the position of the reseau marks varies slightly with things like exposure, gain, target brightness etc.).
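In very rough terms, this CALIB -> 'extra' flatfield -> GEOMA warp approach might look something like the sketch below. The flatfield normalization and the tiepoint layout (matching corrected and raw line/sample pairs per row) are assumptions to check against the PDS labels, not a description of the actual code used.

CODE
# Rough sketch of the CALIB -> extra flatfield -> GEOMA warp idea.
# Assumptions: 'calib' and 'flat' are float arrays of the same shape, and the
# GEOMA tiepoints give matching (corrected line, sample) and (raw line, sample)
# pairs per row -- the exact column layout should be checked against the label.
import numpy as np
from scipy.interpolate import griddata
from scipy.ndimage import map_coordinates

def apply_extra_flatfield(calib, flat, eps=1e-6):
    flat = flat / flat.mean()          # normalize the residual flatfield
    return calib / np.maximum(flat, eps)

def warp_to_geomed(img, corrected_ls, raw_ls, out_shape=(1000, 1000)):
    lines, samps = np.mgrid[0:out_shape[0], 0:out_shape[1]]
    out_pts = np.column_stack([lines.ravel(), samps.ravel()])
    # for each output (geomed) pixel, interpolate where it came from in the raw frame
    src_l = griddata(corrected_ls, raw_ls[:, 0], out_pts, method='linear')
    src_s = griddata(corrected_ls, raw_ls[:, 1], out_pts, method='linear')
    coords = np.array([np.nan_to_num(src_l).reshape(out_shape),
                       np.nan_to_num(src_s).reshape(out_shape)])
    return map_coordinates(img, coords, order=1, mode='nearest')
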
Brian Burns
post Aug 5 2016, 02:59 AM
Post #11


That's good to know about, thanks - I'd read something about the flatfield correction not being ideal. It might be nice to add something like that to the pipeline at some point further down the line. I'll also need some good reseau-mark-removal cleanup algorithms, since the marks are so noticeable on the limbs.

And for cleaning up noisy images it could be a nice citizen science type project to host the images online somewhere so people could download some and clean them up. Someday...
JohnVV
post Aug 5 2016, 04:54 AM
Post #12


I use G'MIC (it was called GREYCstoration) but it has evolved to include basically everything ImageMagick can do, plus PDE-based smoothing and inpainting.

The PDS IMQ (yes, the IMQ) header has x/y points for the reseau marks:

CODE
  Group = Reseaus
    Line     = (12.6, 7.0, 4.0, -1.2, -4.6, -6.3, -8.0, -8.6, -8.9, -8.2,
                -10.0, -3.2, 22.9, 26.0, 22.0, 18.0, 17.0, 15.0, 14.0, 13.0,
                13.0, 14.0, 8.0, 58.0, 55.1, 51.0, 48.0, 46.0, 44.0, 43.0,
                42.0, 41.0, 41.0, 43.0, 44.0, 94.0, 90.0, 88.0, 85.0, 84.0,
                82.0, 81.0, 81.0, 80.0, 81.0, 81.0, 131.9, 130.0, 118.7, 120.0,
                169.9, 167.1, 164.8, 164.0, 161.8, 160.7, 159.9, 160.0, 158.5,
                159.0, 159.0, 208.5, 208.0, 197.6, 197.6, 247.6, 245.5, 243.8,
                242.6, 241.6, 240.6, 239.9, 239.2, 238.4, 237.7, 237.3, 286.8,
                286.0, 277.2, 276.7, 326.4, 324.7, 323.3, 322.3, 321.5, 320.6,
                320.0, 319.3, 318.5, 317.5, 316.9, 366.0, 365.2, 357.0, 356.5,
                405.6, 404.2, 403.0, 402.1, 401.3, 400.6, 399.9, 399.1, 398.3,
                397.4, 396.4, 445.2, 444.6, 436.4, 435.6, 485.0, 483.7, 482.8,
                481.9, 481.1, 480.4, 479.6, 478.9, 478.0, 476.8, 475.5, 524.4,
                524.2, 515.6, 514.6, 564.4, 563.3, 562.4, 561.6, 560.8, 560.0,
                559.3, 558.4, 557.4, 556.0, 554.3, 603.5, 603.5, 593.0, 592.6,
                643.1, 642.5, 641.0, 641.0, 640.2, 638.0, 637.0, 636.0, 635.0,
                634.4, 630.0, 682.0, 682.0, 671.0, 669.3, 720.6, 720.9, 720.0,
                719.0, 718.0, 717.0, 716.0, 714.0, 712.0, 710.0, 706.0, 757.7,
                759.1, 758.0, 758.0, 757.0, 756.0, 755.0, 753.0, 752.0, 750.0,
                746.0, 743.4, 795.4, 787.0, 787.0, 786.0, 785.0, 784.0, 783.0,
                782.0, 780.0, 776.0, 778.0, 804.0, 810.7, 808.5, 808.5, 808.0,
                807.4, 806.1, 805.0, 801.0, 798.0, 798.5, 787.0, 120.0)
    Sample   = (0.4, 44.0, 119.0, 199.9, 278.0, 356.1, 433.9, 511.6, 588.5,
                664.1, 737.1, 784.4, 6.0, 81.0, 158.0, 236.0, 314.0, 393.0,
                471.0, 549.9, 626.3, 702.0, 772.0, -5.8, 43.0, 120.0, 199.0,
                276.0, 355.0, 433.0, 512.0, 589.0, 666.0, 740.0, 792.0, 15.0,
                81.0, 159.0, 238.0, 317.0, 395.0, 474.0, 551.0, 629.0, 705.0,
                769.0, -4.6, 43.0, 744.1, 793.0, 15.3, 82.3, 159.0, 238.0,
                318.1, 396.7, 475.2, 553.3, 631.0, 707.7, 773.5, -5.3, 43.0,
                746.8, 795.7, 15.3, 82.4, 160.8, 239.0, 318.0, 397.4, 475.9,
                554.3, 632.3, 709.6, 775.7, -5.1, 43.9, 748.7, 797.8, 15.6,
                82.9, 161.3, 240.1, 319.0, 397.8, 476.4, 555.0, 633.2, 710.9,
                777.5, -4.6, 44.4, 749.8, 799.3, 16.4, 83.4, 161.7, 240.4,
                319.3, 398.1, 476.9, 555.5, 633.8, 711.7, 778.5, -3.8, 45.3,
                750.4, 801.8, 17.5, 84.4, 162.3, 240.9, 319.7, 398.4, 477.2,
                555.9, 634.3, 712.3, 779.2, -2.3, 46.4, 751.0, 799.8, 19.0,
                85.4, 163.0, 241.4, 320.0, 398.9, 477.5, 556.4, 634.7, 712.7,
                779.4, -0.5, 48.0, 752.0, 800.0, 21.3, 86.9, 163.9, 242.0,
                320.6, 401.0, 479.0, 558.0, 636.0, 712.7, 778.0, 2.6, 50.4,
                752.0, 799.5, 26.0, 89.2, 166.0, 243.0, 322.0, 401.0, 480.0,
                559.0, 637.0, 714.0, 778.0, 4.5, 54.0, 129.0, 206.0, 284.0,
                362.0, 441.0, 519.0, 598.0, 675.0, 751.0, 800.0, 22.2, 93.0,
                168.0, 245.0, 323.0, 402.0, 481.0, 559.0, 637.0, 713.0, 785.0,
                11.0, 56.6, 129.6, 205.8, 283.3, 361.5, 439.8, 518.3, 599.0,
                675.6, 748.7, 794.0, 591.0)


This was for the Voyager image "c1133337.imq" ingested into ISIS 3, before any work on it.

There is old C code for reading the IMQs:
http://pds-imaging.jpl.nasa.gov/data/vg2-n..._0009/software/
http://pds-imaging.jpl.nasa.gov/data/vg2-n...re/softinfo.txt
Brian Burns
post Aug 5 2016, 05:53 PM
Post #13


Thanks, G'MIC looks interesting, especially if it can do inpainting well.

I've got the program set up to use the later PDS volumes, so I would be using that data - it includes the reseau mark locations like so:

C1469813_RESLOC.TXT:
This is a table of the center locations of
the reseau markings in the corresponding raw image, C1469813_RAW.IMG. The
table has 202 rows, one per reseau marking. Each row contains values for the
line and sample coordinates. Note that lines and samples range from 1 to 800,
although some reseau markings can fall outside these limits. This file was
derived from the corresponding VICAR-format binary data file
C1469813_RESLOC.DAT. It has been converted to ASCII text format for users'
convenience. An extra column contains the reseau marking number as originally
identified by the Voyager Imaging Team; this number differs from the order of
the rows in the file. See Fig. 1 of
Danielson, G. E., P. N. Kupferman, T. V. Johnson, and L. A. Soderblom 1981.
Radiometric performance of the Voyager cameras. J. Geophys. Res. 86, 8683-
8689.

C1469813_RESLOC.TAB:
1, 4.9486, 12.4470, 1
2, 0.3756, 51.5631, 3
3, 2.4425,126.7564, 4
4, 1.5738,205.6022, 5
5, -0.2871,282.6453, 6
6, -0.1777,362.7857, 7
...
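Turning that table into a reseau-removal mask should be fairly mechanical - something like the sketch below, which assumes the columns are (row, line, sample, reseau number), 1-based coordinates, an 8-bit image, and OpenCV's inpaint; the patch radius is a guess.

CODE
# Sketch: build a mask from a *_RESLOC.TAB file and inpaint the reseau marks.
# Assumptions: columns are (row, line, sample, reseau number), coordinates are
# 1-based, 'image' is 8-bit, and the 4-pixel patch radius is a guess.
import cv2
import numpy as np

def remove_reseau(image, resloc_path, radius=4):
    mask = np.zeros(image.shape[:2], dtype=np.uint8)
    with open(resloc_path) as f:
        for row in f:
            parts = row.split(',')
            if len(parts) < 3:
                continue
            line, sample = float(parts[1]), float(parts[2])
            # filled disc over each reseau mark (x = sample, y = line)
            cv2.circle(mask, (int(round(sample)) - 1, int(round(line)) - 1),
                       radius, 255, -1)
    return cv2.inpaint(image, mask, radius, cv2.INPAINT_TELEA)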

For filling in large gaps, I was thinking that since the program knows what the size of the target should be (thanks to the SPICE data), once you have the image centered, you could fill in the black areas of the target from the previous good frame and blur the edges a bit. Either that or some kind of texture synthesis, e.g. http://eric-yuan.me/texture-synthesis/.

I'm still having trouble stabilizing images due to all the bad frames, so cleaning up the images might need to be a higher priority - until then the movies are going to be fairly choppy. But the expected target size from SPICE has matched the actual target very well in the images I've looked at, so that might help with centering the images, since the Hough circle detector can look for circles of a specific radius.
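For the centering step, the constrained Hough search I have in mind is roughly the sketch below - it assumes OpenCV, an 8-bit grayscale frame, and a target radius predicted from SPICE; the param1/param2 values are guesses that would need tuning.

CODE
# Sketch: look for the target's disc at (roughly) the radius predicted from
# SPICE so the frame can be recentered. Assumes an 8-bit grayscale image and
# OpenCV; the Hough parameters are guesses that would need tuning.
import cv2

def find_target_center(gray, predicted_radius_px, slack=0.05):
    blurred = cv2.GaussianBlur(gray, (9, 9), 2)
    rmin = int(predicted_radius_px * (1.0 - slack))
    rmax = int(predicted_radius_px * (1.0 + slack))
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1,
                               minDist=gray.shape[0],  # expect a single disc
                               param1=100, param2=30,
                               minRadius=rmin, maxRadius=rmax)
    if circles is None:
        return None
    x, y, r = circles[0][0]
    return (x, y), r  # offset from the frame center gives the correction
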
Brian Burns
post Aug 27 2016, 03:33 PM
Post #14


I've been having trouble getting camera pointing information from SPICE. I've tried both the older C kernels from NAIF and the newer versions from PDS, with Voyager 1 and 2 data, run through all available image times, tried the scan platform vs. the cameras, and set the time tolerance to increasingly larger values, but it still comes back with pointing information not found. I must be doing something obviously wrong - does anyone know what it is?

Here's the Python code using SpiceyPy (I also tried it in C to make sure there wasn't something wrong with the Python interface, but got the same results):

Thanks for any pointers!

CODE
"""
SPICE C-kernel (camera pointing) test

Kernels from
http://naif.jpl.nasa.gov/pub/naif/VOYAGER/kernels/ck/vgr1_super.bc
http://naif.jpl.nasa.gov/pub/naif/VOYAGER/kernels/sclk/vg100019.tsc
http://pds-rings.seti.org/voyager/ck/vg1_jup_version1_type1_iss_sedr.bc
"""

import spiceypy as spice

# load SPICE kernels
spice.furnsh('kernels/vgr1_super.bc') # voyager 1 pointing data (11mb)
# spice.furnsh('kernels/vg1_jup_version1_type1_iss_sedr.bc') # pds version jupiter (700kb)
spice.furnsh('kernels/vg100019.tsc') # voyager 1 clock data (76kb)
spice.furnsh('kernels/naif0012.tls') # leap second data (5kb)

# settings
spacecraft = -31 # voyager 1
instrument = -31001 # narrow angle camera

utcTime = "1979-01-05T15:14:10" # time of first jupiter image
ephemerisTime = spice.str2et(utcTime) # seconds since J2000
sclkch = spice.sce2s(spacecraft, ephemerisTime) # spacecraft clock string
sclkdp = spice.sce2c(spacecraft, ephemerisTime) # spacecraft clock double

tolerance = spice.sctiks(spacecraft, "0:00:400") # time tolerance
# tolerance = 1e12 # nowork

frame = 'ECLIPB1950' # coordinate frame
# frame = 'J2000' # coordinate frame

# get camera pointing information
cmat, clkout, found = spice.ckgp(instrument, sclkdp, tolerance, frame)

# clean up the kernels
spice.kclear()

# print results
print 'utc',utcTime
print 'et',ephemerisTime
print 'sclkch',sclkch
print 'sclkdp',sclkdp
print 'tolerance',tolerance
print 'frame',frame
print
print 'cmat',cmat
print 'clkout',clkout
print 'found',found

"""
=>
utc 1979-01-05T15:14:10
et -662330699.816
sclkch 2/14623:20:785
sclkdp 701344767.168
tolerance 399.0
frame ECLIPB1950

cmat [[ 0.  0.  0.]
[ 0.  0.  0.]
[ 0.  0.  0.]]
clkout 0.0
found False
"""



And here is the example C code from https://naif.jpl.nasa.gov/pub/naif/toolkit_...ice/ckgp_c.html, slightly adapted:

CODE
#include <stdio.h>
#include "SpiceUsr.h"

int main ()
{
  /*
    Constants for this program:

    -- The code for the Voyager 2 spacecraft clock is -32

    -- The code for the narrow angle camera on the Voyager 2
    spacecraft is -32001.

    --  Spacecraft clock times for successive Voyager images always
    differ by more than 0:0:400.  This is an acceptable
    tolerance, and must be converted to "ticks" (units of
    encoded SCLK) for input to ckgp_c.

    -- The reference frame we want is FK4.

    -- The narrow angle camera boresight defines the third
    axis of the instrument-fixed reference frame.
    Therefore, the vector ( 0, 0, 1 ) represents
    the boresight direction in the camera-fixed frame.
  */

#define   SC        -32
#define   INST      -32001
#define   REF       "FK4"
#define   TOLVGR    "0:0:400"
#define   NPICS     2
#define   MAXCLK    30
#define   CKCORR    "voyager2_corrected.bc"
#define   SCLK      "voyager2_sclk.tsc"
#define   FOO "naif0012.tls"


  SpiceBoolean            found;

  SpiceChar               sclkch  [NPICS][MAXCLK] =

    { { "2/18473:46:768" },
      { "2/18381:55:768" } };

  SpiceChar               clkch   [MAXCLK];

  SpiceDouble             cmat    [3][3];
  SpiceDouble             clkout;
  SpiceDouble             sclkdp;
  SpiceDouble             toltik;
  SpiceDouble             vinert  [3];

  SpiceInt                i;


  furnsh_c ( CKCORR );

  /*
    Need to load a Voyager 2 SCLK kernel to convert from
    clock string to ticks.  Although not required for
    the Voyager spacecraft clocks, most modern spacecraft
    clocks require a leapseconds kernel to be loaded in
    addition to an SCLK kernel.
  */
  furnsh_c ( SCLK );
  furnsh_c ( FOO );

  /*
    Convert tolerance from VGR formatted character string
    SCLK to ticks, which are units of encoded SCLK.
  */
  /* sctiks_c ( SC, TOLVGR, &toltik ); */
  /* printf("toltik %f\n", toltik); */
  /* toltik = 800.0; */
  toltik = 1e10;

  for ( i = 0;  i < NPICS;  i++ )
    {
      /*
        ckgp_c requires encoded spacecraft clock time.
      */
      scencd_c ( SC, sclkch[ i ], &sclkdp );

      ckgp_c ( INST,  sclkdp,  toltik, REF,
               cmat,  &clkout, &found       );

      if ( found )
        {
          /*
            The boresight vector, relative to inertial coordinates,
            is just the third row of the C-matrix.
          */
          vequ_c   ( cmat[2], vinert );

          scdecd_c ( SC, clkout, MAXCLK, clkch  );


          printf ( "VGR 2 SCLK time: %s\n", clkch  );

          printf ( "VGR 2 NA ISS boresight pointing vector: "
                   "%f %f %f\n",
                   vinert[0],
                   vinert[1],
                   vinert[2]                               );
        }
      else
        {
          printf ( "Pointing not found for time %s\n", sclkch[i] );
        }

    }

  return ( 0 );
}


mcaplinger
post Aug 27 2016, 04:53 PM
Post #15


QUOTE (Brian Burns @ Aug 27 2016, 07:33 AM) *
I've been having trouble getting camera pointing information from SPICE. I've tried both the older C kernels from NAIF and the newer versions from PDS, with Voyager 1 and 2 data, run through all available image times, tried the scan platform vs. the cameras, and set the time tolerance to increasingly larger values, but it still comes back with pointing information not found.

If it's a type 1 kernel with no interpolation, then you will only get values at specific times no matter how you set the tolerance -- I think. Does it work for the exact time of an image as calculated using the clock strings instead of the ISO time?

If it is a type 1, you could convert it to type 3 with ckspanit: http://naif.jpl.nasa.gov/naif/utilities_PC...dows_32bit.html
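For reference, one way to see what a given CK actually contains before calling ckgp is to dump its ID codes and coverage intervals - a minimal sketch along these lines, assuming SpiceyPy wraps ckobj/ckcov/wnfetd with the usual CSPICE semantics (the kernel paths are the ones from the post above).

CODE
# Sketch: list the instrument IDs and pointing coverage present in a C kernel
# before calling ckgp. Assumes SpiceyPy's ckobj/ckcov/wnfetd/scdecd behave like
# the underlying CSPICE routines; kernel paths are from the earlier post.
import spiceypy as spice

CK = 'kernels/vg1_jup_version1_type1_iss_sedr.bc'
spice.furnsh('kernels/vg100019.tsc')   # SCLK kernel, needed to decode ticks
spice.furnsh('kernels/naif0012.tls')   # leapseconds

ids = spice.ckobj(CK)                  # every ID with pointing in this CK
for k in range(spice.card(ids)):
    inst = ids[k]
    cover = spice.ckcov(CK, inst, False, 'INTERVAL', 0.0, 'SCLK')
    n = spice.wncard(cover)
    print('instrument %d: %d coverage intervals' % (inst, n))
    for i in range(min(n, 3)):         # show the first few, as SCLK strings
        beg, end = spice.wnfetd(cover, i)
        print('  %s -> %s' % (spice.scdecd(-31, beg), spice.scdecd(-31, end)))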


--------------------
Disclaimer: This post is based on public information only. Any opinions are my own.