Jim Bell Q'n'A Posted! ...your questions answered
ElkGroveDan
post Feb 1 2006, 04:54 AM
Post #16


QUOTE (elakdawalla @ Feb 1 2006, 01:37 AM)
Lyford stepped up to transcribe the first 4 minutes of this; does anyone want to transcribe more?

--Emily

All right, all kidding aside: I'm good for 6:00-10:00.

It's here: http://www.unmannedspaceflight.com/index.p...388&#entry39388


--------------------
If Occam had heard my theory, things would be very different now.
slinted
post Feb 2 2006, 11:06 AM
Post #17


I'd be happy to take 10:00-end (of the first interview, Jan 26th)
ElkGroveDan
post Feb 2 2006, 06:19 PM
Post #18


QUOTE (slinted @ Feb 2 2006, 11:06 AM)
I'd be happy to take 10:00-end

Note that I went over my 10:00 up to 11:00:

http://www.unmannedspaceflight.com/index.p...388&#entry39388

EDIT: My mistake, that was the SECOND interview that I did.


--------------------
If Occam had heard my theory, things would be very different now.
slinted
post Feb 3 2006, 12:55 AM
Post #19


Pancam update 1, Jan 26, 9:58 - 16:37

JB: ...It makes it challenging to visualize the surface in true color with those filters, and that's an important goal that we are always thinking about. But if we are forced to rank our goals, if the bits are restricted or the power is restricted, then we always choose on the side of greatest scientific coverage of the wavelength range as opposed to human coverage of the wavelength range. If that makes sense.

DE: It does.

JB: OK

DE: And I share your frustration with making the best color you can from 2-5-7.

JB: (laughing) Right.

DE: We are getting another transit season for Opportunity at the moment. You had a paper in Nature last July with 6 transits, 4 of Phobos and 2 of Deimos. How many are we expecting this time around? Are they all done?

JB: They're done. We've just finished for Opportunity. There were only 4 events visible for Opportunity, 1 Deimos event and 3 Phobos events. They all worked beautifully, so that is the end of transit season for Opportunity. Spirit has a short transit season coming up in February with 3 Phobos events, so we'll try to observe those, and then we need to wait another three hundred and something sols for the geometry to be transit-friendly again.

DE: The Phobos transits seem to have employed something that I've only recently found on my new digital camera, a kind of Pancam Sport Mode.

JB: (laughing) Sport Mode.

DE: Getting frames out very, very quickly, much quicker than before. With the Navcam sequences we were seeing 20 seconds or so between frames, and the Deimos imaging sequences were at 10 seconds, but you seem to have got that down to a lot less. What's the story behind that?

JB: Typically, the fastest that we can take pictures, using all the normal default modes of the cameras, is about once every 10 seconds. There is a mode, however, that Mark Lemmon, a colleague of mine who is an atmospheric scientist and spends a lot of time thinking about sky observations, dust devils and astronomical observations, remembered from calibration: if you're taking a subframe, just a part of the image, that subframe can be extracted from the detector either using software, which is the default mode, or there is actually a way to do it in hardware, which is much, much faster, 2 to 3 times faster, if you just let the hardware do the transfer of the subframe. It only works under specific combinations of parameters for the instruments, and we have to be extra careful doing the sequencing and commanding. But we tried it with some tests a few weeks ago and found that it worked. We had done it a long time ago in calibration, but all of us are so tired that we forgot, except for Mark. So we did some tests with some dust devil and cloud search sequences on Mars. Everything worked great, and then we fired them off. Our first real, live, scary tests were with these Phobos sequences. You can see that for some of the images we obtained up to about 3 seconds spacing between images, so better than 3 times what we normally get. And of course, the more time resolution we get on these events, it's astrometry, it's positional science, the better we can determine the position. With our typical 10 second spacing for transits we could nail the position of Phobos down to about a kilometer in the sky; with 3 seconds we get down to under half a kilometer.
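To put rough numbers on that cadence-versus-astrometry tradeoff, here is a minimal Python sketch. The ~10 second default spacing, the ~3 second hardware-subframe spacing, and the roughly one kilometer precision at 10 second spacing come from Bell's description above; the ~30 second transit duration and the assumption that precision scales linearly with frame spacing are illustrative simplifications, not the team's actual fitting procedure.

CODE
# Rough numbers for the cadence-vs-astrometry tradeoff described above.
# The 10 s default spacing, the ~3 s hardware-subframe spacing and the
# ~1 km precision at 10 s spacing are from the interview; the 30 s
# transit duration and the linear precision scaling are assumptions.

def frames_during_transit(transit_s: float, spacing_s: float) -> int:
    """How many images land inside a transit of the given duration."""
    return int(transit_s // spacing_s) + 1

def astrometric_precision_km(spacing_s: float,
                             ref_spacing_s: float = 10.0,
                             ref_precision_km: float = 1.0) -> float:
    """Assume positional precision scales linearly with frame spacing."""
    return ref_precision_km * spacing_s / ref_spacing_s

if __name__ == "__main__":
    TRANSIT_S = 30.0             # assumed typical Phobos transit duration
    for spacing in (10.0, 3.0):  # default readout vs. hardware subframe
        n = frames_during_transit(TRANSIT_S, spacing)
        p = astrometric_precision_km(spacing)
        print(f"{spacing:4.1f} s spacing: ~{n} frames, ~{p:.2f} km position")

The linear scaling is only a proxy for the real fit, which also depends on centroiding accuracy and on combining multiple events, but it lands close to the "under half a kilometer" figure quoted for the 3 second spacing.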

DE: This is the sort of science that might feed forward to the navigation camera on Mars Reconnaissance Orbiter, which will arrive soon.

JB: Yes, absolutely and also Mars Express as well. The Mars Express camera team has been targeting Phobos and Deimos from orbit with their high resolution camera. This is getting folded into that as well.

DE: Mars Global Surveyor, some years ago I think, took a wide-angle color image of an eclipse shadow on the ground, and the Viking landers, with their primitive cameras, scanned a vertical slice during an eclipse and saw the terrain darkening. Will there come a point where the science return of observing these transits directly starts to diminish, and you might instead use the Navcam or one of the Hazcams to watch the terrain darken during a good Phobos eclipse?

JB: That day may come. By the time the next transit season rolls around, we should know whether the best scientific thing to do is still to keep observing the transit directly, or whether we can afford to have the patience, and the guts, not to look right at the sun but to look down instead at the ground, because we can't do both at the same time, and just observe the effect of the shadow passing over the terrain.

DE: Do you think you could use Sport Mode with the Hazcam?

JB: (laughing) I suppose it's possible. Although, we could certainly use the Navcam or the Pancams to do that. The Pancam especially is a calibrated camera, so we could actually get quantitative measurements of how the brightness is changing.

DE: Well, hopefully this won't be the first and last of these that we do. I've taken too much of your time already but hopefully people will think of new cunning questions and we'll talk again soon.

JB: That would be great. I really welcome people's questions, and I know a lot of people want to learn more about how the cameras work and why, especially why the team does things the way they do. It's often hard to decipher what's happening from day to day. All I can assure you is that it makes sense at the time (laughs).

DE: (laughing) The results might not, but the idea to do it does.

JB: Absolutely


DE: And so that's it for the first Pancam Update but as you've heard, not the last. So if you have any questions you'd like to ask in the next update, then please email them to blog@planetary.org and Emily Lakdawalla will forward them on to me. I’d like to thank Emily and everyone at The Planetary Society for hosting these files and I’ll speak to you soon, again, for the next Pancam Update.
Bob Shaw
post Feb 3 2006, 01:03 PM
Post #20


Jim Bell mentioned that the Mars Express camera team has been targeting Phobos and Deimos from orbit with their high resolution camera...

...so there may be some interesting images, eventually...

Bob Shaw


--------------------
Remember: Time Flies like the wind - but Fruit Flies like bananas!
odave
post Feb 3 2006, 05:58 PM
Post #21


Sorry this is late - it's really annoying when work interferes with UMSF!

To make it up to you guys, I went from 4:00 up to Slinted's start time, so now we've got the transcription of the first interview done.

--------

Pancam update 1, Jan 26
04:00-09:58

DE: Spirit's currently doing a lot of driving, whereas Opportunity's kind of doing a lot of the opposite. How do the two situations differ in terms of imaging planning? When Spirit's driving, can you pluck sequences from a library of end-of-drive sequences, or is it more complicated than that?

JB: Yeah, actually it's a very interesting comparison right now. They're almost exactly opposite situations. With Spirit, we need to get through the Inner Basin to Home Plate and up to a north-facing slope pretty quickly, because we just passed the fall equinox, of course, in the south, and we're heading towards winter, and the power is dropping, dropping, dropping. So we can't linger on south-facing slopes or even flat places. We're kind of racing the clock to try to get over to Home Plate, which everybody's very excited about; we don't really know what that is, but we're gonna be there soon. And we're passing lots of juicy rocks and ridges and outcrops, and we're trying to do the best job we can of doing some basic characterization of them with the cameras and with the Mini-TES spectrometer, taking color pictures and infrared spectra, making whatever measurements we can on our traverse that minimize the time taken out of the drive. We've just been flying across the terrain as quickly as possible, trying to get to our next juicy target zone.

Whereas on the other side of the planet, at Meridiani, it's the flip side. We've been forced to sit in one spot because of the problems with the shoulder joint on Opportunity's arm. The science team has understandably been trying to be very patient with the engineering team, because they need to do their job well: diagnosing what's happening with the shoulder joint, how to use it in its new configuration, and how to address the issues of driving with the shoulder joint either deployed or stowed. Exactly how to do that is still being discussed. And all the time, while these discussions and the simulations with the test rover and the modeling go on, and those all take time, we've been at this one spot. So we've been taking enormous panoramas; we took a 360 degree panorama in all the filters, the first time we've been able to do that. We've been doing high spatial resolution, what we call super-rez, monitoring of a bunch of what turn out to be very interesting rocks all around us. It's been a spectacular place to get stuck, if you have to get stuck.

DE: A whole range of things you might never have noticed had you not got stuck.

JB: Absolutely, absolutely right. And because we had this extra time, it’s been sort of a bonus to use some of the capabilities of the cameras that we don’t often get to use, like the super-rez capability. And so we’ve been shooting targets all around us, we took a large stereo panorama all the way around the rover in lossless, without any loss of information in compression, using the blue filters. And that’s a wonderful product for the sedimentologists and the morphologists who are looking at these rocks in great detail.

But what's happened is that we've been at this spot for so long, and believe me it's frustrating for the science team, but we have to let the engineering folks do their job and make sure that the health of the rover comes first. We've been sitting at this spot for so long that we're literally running out of things to do with the cameras; it's almost like we've been a lander mission, and nobody ever really wanted to be a lander mission. Luckily we were able to do this little bump a few days ago and move to a spot where we can get the microscope down onto some of these juicy targets. We'll be doing some more of that over the coming days, perhaps even longer; it just depends on what we see. But a lot of people are antsy to get back on the road to Victoria.

DE: It seems the longer Opportunity sits still, the further away Victoria seems.

JB: (laughs) It only seems that way, but yes.

DE: We've got quite a few questions submitted. A guy called edstrick has asked about the different combinations of filters you use. Most sequences for color Pancam imaging tend to be L257 or L256.

JB: Right.

DE: And some with a photometry label get done in 247. What are the differences between them and how do you pick which set of three is most appropriate for a particular situation?

JB: I would say 257 is our most common combination, mostly because 2 and 7 give us the widest range of wavelength coverage, and 5 gives us a point in the middle, which also happens to be green, that we can use along with 2 and 7 to make these approximate true color renderings, or these garish false-color renderings that we often release onto the website. Primarily, that's sort of our tactical, "you've got the least amount of bits to spend, but you want to cover the most amount of wavelength", left-eye filter set. You know, Pancam can see farther into the infrared and deeper into the ultraviolet than human eyes can. And we try to exploit that capability because it gives us sensitivity to different kinds of iron-bearing rocks, iron-bearing minerals, so spanning the most wavelength gets us the greatest sensitivity to color variations...
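To make the filter bookkeeping concrete, here is a minimal Python sketch of stacking three single-filter frames into an approximate-color composite. The band centers of roughly 753 nm (L2), 535 nm (L5) and 432 nm (L7) are the commonly published Pancam values; the file names, the percentile stretch and the direct channel assignment are illustrative assumptions, not the team's calibration pipeline.

CODE
# A minimal approximate-color composite from three left-eye filter frames.
# Filter band centers (~753 nm L2, ~535 nm L5, ~432 nm L7) are the commonly
# published Pancam values; file names and the percentile stretch are
# illustrative, not the actual calibration pipeline.
import numpy as np
from PIL import Image

def load_gray(path: str) -> np.ndarray:
    """Load one single-filter frame as a float array (the stretch normalizes it)."""
    return np.asarray(Image.open(path).convert("F"), dtype=np.float64)

def stretch(band: np.ndarray, lo_pct: float = 1.0, hi_pct: float = 99.0) -> np.ndarray:
    """Linear contrast stretch between two percentiles, clipped to [0, 1]."""
    lo, hi = np.percentile(band, [lo_pct, hi_pct])
    return np.clip((band - lo) / (hi - lo + 1e-12), 0.0, 1.0)

def l257_composite(l2_path: str, l5_path: str, l7_path: str) -> Image.Image:
    """Map L2 (near-IR) -> red, L5 (green) -> green, L7 (blue) -> blue."""
    rgb = np.dstack([stretch(load_gray(p)) for p in (l2_path, l5_path, l7_path)])
    return Image.fromarray((rgb * 255).astype(np.uint8))

if __name__ == "__main__":
    # Hypothetical file names for three frames of the same scene.
    l257_composite("scene_L2.png", "scene_L5.png", "scene_L7.png").save("scene_L257.png")

Because L2 sits in the near-infrared, a straight L2-to-red assignment like this stretches the color response beyond what human eyes would see, which is part of how the same three filters can yield both approximate true-color renderings and the deliberately garish false-color products Bell mentions.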

EDIT: Corrected spelling of edstrick's user ID. Must remember: edSTRICK, tedSTRYK


--------------------
--O'Dave
