AVIATR - Titan Airplane Mission Concept, Proposed unmanned aerial exploration of Titan
Greg Hullender
post Jul 12 2010, 08:58 PM
Post #31


Senior Member

Group: Members
Posts: 1018
Joined: 29-November 05
From: Seattle, WA, USA
Member No.: 590



Forgive my ignorance, but shouldn't the resolution just be a simple calculation from the field of view, the distance to Titan, and the dimension of the camera in pixels? I realize SAR isn't quite the same as a simple picture, but can it be THAT different?

The discussion of how accurate things need to be reminds me of something a senior astronomy major told me when I was a freshman at Caltech back in the 1970s:

Mathematicians insist on exact answers. Pi is NOT 3.1416
Chemists and Engineers generally settle for a few parts per thousand error.
Physicists are happy to get the order of magnitude right.
Except Astro-Physicists, who just want to get the order of magnitude of the order of magnitude right.
And Computer Scientists only need to know if it's zero or not.

--Greg :-)
Jason W Barnes
post Jul 12 2010, 11:21 PM
Post #32


Member

Group: Members
Posts: 131
Joined: 30-August 06
From: Moscow, Idaho
Member No.: 1086



QUOTE (Greg Hullender @ Jul 12 2010, 01:58 PM) *
Forgive my ignorance, but shouldn't the resolution just be a simple calculation from the field of view, the distance to Titan, and the dimension of the camera in pixels? I realize SAR isn't quite the same as a simple picture, but can it be THAT different?


Short answer: yes.

The real problem, though, and one that is quite common, is that what you're describing is the pixel scale. It is not the resolution. Resolution is the smallest thing that you can resolve, which is always larger than a pixel. In fact the theoretical minimum is 2 pixels. Experience with HiRISE is that about 3 pixels is what it really takes to resolve an object.

Take Cassini ISS looking at Titan, for instance. You can do the calculation that you describe, and calculate the pixel scale. But no matter how small the pixels are, you can't achieve a true resolution better than 1 km. The atmospheric haze scatters too much to do any better.

Which is why, even with Ralph's discussion, I am still convinced that with 300m properly-sampled RADAR "pixels", the RESOLUTION is ~750m.
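Jason's distinction between pixel scale and resolution can be made concrete with a back-of-the-envelope sketch. All numbers below are hypothetical, using the ~3-pixel rule of thumb he cites:

```python
import math

def pixel_scale_m(ifov_rad: float, distance_m: float) -> float:
    """Ground distance spanned by one pixel (small-angle approximation)."""
    return ifov_rad * distance_m

def resolution_m(ifov_rad: float, distance_m: float, pixels_needed: float = 3.0) -> float:
    """Smallest object reliably resolved, using the ~3-pixel rule of thumb."""
    return pixels_needed * pixel_scale_m(ifov_rad, distance_m)

# Hypothetical camera: 10-microradian IFOV, imaging from 10,000 km away.
ifov, dist = 10e-6, 10_000e3
print(pixel_scale_m(ifov, dist))   # 100 m per pixel...
print(resolution_m(ifov, dist))    # ...but ~300 m true resolution
```

Swapping in real instrument parameters changes the numbers, but not the roughly 3x gap between pixel scale and usable resolution that Jason describes.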

- Jason
rlorenz
post Jul 13 2010, 12:46 AM
Post #33


Member

Group: Members
Posts: 609
Joined: 23-February 07
From: Occasionally in Columbia, MD
Member No.: 1764







QUOTE (Jason W Barnes @ Jul 12 2010, 06:21 PM) *
Which is why, even with Ralph's discussion, I am still convinced that with 300m properly-sampled RADAR "pixels", the RESOLUTION is ~750m.


The definition of the SAR resolution from Doppler bandwidth etc. (giving 300 m) is physically equivalent to the one that defines the resolution of an optical telescope as ~1.3 λ/D. I stand by my original number. The pixels are 175 m, which as you say is irrelevant.
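For readers following along, the optical analogue Ralph mentions is the diffraction limit of a filled aperture. A quick sketch; the Rayleigh coefficient k ≈ 1.22 (Ralph quotes ~1.3) and the antenna/range numbers are illustrative, not mission values:

```python
import math

def diffraction_limit_rad(wavelength_m: float, aperture_m: float, k: float = 1.22) -> float:
    """Angular resolution of a filled (real) aperture: theta ~ k * lambda / D."""
    return k * wavelength_m / aperture_m

# Illustrative numbers: a ~2.2 cm (Ku-band) radar behind a 4 m dish at 1000 km range.
theta = diffraction_limit_rad(0.022, 4.0)
print(theta)            # ~0.0067 rad
print(theta * 1000e3)   # real-aperture ground footprint: several kilometres
```

SAR reaches hundreds of metres from the same hardware precisely because it synthesizes a much longer aperture from the platform's motion; the Doppler-bandwidth definition Ralph cites plays the role that λ/D plays for a telescope.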

QUOTE (Jason W Barnes @ Jul 12 2010, 06:21 PM) *
The real problem, though, and one that is quite common, is that what you're describing is the pixel scale. It is not the resolution. Resolution is the smallest thing that you can resolve, which is always larger than a pixel. In fact the theoretical minimum is 2 pixels. Experience with HiRISE is that about 3 pixels is what it really takes to resolve an object.


On this Jason and I might agree. The original definition of resolution is all about the diffraction patterns of point objects overlapping: to separate two objects a distance X apart, you need a resolution better than X, and really a pixel scale better than X/3 (so you see some dark space between your two bright points). But you can still do science at much less than the pixel scale (e.g. by fitting a point-spread function, you can determine the position of an object not only to much better than the resolution, but also to much better than the pixel scale; 1/10 of a pixel is not uncommon). But this sort of thing (and 'super-resolution' techniques) relies on well-characterized point-spread functions and high signal-to-noise data. Which brings me back to my original point: 'useful' resolution depends on what you are trying to do and on the signal/noise.
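Ralph's point about locating a source to a fraction of a pixel is easy to demonstrate. A toy, noiseless sketch (Gaussian PSF with an assumed sigma of 1.5 px; real data adds noise and an imperfectly known PSF):

```python
import numpy as np

# A star is geometrically tiny, but its image (the PSF) spreads over several
# pixels; sampling that spread locates the star far below the pixel scale.
true_x = 10.37                                     # sub-pixel position to recover
xs = np.arange(21.0)                               # 21 one-pixel-wide samples
psf = np.exp(-0.5 * ((xs - true_x) / 1.5) ** 2)    # Gaussian PSF, sigma = 1.5 px

# First-moment (centroid) estimate from the noiseless samples:
est_x = np.sum(xs * psf) / np.sum(psf)
print(abs(est_x - true_x))    # error far below one pixel
```

With realistic noise the error grows, but at high signal-to-noise the 1/10-pixel figure Ralph quotes is routinely reached.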

nprev
post Jul 13 2010, 01:00 AM
Post #34


Merciless Robot

Group: Admin
Posts: 8783
Joined: 8-December 05
From: Los Angeles
Member No.: 602



Speaking as a layperson: very interesting discussion, you guys. I had no idea that 'resolution' as a figure of merit had so many contingencies other than atmospherics; I'd never thought about it.


--------------------
A few will take this knowledge and use this power of a dream realized as a force for change, an impetus for further discovery to make less ancient dreams real.
Greg Hullender
post Jul 13 2010, 02:33 PM
Post #35


Senior Member

Group: Members
Posts: 1018
Joined: 29-November 05
From: Seattle, WA, USA
Member No.: 590



I think I see. Part of my confusion is that, in the computer biz, when we talk about the resolution of a screen, we just mean the pixels. So, if I understand correctly, when you guys talk about resolution, you include all the factors that could degrade the image: the pixel scale, of course, but also atmospheric noise, diffraction, probably even noise in the electronics themselves. Beyond a certain point (all other things being equal), shrinking the pixel scale further will not improve resolution at all. And so the dispute you two are having is not over the actual hardware being used but over the effect of these other factors?

Ralph: When you talk about doing science below the pixel scale, are you talking about making repeated observations of the same thing and computing a higher-resolution model from that? That is, you have to depend on having a static target. Or do you mean something more complex? (I may be guilty of seeing Bayesian and Markov Networks everywhere these days.) ;-)

--Greg
vjkane
post Jul 13 2010, 09:08 PM
Post #36


Member

Group: Members
Posts: 704
Joined: 22-April 05
Member No.: 351



Properly speaking, resolution should be interpreted in terms of what you want to detect. Haze, low contrast, size of structure (frequently referred to as 'scale'), etc. can all affect how finely you can discriminate what you want to detect. To give an example: I may want to image trees that are 10 m across in an image in which each pixel covers an area of 1x1 m. So the pixel resolution is 1 m, but my object resolution is 10 m. If tree sizes varied, the smallest tree I could reliably detect would probably have a crown 2 m across. It's actually more complicated than this, since the tree probably wouldn't be exactly centered, and I'd end up with some pixels that are all tree and some that are a mixture of tree and background, making the tree ID harder. So my best reliable resolution (i.e., smallest tree) is probably 3x3 m.

However, people who build optical and camera systems want a way to compare the theoretical capabilities of the hardware. Sometimes lines per inch are quoted (I've seen this in camera lens reviews; that figure ignores the grain of the film, which would be equivalent to the size and density of pixels in an electronic system). In planetary missions, the instantaneous field of view (IFOV) is often given, which describes the angle seen by an individual pixel. To get the theoretical resolution, you need this information and the distance to the object being imaged.

Computer monitor resolution is usually quoted as an array of pixels (e.g., 1600x1200), and the pitch between pixels is roughly (though not exactly) equivalent to the IFOV in camera systems. Every camera forms images in an array of pixels x by y in size (push-broom cameras have a single line of pixels, and spacecraft motion creates the y dimension). Image size can be quoted in x by y dimensions, but that says nothing about the resolution: you can put the same 1000x1000 CCD chip behind either a telephoto or a wide-angle lens and get very different resolutions covering very different areas on the surface.

The above is not my area of specialty, so others may add or correct.

What is my area of specialty is sub-pixel interpretation of Landsat scenes. Each pixel of a Landsat scene has several 'colors' that represent key spectral ranges. (More technically, each Landsat image is really several images, each of which was taken through a different filter.) If you know the dominant materials within a scene, you can use that knowledge to determine the approximate area that each material represents within the area imaged by an individual pixel. To continue with my tree analogy, if you know that everything in the picture is either tree canopy or soil background, you can model how much of each pixel's area is covered by each material. The technique is called spectral unmixing.

It sounds like similar approaches can be used with radar data. The key, though, is that you need to know a lot about the surface you are imaging.
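The linear-mixing model at the heart of spectral unmixing is compact enough to sketch. Everything here is made up for illustration: two hypothetical bands and two endmembers (canopy and soil):

```python
import numpy as np

# Hypothetical endmember spectra (rows = bands, columns = materials):
# reflectance of pure canopy and pure soil in two spectral bands.
E = np.array([[0.05, 0.30],    # band 1: canopy dark, soil brighter
              [0.50, 0.25]])   # band 2 (near-IR): canopy bright

# Observed pixel spectrum: a linear mix of the two endmembers.
true_fracs = np.array([0.7, 0.3])        # 70% canopy, 30% soil
pixel = E @ true_fracs

# Least-squares unmixing recovers the per-pixel area fractions.
fracs, *_ = np.linalg.lstsq(E, pixel, rcond=None)
print(fracs)    # recovers [0.7, 0.3]
```

With more bands than endmembers the system is overdetermined and the same least-squares machinery still applies; real pipelines also constrain the fractions to be non-negative and to sum to one.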


rlorenz
post Jul 14 2010, 04:54 PM
Post #37


Member

Group: Members
Posts: 609
Joined: 23-February 07
From: Occasionally in Columbia, MD
Member No.: 1764



QUOTE (Greg Hullender @ Jul 13 2010, 10:33 AM) *
And so the dispute you two are having is not over the actual hardware being used but over the effect of these other factors?


I don't know what the dispute was about. As far as I am concerned there is no dispute: the resolution as normally defined is 350 m, and that's that.


QUOTE (Greg Hullender @ Jul 13 2010, 10:33 AM) *
Ralph: When you talk about doing science below the pixel scale, are you talking about making repeated observations of the same thing and computing a higher-resolution model from that? That is, you have to depend on having a static target. Or do you mean something more complex? (I may be guilty of seeing Bayesian and Markov Networks everywhere these days.) ;-)


What you describe sounds a bit like how I understand 'super-resolution' works. (I think the procedure can also be referred to as 'dithering'; it was used on Pathfinder, and also on HST.) Radio astronomers (with typically low angular resolutions defined by the real aperture) use similar methods, e.g. by allowing objects to pass through the beam as the Earth or spacecraft rotates. The key is having a well-defined PSF, and having a precise enough pointing history to know where in the scene each detector pixel actually falls.

But it can be as simple as taking an image (many pixels) of an object which is geometrically smaller than a pixel (e.g. a star) but whose image, as defined by the telescope's optical system, is much larger. Sampling many pixels lets you estimate where the star was to much better than one pixel, if the point-spread function is known. That's a nicely-posed problem for a point source like a star; the cleverness (Maximum Entropy, Bayesian, whatever) comes in how you deconvolve that PSF from the image to make a best estimate of a more complex scene (non-point objects like planets, many stars, plus some noise).
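An idealized sketch of the dithering idea Ralph and Greg are circling: two coarse exposures offset by half a pixel jointly sample the scene twice as finely. This toy version interlaces perfect point samples; real detectors average over each pixel, which is why the PSF and pointing history matter so much:

```python
import numpy as np

# A fine "scene": samples at half-pixel spacing that one coarse exposure misses.
fine = np.sin(np.linspace(0, 4 * np.pi, 40))

# Two coarse exposures of the same scene, dithered by half a (coarse) pixel:
# each keeps every other fine sample, starting at a different offset.
exposure_a = fine[0::2]
exposure_b = fine[1::2]

# Interlacing the dithered exposures restores the full fine sampling.
combined = np.empty_like(fine)
combined[0::2] = exposure_a
combined[1::2] = exposure_b
print(np.allclose(combined, fine))
```

The placement of each exposure on the fine grid is assumed to be known exactly here; in practice that is exactly the "precise pointing history" requirement.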
Juramike
post Sep 7 2010, 03:58 AM
Post #38


Senior Member

Group: Moderator
Posts: 2785
Joined: 10-November 06
From: Pasadena, CA
Member No.: 1345



Artist's impression of the AVIATR aeroshell and parachute on the way to the lower atmosphere of Titan:

Attached Image


A MUCH higher-resolution image is here: http://www.flickr.com/photos/31678681@N07/4966028229/


--------------------
Some higher resolution images available at my photostream: http://www.flickr.com/photos/31678681@N07/
Juramike
post Sep 28 2010, 03:28 AM
Post #39


Senior Member

Group: Moderator
Posts: 2785
Joined: 10-November 06
From: Pasadena, CA
Member No.: 1345



Artist's impression of the AVIATR Titan airplane over Equatorial Bright Terrain. This has the latest aircraft design configuration (Configuration 3.0).

Attached Image


A higher resolution version is here: http://www.flickr.com/photos/31678681@N07/5032173290/


Juramike
post Jun 10 2011, 01:11 AM
Post #40


Senior Member

Group: Moderator
Posts: 2785
Joined: 10-November 06
From: Pasadena, CA
Member No.: 1345



Reworked some of the AVIATR images with the latest configuration (AVIATR CF 3.0):

Here is an animated GIF of a camera tracking the AVIATR craft over a north polar Titan lake:

Attached Image

[animated GIF: click to animate]

Full-res still image of AVIATR flying over north polar Titan lake: http://www.flickr.com/photos/31678681@N07/...57623741262207/

Full-res reworked image of AVIATR over Bright Terrain: http://www.flickr.com/photos/31678681@N07/...57623741262207/


Juramike
post Nov 19 2011, 06:16 PM
Post #41


Senior Member

Group: Moderator
Posts: 2785
Joined: 10-November 06
From: Pasadena, CA
Member No.: 1345



Link to OPAG AVIATR mission presentation slides by Jason Barnes: http://www.lpi.usra.edu/opag/Oct2011/prese...IATR_Barnes.pdf


ngunn
post Nov 19 2011, 10:58 PM
Post #42


Senior Member

Group: Members
Posts: 3516
Joined: 4-November 05
From: North Wales
Member No.: 542



Some of your images there, Mike? Don't be shy!

Good proposal and illustrations. :-)
nprev
post Nov 20 2011, 12:23 AM
Post #43


Merciless Robot

Group: Admin
Posts: 8783
Joined: 8-December 05
From: Los Angeles
Member No.: 602



Nice. I'd like to see this one fly (literally).

Only thing that concerns me is the UAV mechanical deployment sequence (which seems complex, to say nothing of time-critical), and I suspect that this will be a major point of evaluation.


stevesliva
post Nov 20 2011, 03:53 AM
Post #44


Senior Member

Group: Members
Posts: 1576
Joined: 14-October 05
From: Vermont
Member No.: 530



Uses an ASRG now. First nuclear plane. That's pretty sexy.

Actually, I wonder if DARPA would fund it. Nuclear UAVs wouldn't fly now, but nuclear-UAV research? Why not.
Jason W Barnes
post Nov 24 2011, 01:39 AM
Post #45


Member

Group: Members
Posts: 131
Joined: 30-August 06
From: Moscow, Idaho
Member No.: 1086



QUOTE (nprev @ Nov 19 2011, 06:23 PM) *
Only thing that concerns me is the UAV mechanical deployment sequence (which seems complex, to say nothing of time-critical)


The deployment isn't too bad -- one joint on each wing unfolds, and that's it. And there's plenty of time to do it -- this isn't Mars, you know! One-seventh of Earth's gravity and four times its air density mean we've got about 12 hours between entry and when we'd hit the ground. So it's not nearly as hair-raising as on Mars.
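Jason's timing argument follows from how terminal velocity scales with gravity and density, v ∝ √(g/ρ). A sketch with entirely hypothetical entry-system numbers (mass, drag coefficient, and area are placeholders, not AVIATR design values):

```python
import math

def terminal_velocity(mass_kg: float, g: float, rho: float, cd: float, area_m2: float) -> float:
    """Steady-state descent speed where drag balances weight: sqrt(2mg / (rho*Cd*A))."""
    return math.sqrt(2 * mass_kg * g / (rho * cd * area_m2))

# Placeholder entry-system numbers (NOT actual AVIATR values):
mass, cd, area = 200.0, 1.5, 5.0

v_earth = terminal_velocity(mass, 9.81, 1.2, cd, area)   # Earth sea level
v_titan = terminal_velocity(mass, 1.35, 5.3, cd, area)   # Titan surface: g ~ 1/7, rho ~ 4x
print(v_titan / v_earth)   # ~0.18: descent several times slower than on Earth
```

Note that the mass/drag placeholders cancel in the ratio, which is why the qualitative conclusion (a Titan descent is several times slower than an equivalent one on Earth) is robust to the made-up numbers.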

- Jason
