HST Albedo Map Processing
ZLD
post Jul 28 2015, 02:07 AM
Post #16

That's the type of correspondence I was looking for, JRehling. Thank you for this idea. I'd be highly interested in trying this.

I do think the imaging device can play a very large role in whether this works, though. Consumer products introduce much more noise, which becomes very apparent when doing this, compared with much higher-quality research-grade CCDs. I've tried this to some extent.


Guest_alex_k_*
post Jul 28 2015, 09:56 AM
Post #17

Hi ZLD,
It's interesting; can your algorithm extract details from this image?
ZLD
post Jul 28 2015, 03:51 PM
Post #18

I only worked on a small crop due to the strange shape of the image. Sudden contrast changes, such as at the edges, play havoc.
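(Side note, purely as an illustration and not my actual pipeline: one common way to keep a crop's artificial borders from ringing during frequency-domain sharpening is to taper them with a cosine roll-off, roughly like the sketch below; the crop coordinates and taper width are arbitrary.)

CODE
import numpy as np

def tapered_crop(img, y0, y1, x0, x1, taper=16):
    """Crop a grayscale image and apply a cosine (Hann-style) roll-off at the
    crop borders, so the artificial edges introduced by cropping do not ring
    when the crop is later sharpened or deconvolved in the frequency domain."""
    crop = img[y0:y1, x0:x1].astype(float)
    h, w = crop.shape
    ramp = 0.5 * (1.0 - np.cos(np.linspace(0.0, np.pi, taper)))  # rises 0 -> 1
    win_y, win_x = np.ones(h), np.ones(w)
    win_y[:taper], win_y[-taper:] = ramp, ramp[::-1]
    win_x[:taper], win_x[-taper:] = ramp, ramp[::-1]
    return crop * np.outer(win_y, win_x)
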

Attached Image


Attached Image


Attached Image


I know pretty much nothing about 67P or the Rosetta mission other than the occasional news bits that come out. Is there any context to this / any idea where it actually is?

As a further note, this could be way off. There was a lot of motion, and Philae was slightly rotating during capture, so it is very possible that some of the surface features are quite distorted or mangled.

Edit: Rotating the image 180 degrees gives a better result, I think.

Attached Image


Edit 2: I'm totally guessing here, since I'm very unfamiliar with 67P as I said before, but from a cursory search I think the dark area is probably the shadow of this pillar feature, and that I didn't correct for the motion enough.


Attached Image


Guest_alex_k_*
post Jul 28 2015, 04:14 PM
Post #19

Thanks ZLD, nice processing. It was interesting to see what your method can do with this really difficult image (non-linear motion blur, etc.) and to compare it with previous attempts:

http://www.unmannedspaceflight.com/index.p...st&p=216419
http://www.unmannedspaceflight.com/index.p...mp;#entry216427
http://blogs.esa.int/rosetta/2014/12/18/up...#comment-283282

Maybe it will help you tune your algorithm. If you want, I'll provide some more "difficult" samples for testing.
ZLD
post Jul 28 2015, 04:32 PM
Post #20

Sure thing, Alex. I'd be very interested. This has been an evolving process for months now.


Guest_alex_k_*
post Jul 28 2015, 04:45 PM
Post #21

QUOTE (ZLD @ Jul 28 2015, 09:32 PM) *
Sure thing, Alex. I'd be very interested. This has been an evolving process for months now.


OK, I'll find appropriate samples. You can see one of my experiments in the neighbouring thread.

About the comet: Philae made this shot about 5 minutes after the first bounce. So the place in the image is somewhere towards the top right of this image, at the blue arrow and further to the right.
Guest_alex_k_*
post Jul 29 2015, 01:41 PM
Post #22

QUOTE (ZLD @ Jul 28 2015, 09:32 PM) *
Sure thing, Alex. I'd be very interested. This has been an evolving process for months now.


Keeping to the topic: this is a very amazing map (not HST):
Attached Image


Can your algorithm extract anything correct from it?
ZLD
post Jul 29 2015, 03:05 PM
Post #23

It looked computer generated from the outset, so I went with that assumption.

Attached Image

Attached Image


It appears to be a cube (possibly rounded), a sphere, maybe with a texture but probably just JPEG noise, and something else at the right that I can't discern.


Guest_alex_k_*
post Jul 29 2015, 03:22 PM
Post #24

QUOTE (ZLD @ Jul 29 2015, 08:05 PM) *
It looked computer generated from the outset, so I went with that assumption.


Hmm... Actually it was a map of Mars, reconstructed from a set of single-pixel measurements made in the 1960s. The article is here.
If a more proper image can be obtained, it should be closer to this:
Attached Image
ZLD
post Jul 29 2015, 03:23 PM
Post #25

Well, a wrong assumption will certainly wreck everything that follows. Oops.

Would you care to describe how you reached your test image result? I can't seem to reproduce it myself.


PDP8E
post Jul 29 2015, 04:45 PM
Post #26

When I saw that 'test image' I fed it to a little stochastic battalion of filters I maintain.
The only instruction for convergence was high-frequency edges exceeding 30%.
Here is what my script 'hallucinated':
Attached Image
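In case anyone wonders what a "high-frequency edges exceeding 30%" check might look like, here is a rough sketch of the idea only; it is not the actual script, and random_filter below is just a placeholder name:

CODE
import numpy as np
from scipy import ndimage

def edge_fraction(img, rel_thresh=0.1):
    """Fraction of pixels whose Sobel gradient magnitude exceeds a threshold
    relative to the strongest gradient in the image - a crude stand-in for
    'high-frequency edge' content."""
    img = img.astype(float)
    mag = np.hypot(ndimage.sobel(img, axis=0), ndimage.sobel(img, axis=1))
    return float(np.mean(mag > rel_thresh * mag.max()))

# Hypothetical stopping rule, in the spirit of "edges exceeding 30%":
# while edge_fraction(img) < 0.30:
#     img = random_filter(img)   # placeholder, not a real function
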



Guest_alex_k_*
post Jul 29 2015, 04:58 PM
Post #27

QUOTE (ZLD @ Jul 29 2015, 08:23 PM) *
Would you care to describe how you reached your test image result? I can't seem to reproduce it myself.


My best result was the following:
Attached Image


It is uncertain due to strong Fourier extrapolation, and the match with the "ground truth" image is unclear.
But there is some correspondence with a real map of Mars: the Tharsis volcanoes, etc.
Attached Image

(animated)

Though maybe the features are just exaggerated noise. It would be interesting to know whether the real details can be extracted.
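For context, the extrapolation step is where the uncertainty comes from. The sketch below shows only the safe, related operation of band-limited upsampling by zero-padding the spectrum; pushing detail beyond the measured band is a far more ill-posed step, and this is an illustration rather than my processing code:

CODE
import numpy as np

def fourier_upsample(img, factor=4):
    """Upsample a 2-D image by zero-padding its centred Fourier spectrum
    (sinc interpolation). No new frequencies are created; any detail claimed
    beyond the original band limit is extrapolation, i.e. an assumption."""
    h, w = img.shape
    spec = np.fft.fftshift(np.fft.fft2(img))
    H, W = h * factor, w * factor
    padded = np.zeros((H, W), dtype=complex)
    y0, x0 = (H - h) // 2, (W - w) // 2
    padded[y0:y0 + h, x0:x0 + w] = spec
    return np.real(np.fft.ifft2(np.fft.ifftshift(padded))) * factor ** 2
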
ZLD
post Jul 29 2015, 05:20 PM
Post #28

Haha PDP8E.

Also, I finally got around to looking through the paper. If I'm understanding correctly, the very orange image is based on a 6x1-pixel image that was then stretched out. That alone would never be recoverable into a true image. Very interesting paper, though.



Attached Image


This should be close to what was resampled. I'm not quite sure how they simulated the upper and lower lines; probably just an average falloff or something.

----------
Edit
----------
I decided to do a quick test to see what could be done with a 6x1 image. Still not much, but it seems to correspond to what's on the map. The color data is derived from the same 6x1-pixel image, just applied differently than in the paper.

Attached Image


I could see their method being pretty useful in observing exoplanets if the resolution can hit a few pixels across.
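For anyone who wants to try the same experiment, this is roughly what I mean by reducing a map to a 6x1 strip and stretching it back up (a quick sketch, not my actual processing; the interpolation choice is arbitrary):

CODE
import numpy as np

def shrink_to_strip(full_map, n=6):
    """Average a 2-D grayscale map down to an n x 1 strip: each output pixel
    is the mean of one vertical band of columns."""
    bands = np.array_split(np.arange(full_map.shape[1]), n)
    return np.array([full_map[:, b].mean() for b in bands])

def stretch_back(strip, shape):
    """Stretch the strip back to the original size with linear interpolation;
    no genuine detail is recovered, only a smooth ramp."""
    h, w = shape
    row = np.interp(np.linspace(0, len(strip) - 1, w),
                    np.arange(len(strip)), strip)
    return np.tile(row, (h, 1))
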



JRehling
post Jul 29 2015, 06:18 PM
Post #29

An old chestnut from the technology of image file compression: If two different source images are compressed and they make identical target files, you cannot recover from the target which one was the original source.

These one-row images present a stark situation: If you have three pixels which are, in sequence: BLACK - GRAY - WHITE, you cannot determine whether the real object (at, say, 1x100 pixel resolution) had a sharp cliff from black to white or a gradual transition. You cannot. The information is not there.
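A tiny numerical illustration of that point (made-up profiles, purely for demonstration): a hard step and a smooth ramp reduce to exactly the same three pixels, so nothing in those three values can tell you which profile you started from.

CODE
import numpy as np

x = np.arange(96)
cliff = np.where(x < 48, 0.0, 1.0)            # abrupt black-to-white step
ramp = np.clip((x - 31.5) / 32.0, 0.0, 1.0)   # gradual transition across the middle third

def downsample_to_three(profile):
    """Box-average a 96-sample profile into 3 pixels, as a coarse detector would."""
    return profile.reshape(3, 32).mean(axis=1)

print(downsample_to_three(cliff))  # [0.  0.5 1. ]  -> black, gray, white
print(downsample_to_three(ramp))   # [0.  0.5 1. ]  -> the same three pixels
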

But there are some possible (and related) saving graces that can give you additional information:

1) You may have a priori information about the likely transitions. If you knew, for example, that the image was of Mercury, you would have a lot of constraints on the norms for transitions. If you knew, moreover, that the image was of Mercury at a high phase angle, you would have still more information. But if you didn't know that, you'd be much more limited in your ability to guess between abrupt versus gradual transitions.

Just knowing that the image is of a body in space (and not, say, a Captcha of blurred text) is potentially useful information, but that only goes so far. Iapetus, Mars, and Mercury have profoundly different norms for how sharp/gradual transitions are. You can guess with one set of norms and luckily get some details right, but it was a guess. If you guess the world is visually like Mars but it turns out to be visually like Iapetus, your guess is simply wrong. And if you have no a priori information, the guess remains a guess.

2) The one-row case is not a common one, so you have information from adjacent rows which might inform how abrupt/gradual the transitions are. This doesn't provide new information in an absolute sense (the real object might have sharply defined square blocks as its true shading!), but one can infer norms across the surface and then use those locally.

3) When you have selective high-resolution imaging of a world but only low-resolution imaging in many other areas, you can use this information to set the parameters in (1). This seems applicable for, e.g., Europa and Pluto.
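One crude way to make (1) and (3) operational, purely as a sketch and not an established method: measure how abrupt transitions tend to be wherever high-resolution coverage exists, and use that statistic to steer how aggressively the low-resolution regions are sharpened.

CODE
import numpy as np

def transition_sharpness(hires_patch):
    """A rough 'norm' for how abrupt brightness transitions are in a
    high-resolution patch: the 95th-percentile column-to-column gradient,
    normalised by the patch's brightness range."""
    patch = hires_patch.astype(float)
    grad = np.abs(np.diff(patch, axis=1))
    span = patch.max() - patch.min() + 1e-9
    return float(np.percentile(grad, 95) / span)

# Iapetus-like terrain (sharp albedo boundaries) would score high, Mars-like
# terrain lower; that score could then act as a prior on how sharp the guessed
# transitions in low-resolution data are allowed to be.
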

But given, say, an exoplanet with no high-resolution data possible, the ability to guess at details finer than we can see in the raw image is going to be close to nil.
ZLD
post Jul 29 2015, 08:08 PM
Post #30

I absolutely agree that a measured 'image' that is a single pixel in height leaves the raw information as the best obtainable at the time. Without lots of other data, there's nothing else to work from.

However, it wouldn't be completely useless to make inferences from lots of other collected data and then define several scenarios based on multiple interpretations of that data. Isn't that the basis for forming future experiments most of the time?


