DSCOVR
Jan 6 2006, 08:55 PM | Post #101
Senior Member | Group: Members | Posts: 2454 | Joined: 8-July 05 | From: NGC 5907 | Member No.: 430
ADMIN NOTE: Please note that this topic was unavoidably political before the 'No Politics' rule. Please restrict future comments to the mission/spacecraft/news updates etc.
WHAT'S NEW, Robert L. Park, Friday, 6 Jan 06, Washington, DC: DEEP SPACE CLIMATE OBSERVATORY KILLED. http://bobpark.physics.umd.edu/index.html
--------------------
"After having some business dealings with men, I am occasionally chagrined, and feel as if I had done some wrong, and it is hard to forget the ugly circumstance. I see that such intercourse long continued would make one thoroughly prosaic, hard, and coarse. But the longest intercourse with Nature, though in her rudest moods, does not thus harden and make coarse. A hard, sensible man whom we liken to a rock is indeed much harder than a rock. From hard, coarse, insensible men with whom I have no sympathy, I go to commune with the rocks, whose hearts are comparatively soft." - Henry David Thoreau, November 15, 1853
Dec 9 2016, 10:49 PM | Post #102
Junior Member | Group: Members | Posts: 22 | Joined: 7-January 13 | Member No.: 6834
I'm starting to get a bit desperate about the DSCOVR images.

I first worked with the PNG images provided by their server, together with their metadata. After days of work I still couldn't understand what I was seeing: when projected onto a map with SPICE kernels, the data did not seem well registered, and there were slight shifts that changed with the local view angle. I contacted NASA, who kindly replied that the PNG images are not scientifically accurate in terms of localization, and that I should work with the L1B calibrated data instead. In short, the L1B data are calibrated and have all bands registered in a common reference frame. In other words, for each image you get the grey level for each band, plus the local latitude/longitude corresponding to each pixel, i.e. a map of coordinates.

But after hours of playing with those data, it appears the problem is even worse than before: the L1B data are even more wrong, and as-is it is impossible to project the images properly onto a map. As a short demonstration, I projected a "true" coastline map onto the images according to their lat/long maps. Here is the result:

After a bit of investigation, I finally understood where the problem lies in those raw calibrated data. Unfortunately, the coordinate maps associated with the images have been generated very roughly: the DSCOVR EPIC team considered all non-null pixels to be "the Earth", and the darkness of space to be "not Earth". As a result, the Earth's atmosphere is treated as being the ground, and ten pixels outside the Earth the maps still report latitudes and longitudes! You can see this in the following images, where I show a portion of Earth's limb, and the same portion with the size of the Earth incorrectly given by the metadata superimposed on it:

As a consequence, when you project your coastline map onto the images for debugging, you see the coast floating up into the upper atmosphere. I think we can all agree it is clearly off-target...
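To make the failure mode concrete, here is a minimal Python sketch of the kind of coastline-overlay check described above. Everything in it is hypothetical (synthetic orthographic grid, made-up function names, nothing taken from the actual EPIC products): it builds per-pixel lat/lon maps for a centred Earth disc and projects a coastline vertex onto the image by nearest-neighbour lookup in those maps. With maps like the real L1B ones, which carry spurious coordinates beyond the limb, the same lookup would snap coastline points up into the atmosphere.

```python
import numpy as np

def latlon_grid(n, radius_px):
    """Synthetic orthographic per-pixel lat/lon maps for a centred Earth disc.
    Pixels outside the disc are NaN here; per the post, the real L1B maps
    instead carry spurious lat/lon values well beyond the limb."""
    c = (n - 1) / 2.0
    y, x = np.mgrid[0:n, 0:n]
    u = (x - c) / radius_px                 # normalised disc coordinates
    v = (c - y) / radius_px
    inside = u**2 + v**2 <= 1.0
    with np.errstate(invalid="ignore", divide="ignore"):
        lat = np.degrees(np.arcsin(np.clip(v, -1, 1)))
        lon = np.degrees(np.arcsin(np.clip(u / np.cos(np.radians(lat)), -1, 1)))
    return np.where(inside, lat, np.nan), np.where(inside, lon, np.nan)

def project_point(lat_map, lon_map, lat0, lon0):
    """Pixel (row, col) whose mapped coordinates are closest to (lat0, lon0),
    i.e. where a coastline vertex at that lat/lon would be drawn."""
    d = np.hypot(lat_map - lat0, (lon_map - lon0) * np.cos(np.radians(lat0)))
    d = np.where(np.isnan(d), np.inf, d)
    return np.unravel_index(np.argmin(d), d.shape)

lat_map, lon_map = latlon_grid(201, 90.0)
print(project_point(lat_map, lon_map, 0.0, 0.0))    # sub-satellite point, image centre
print(project_point(lat_map, lon_map, 30.0, 0.0))   # 30 deg N on the central meridian
```

If the grid's disc radius is inflated the way the metadata's is, every projected vertex lands proportionally too far from the disc centre, which is exactly the "coast floating in the upper atmosphere" effect.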
I've already tried different techniques to correct for this issue, such as detecting the "true" size of the Earth in the images and shrinking all the lat/long maps accordingly, but this is very hard in practice, as you need to distinguish between lit portions of the limb, which are clearly visible, and barely visible portions of the limb that are already in night. Remember that DSCOVR orbits around L1, not at L1, so shadows are visible. I've implemented alternative techniques as well, such as cloud removal and recognition of different patterns on the Earth to morph the metadata they provide to a more correct geometry, but it is very hard to make this 100% automated with high confidence without spending weeks and weeks of effort. As a last resort, I tried working with the L1A (uncalibrated) data, but it fails for the same reason.

My conclusion so far: I'm done with these data. They lack metadata reliable enough to work with them at pixel-level resolution in an automated process, e.g. to make animated maps. It's okay to make movies from the original images themselves (no transformation needed), and it's okay to transform one of those images, but it's pointless to work on them for a smooth transformed animation, at least without days and days of work. The optical flows and other filtering techniques I've implemented to guess automatically what the correct lat/long grid should be for each image are quite complex, but still insufficient to do the job. That's a shame; the outcome would have been awesome...
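The shrink-the-maps idea can be sketched roughly as below. This is a reconstruction under stated assumptions, not the poster's actual pipeline: it assumes a fully lit, roughly circular disc (estimating the radius from the lit area), uses made-up names, and deliberately ignores the hard case identified above, the terminator side where the limb fades into night.

```python
import numpy as np

def estimated_radius_px(img, threshold=0.0):
    """Estimate the Earth-disc radius in pixels from the lit area
    (radius = sqrt(area / pi)). Assumes a fully lit, roughly circular
    disc; near the terminator this estimate degrades, which is exactly
    the difficulty the post describes."""
    area = np.count_nonzero(img > threshold)
    return np.sqrt(area / np.pi)

def shrink_latlon_maps(lat_map, lon_map, center, scale):
    """Rescale per-pixel lat/lon maps about `center` by `scale` < 1, so
    coordinates the metadata placed at its too-large disc radius land on
    the smaller, measured disc. Nearest-neighbour resampling for brevity;
    a real pipeline would interpolate."""
    rows, cols = lat_map.shape
    cy, cx = center
    y, x = np.mgrid[0:rows, 0:cols]
    src_y = np.clip(np.round(cy + (y - cy) / scale).astype(int), 0, rows - 1)
    src_x = np.clip(np.round(cx + (x - cx) / scale).astype(int), 0, cols - 1)
    return lat_map[src_y, src_x], lon_map[src_y, src_x]

# Synthetic check: a lit disc of radius 40 px in a 101x101 frame,
# with a (made-up) metadata radius of 45 px.
yy, xx = np.mgrid[0:101, 0:101]
img = ((yy - 50)**2 + (xx - 50)**2 <= 40**2).astype(float)
scale = estimated_radius_px(img) / 45.0
```

The scale factor (measured radius over metadata radius) would then be fed to `shrink_latlon_maps`, pulling the coastline down off the upper atmosphere and onto the limb.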