DSCOVR
JRehling
post Jan 24 2017, 06:11 PM
Post #136


Senior Member
****

Group: Members
Posts: 1925
Joined: 20-April 05
Member No.: 321



If I were to try to build a robust solution to this, I think I'd try the following. In large part, I think it follows more or less the algorithm that we humans use when inspecting an image of the Earth.

Preparatory indexing:
1) Make an index of the shapes of coastlines at the resolution of ~ 5km/pixel. In particular, index segments where coastlines change orientation such as the Strait of Hormuz, the east coast of Somalia, the Baja peninsula, Gibraltar, southern Italy, Tierra del Fuego, Newfoundland, Michigan, etc.

Processing a single image:
2) At the time the image was taken, make a list of the coastline segments that are located within ~60° of the subsolar point. Perform a transformation to adjust them to how they should appear from the direction of the Sun, which will approximate the geometry of DSCOVR.

3) Run edge detection on the image, excluding any edges that are bounded by white regions, which are probably clouds.

4) Match the detected segments against the projected indexed segments from (2).

5) If three or more segments are matched (possibly two that are far apart), you now have a good registration between the image and the Earth.

Probably the tricky step is (4), but there's research on this.
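Step (4) could be prototyped with a simple shape descriptor for the coastline segments. The sketch below is only one possible approach (a fixed-length "turning angle" signature compared by mean absolute difference), not necessarily what the research literature would recommend; the polylines and thresholds are invented for illustration:

```python
import math

def turning_signature(points, n=16):
    """Resample a polyline's heading changes into a fixed-length
    'turning angle' signature: a crude descriptor that is invariant
    to translation and uniform scaling of a coastline segment."""
    headings = [math.atan2(y2 - y1, x2 - x1)
                for (x1, y1), (x2, y2) in zip(points, points[1:])]
    turns = []
    for a, b in zip(headings, headings[1:]):
        d = b - a                      # wrap to [-pi, pi]
        while d > math.pi:  d -= 2 * math.pi
        while d < -math.pi: d += 2 * math.pi
        turns.append(d)
    # resample to a fixed length by nearest-index lookup
    return [turns[min(int(i * len(turns) / n), len(turns) - 1)]
            for i in range(n)]

def match_score(sig_a, sig_b):
    """Lower is better: mean absolute difference of two signatures."""
    return sum(abs(a - b) for a, b in zip(sig_a, sig_b)) / len(sig_a)

# toy example: an L-shaped 'coastline' matches itself after translation
# and uniform scaling, but not a straight stretch of coast
bend     = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
bend2    = [(10 + 2 * x, 5 + 2 * y) for x, y in bend]  # moved and scaled
straight = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0)]

assert match_score(turning_signature(bend), turning_signature(bend2)) < 0.01
assert match_score(turning_signature(bend), turning_signature(straight)) > 0.3
```

A real matcher would also need rotation handling and robustness to partial occlusion by clouds, which is where the published segment-matching research comes in.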
scalbers
post Jan 24 2017, 07:53 PM
Post #137


Senior Member
****

Group: Members
Posts: 1228
Joined: 5-March 05
From: Boulder, CO
Member No.: 184



Interesting to consider this procedure. I wonder how this solution would work at the limb. I've been able to match the coastlines and other features fairly well, and clouds were also useful as a check in my matching. However, the extreme limb is where things appeared to drift off, possibly due to the setting of the reference limb in the atmosphere, as mentioned earlier. It seems this might work OK with the raw data (however that would be available) and less well with the displayed web images or L1B image data. It's also helpful to account for the actual position of DSCOVR, which can be around 10 degrees away from the direction of the Sun.
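That ~10-degree offset comes from DSCOVR's Lissajous orbit around the Sun-Earth L1 point, which can carry it several hundred thousand km off the Sun-Earth line. A back-of-envelope check, using hypothetical but representative Earth-centered position vectors (not actual ephemeris data):

```python
import math

def phase_angle_deg(sun_vec, sc_vec):
    """Angle at the Earth between the directions to the Sun and to the
    spacecraft, in degrees. Vectors are Earth-centered (x, y, z) in km."""
    dot = sum(a * b for a, b in zip(sun_vec, sc_vec))
    na = math.sqrt(sum(a * a for a in sun_vec))
    nb = math.sqrt(sum(b * b for b in sc_vec))
    return math.degrees(math.acos(dot / (na * nb)))

# illustrative numbers only: L1 is ~1.5 million km sunward, and the
# spacecraft can sit a few hundred thousand km off the Sun-Earth line
sun = (1.496e8, 0.0, 0.0)        # km, Earth -> Sun
dscovr = (1.48e6, 2.6e5, 0.0)    # km, Earth -> spacecraft (hypothetical)
print(round(phase_angle_deg(sun, dscovr), 1))  # ~10 degrees off-sun
```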

http://stevealbers.net/albers/allsky/outerspace.html


JRehling
post Jan 25 2017, 05:30 PM
Post #138


Senior Member
****

Group: Members
Posts: 1925
Joined: 20-April 05
Member No.: 321



QUOTE (scalbers @ Jan 24 2017, 12:53 PM) *
I wonder how this solution would work at the limb.


I wonder how possible it is to capture the limb. We know that we can see stars and the Sun when they would, on an airless globe, be below the horizon. Conversely, that means vantage points in space have a view of points on the surface beyond the literal horizon, so those points must be projected to displaced locations in the image. In principle, something very messy is happening at the limb, and a small displacement near the limb corresponds to a large difference in position on the map. The devil is in the details as to the magnitude of that effect, and whether it affects such a tiny boundary around the disk as to be negligible.
scalbers
post Jan 25 2017, 06:20 PM
Post #139


Senior Member
****

Group: Members
Posts: 1228
Joined: 5-March 05
From: Boulder, CO
Member No.: 184



Using my simulated imagery (link two posts above) as an example, it seems possible to determine how far the solid-surface limb (often obscured) sits below the top of the visible atmosphere. With a non-zero phase angle, the shading effects can also be considered. I think this is the more significant aspect, with the visible atmosphere extending perhaps 30-60 km above the limb.

Refraction is also of interest, as you note. The actual lateral displacement of the limb (and locations nearby) from refraction is only about 1 km, smaller than the camera resolution. Even this small amount can still squeeze another 100 km or so of land into theoretical visibility near the limb.
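A rough sanity check on that figure, assuming the standard astronomical refraction of ~34 arcminutes for a horizon-grazing ray (the real value varies with temperature and pressure): a distant observer can still see a surface point whose local horizon is tilted up to that angle past the geometric limb, and the horizon tilts by one Earth-centered degree per degree of arc, so the extra visible ground arc is roughly the Earth's radius times the refraction angle - the same order as the ~100 km quoted above:

```python
import math

R_EARTH_KM = 6371.0
# standard horizontal refraction, ~34 arcminutes (weather-dependent)
REFRACTION_RAD = math.radians(34.0 / 60.0)

# extra surface arc visible beyond the geometric limb ~ R * delta
extra_arc_km = R_EARTH_KM * REFRACTION_RAD
print(round(extra_arc_km), "km")  # on the order of ~60 km
```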


Michael Boccara
post Mar 7 2017, 03:13 PM
Post #140


Newbie
*

Group: Members
Posts: 6
Joined: 29-August 16
From: Israel
Member No.: 8032



Hi

Sharing some nice results I had interpolating EPIC images and projecting them onto a planar map (equirectangular).

https://vimeo.com/207296528/b9b8eee67c

The video is at its optimal quality in 4K resolution and 120 Hz.

The images are generated in real time as I stream the EPIC images (and their metadata) from the NASA website.
Interpolation is done by simple blending of perspective projections onto a 3D ellipsoid model of the Earth, executed on the GPU via a custom fragment shader. It's good enough to look almost like optical flow.
The conversion from the 3D ellipsoid model to an equirectangular map is also done in real time, via a vertex shader on the GPU.
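The shaders themselves aren't posted, but the heart of the sphere-to-equirectangular remapping is just a linear map from longitude/latitude to image coordinates. A minimal CPU sketch of that mapping (the map dimensions are arbitrary; this is not the actual Blueturn vertex shader):

```python
def equirectangular(lat_deg, lon_deg, width, height):
    """Map geographic coordinates to equirectangular pixel coordinates:
    longitude maps linearly to x, latitude linearly to y (north = top)."""
    x = (lon_deg + 180.0) / 360.0 * width
    y = (90.0 - lat_deg) / 180.0 * height
    return x, y

# a hypothetical 2:1 map
print(equirectangular(0.0, 0.0, 4096, 2048))      # image center
print(equirectangular(90.0, -180.0, 4096, 2048))  # top-left corner
```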

Here is another video from my Blueturn app (freely available on all platforms) that better shows the transition between spherical and planar.

https://vimeo.com/207296473/28f01f0807

Enjoy!
scalbers
post Mar 9 2017, 12:25 AM
Post #141


Senior Member
****

Group: Members
Posts: 1228
Joined: 5-March 05
From: Boulder, CO
Member No.: 184



It looks really nice to see the smooth changes in the clouds - interesting how much they evolve over the course of a day. It's a fun challenge to try to perceive this in a vertical perspective view where the Earth is rotating.

As a quick note, I often like to suggest a bit lower contrast between the blue ocean/sky and the white clouds, so the displayed brightness is more linearly proportional.


Michael Boccara
post Mar 9 2017, 06:44 AM
Post #142


Newbie
*

Group: Members
Posts: 6
Joined: 29-August 16
From: Israel
Member No.: 8032



Thanks Steve.
This is indeed a nice global view of the cloud motion, without any collage artifacts.
And taking note of your remark about the blue/white contrast: aren't you suggesting increasing it instead?

My next plan for the default sphere view is indeed to add some 3D navigation to pivot around the Earth beyond the L1 viewpoint - a simple thing to do with Unity. A geosynchronous view from above the pole (one with constant daylight) would indeed be an interesting vantage point to see the dynamics of the Coriolis effect on the clouds. Maybe soon on the SOS? ;)
scalbers
post Mar 9 2017, 07:51 PM
Post #143


Senior Member
****

Group: Members
Posts: 1228
Joined: 5-March 05
From: Boulder, CO
Member No.: 184



For the contrast, I was thinking of reducing it by increasing the brightness and using more of a sky-blue color in the clear-sky areas over the ocean. The bright clouds would then stay about the same. Here's an updated version of my earlier blog post on the Planetary Society site with a detailed rationale. Another example of this is ugordan's very nice LROC colorized image, though even that one appears to have some contrast enhancement.
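That kind of contrast reduction - lifting the dark ocean toward sky blue while leaving bright clouds nearly untouched - could be sketched per pixel like this. The lift color and the linear falloff with brightness are invented for illustration, not Steve's actual processing:

```python
def lift_ocean(rgb, lift=(40, 60, 90)):
    """Reduce ocean/cloud contrast: add a blue-weighted lift to dark
    (ocean) pixels that fades out linearly with pixel brightness, so
    bright (cloud) pixels stay almost unchanged."""
    luma = sum(rgb) / 3.0
    w = max(0.0, 1.0 - luma / 255.0)   # 1 for black, 0 for white
    return tuple(min(255, round(c + w * l)) for c, l in zip(rgb, lift))

print(lift_ocean((10, 30, 80)))     # dark ocean pixel gets a blue lift
print(lift_ocean((250, 250, 250)))  # bright cloud pixel nearly unchanged
```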

Attached Image


One thing I'm currently doing is improving the sun glint so it looks more accurate for a crescent Earth (maybe for a future version of DSCOVR) as well as when fully lit. Here is an animation of the various phases of the Earth.

Indeed, SOS will be fun for looking at this. It's easy to change the viewpoint there to see a polar view; I'll try this out with both geosynchronous and sun-synchronous views. Note that SOS Explorer can also do the 3D navigation. There is a tradeoff to the versatility of various viewpoints, in that we wouldn't then be properly seeing the hazier-looking limb.


Stratespace
post Mar 13 2017, 07:43 PM
Post #144


Junior Member
**

Group: Members
Posts: 21
Joined: 7-January 13
Member No.: 6834



QUOTE (Michael Boccara @ Mar 7 2017, 04:13 PM) *
Sharing some nice results I had interpolating EPIC images and projecting them onto a planar map (equirectangular).

Great work, congrats! I haven't worked on the data again since last time, but apparently you achieve much better navigation/projection performance than I did. How did you improve it so dramatically? I want to get back to working with this metadata!
Michael Boccara
post Mar 22 2017, 05:41 AM
Post #145


Newbie
*

Group: Members
Posts: 6
Joined: 29-August 16
From: Israel
Member No.: 8032



QUOTE (Stratespace @ Mar 13 2017, 09:43 PM) *
Great work, congrats! I haven't worked on the data again since last time, but apparently you achieve much better navigation/projection performance than I did. How did you improve it so dramatically? I want to get back to working with this metadata!


Thanks Stratespace.
I got dramatic improvements after calculating the enclosing ellipse of the Earth instead of the enclosing circle. (Plus I had a bug in the optimal enclosing circle.)
But the funniest part is that after I did that, the resulting ellipse always had its normalized center rounded to (0.500, 0.500) (yes, zeros to the 3rd decimal), with axis sizes constantly at (0.777, 0.776), accounting for the ellipsoid's polar squeeze. It means the images were already aligned by NASA's EPIC team - in other words, the image is centered on the Earth's center to 1-pixel precision. In other words, I worked hard for nothing :) That was not the case a few months ago, so it seems a recent refresh of the data brought this calibration improvement.
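For anyone wanting to reproduce this kind of check, one simple way to estimate the disk's center and semi-axes is from the image moments of a thresholded mask (for a uniformly filled ellipse, Var(x) = a²/4 and Var(y) = b²/4). A pure-Python sketch on a synthetic disk - not the actual Blueturn code, which runs on the GPU, and with an invented 101x101 mask:

```python
import math

def disk_ellipse(mask):
    """Estimate center and semi-axes of a filled, axis-aligned elliptical
    blob from a binary mask using first and second image moments."""
    pts = [(x, y) for y, row in enumerate(mask)
                  for x, v in enumerate(row) if v]
    n = len(pts)
    cx = sum(x for x, _ in pts) / n
    cy = sum(y for _, y in pts) / n
    vx = sum((x - cx) ** 2 for x, _ in pts) / n
    vy = sum((y - cy) ** 2 for _, y in pts) / n
    return cx, cy, 2 * math.sqrt(vx), 2 * math.sqrt(vy)

# synthetic 'Earth disk': filled ellipse, a=40, b=30, centered at (50, 50)
mask = [[(x - 50) ** 2 / 40 ** 2 + (y - 50) ** 2 / 30 ** 2 <= 1.0
         for x in range(101)] for y in range(101)]
cx, cy, a, b = disk_ellipse(mask)
print(round(cx, 1), round(cy, 1), round(a, 1), round(b, 1))
```

On real EPIC frames you would first threshold out the dark background and possibly exclude the terminator before taking moments.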

Bottom line: you can proceed with your work and use the EPIC metadata as-is.
Note that I'm using the L1B data from the EPIC website (https://epic.gsfc.nasa.gov/), not from the ASDC archive (https://eosweb.larc.nasa.gov/project/dscovr/dscovr_table).
I don't think it makes much of a difference. However, please note this explanation I once received from a member of the EPIC team:
The Level 1B (L1B) data is the science data product. It contains the raw calibrated data that the scientists use. It also includes the complete geolocation information (per-pixel lats/lons/angles, etc.) and the astronomical/geolocation values required to do the calculations. The complete astronomical/geolocation metadata has been added to the images.


Michael
