Himawari

scalbers - Aug 24 2015, 10:02 PM - Post #16

Very nice and very impressive, Dan. I may have to give this a try at some point. For tracking, I can suggest that the pixels should be mappable to lat/lon by assuming a vertical perspective map projection. The geosynchronous satellite is at a known distance from the Earth, and we could probably figure out the sub-satellite point. I might employ such a projection to create cylindrical map projection images from the satellite images.
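
For illustration, here is a minimal sketch of that pixel-to-lat/lon idea using pyproj's vertical perspective ("nsper") projection. The sub-satellite longitude, image size, and disk radius below are assumed values for illustration, not calibrated against the actual imagery:

CODE
# Minimal sketch: map a full-disk pixel to lat/lon via a vertical
# perspective projection centred on the (assumed) sub-satellite point.
from pyproj import Proj

SAT_HEIGHT_M = 35_786_000     # approximate geostationary altitude
SUB_LON_DEG = 140.7           # assumed Himawari-8 sub-satellite longitude

persp = Proj(proj="nsper", h=SAT_HEIGHT_M, lon_0=SUB_LON_DEG, lat_0=0.0)

def pixel_to_latlon(col, row, img_size=5500, disk_radius_m=6.3e6):
    """Map an image pixel to (lat, lon).

    Assumes the full Earth disk is centred in a square image of
    img_size pixels; disk_radius_m is the projected disk radius in
    projection metres (an illustrative value that would need to be
    calibrated against known landmarks in the real imagery).
    """
    half = img_size / 2.0
    x = (col - half) / half * disk_radius_m
    y = (half - row) / half * disk_radius_m   # image rows increase downward
    lon, lat = persp(x, y, inverse=True)
    return lat, lon

# The centre pixel should come back near the sub-satellite point (0, 140.7).
print(pixel_to_latlon(2750, 2750))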


hendric - Aug 25 2015, 04:14 PM - Post #17

Once the HD version fully loaded, it was amazing. I agree the Butterflow interpolator performed much better. I assume you plan to use this on Jupiter/Saturn pics? Can't wait for Juno pics to start coming down!


Dan Delany - Sep 16 2015, 09:39 PM - Post #18

Thanks, Steve and Richard, for the kind words and good advice. I've been continuing to work on this project as I have time. Progress has been somewhat slow, as I ran into a bug in the Butterflow library when I tried to correct for the missing frames - however, I reached out to the creator of the library and he has been kind enough to help me out and fix the issues.

Last night was the first time I got the new code running with missing-image correction, and I let it run all night to render five more videos, with one day of data each. Here are the results - again, make sure you have the Youtube quality setting set manually to "1080p60" for best viewing: https://www.youtube.com/watch?v=PXD9MvLSXCk...ICG0x6Vu3mV9Uma

These are even slower, with 1 real frame and 59 interpolated frames per second of video (the previous video was twice as fast). I think these may even be a little too slow for my taste; I will probably render to a playback speed about halfway in between the two in the future. Each 90-second, 60-FPS video takes about an hour and a half to render on my MacBook Pro, which is another limiting factor. But I hope to get it running soon on my desktop machine with a fast CPU+GPU, which should improve the render time by quite a bit.
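
As a rough illustration of that 1-real-to-59-interpolated frame ratio (using ffmpeg's motion-compensated "minterpolate" filter rather than the Butterflow pipeline described above), a 1 fps image sequence can be turned into 60 fps video like this; the file names are hypothetical:

CODE
# Sketch: synthesize 59 in-between frames per real frame with ffmpeg's
# motion-compensated interpolation (an alternative to Butterflow).
import subprocess

subprocess.run([
    "ffmpeg",
    "-framerate", "1",                          # source images shown at 1 fps
    "-i", "himawari_%04d.png",                  # hypothetical input sequence
    "-vf", "minterpolate=fps=60:mi_mode=mci",   # motion-compensated 60 fps
    "-c:v", "libx264", "-pix_fmt", "yuv420p",
    "interpolated_60fps.mp4",
], check=True)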

It's still in a bit of a messy state, but my script is on GitHub in case it's helpful for anyone else.

@Steve, thanks very much for the notes re: projection; I knew the basic concepts but didn't know what it was called. I have been doing some reading about it, and may try to estimate some of the remaining unknown parameters soon to see how closely I can map lat/long to pixel location in order to do typhoon tracking. However, for now I am just focused on setting up my existing script to be as repeatable and hands-off as possible. I would really like to be able to just run it once every day, and have the script take care of not only rendering the videos but also splicing them together and uploading them to Youtube. This way I could have a nearly-automated Youtube account that posts new "Earth from Space" videos daily.
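
As a sketch of the splicing step in such a pipeline (the render and upload steps are omitted), the daily videos could be stitched together without re-encoding using ffmpeg's concat demuxer; the directory layout and file names are hypothetical:

CODE
# Sketch: concatenate daily renders into one video with the concat demuxer.
import pathlib
import subprocess

dailies = sorted(pathlib.Path("renders").glob("daily_*.mp4"))   # hypothetical layout

# The concat demuxer reads its inputs from a small text file.
list_file = pathlib.Path("concat_list.txt")
list_file.write_text("".join(f"file '{p.resolve()}'\n" for p in dailies))

subprocess.run([
    "ffmpeg", "-f", "concat", "-safe", "0",
    "-i", str(list_file),
    "-c", "copy",                 # stream copy: no re-encode, just stitch
    "spliced_week.mp4",
], check=True)

# Uploading to Youtube would follow here, e.g. via the YouTube Data API (not shown).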

@Richard, I'm not quite sure what else I'll apply this to yet. I've tried a couple of other quick-and-dirty experiments, with some decent results and some not-so-decent. The images really need to be close enough together (temporally) for the algorithm to detect feature movement from one image to the next - if the features move too far, you end up with some really gnarly artifacts that look much worse than the original, low-framerate video. But some Jupiter photos would be a great application. I'm also looking forward to applying the technique to the DSCOVR EPIC images once they start getting released regularly later this year - hopefully the planned cadence of 1 image per 30 minutes will be enough to get good results.

ZLD - Sep 16 2015, 09:45 PM - Post #19

Dan, this is just fantastic. I've wanted to watch something similar for decades! I do agree it may be a little on the slow side for viewing, and the processing still shows a few jumps here and there.

I wonder if this process would work with HiRISE dust storm captures?


scalbers - Sep 16 2015, 10:20 PM - Post #20

Just took a quick look so far - very nice again with the missing image correction and such. I actually like the slower pace, as there is so much detail that one can gaze at in a leisurely fashion.


hendric - Sep 17 2015, 05:36 PM - Post #21

How big are your files, Dan? Is it possible to host them somewhere? Youtube keeps throttling the videos, and I lose detail when that happens. Very awesome movies.


Dan Delany - Sep 17 2015, 11:49 PM - Post #22

Thanks a lot for the encouragement, all. Most people I mention this to give me a funny look and say something like "wait, why are you doing this?". "Because it's cool!" should be a good enough answer!

The lossless files I make are BIG, around 1 GB or more per minute of video. But they seem to compress well, down to ~50-80 MB per minute without any artifacts that I can see. I could probably bring that down a bit further if I tweak some ffmpeg settings. I will probably only keep the compressed copies, since my hard drive will fill up soon otherwise :) I'll work on finding a way to host these somewhere accessible. I would put them on Amazon S3 or something, but I don't want to be on the hook for a big bandwidth bill if the links get posted on a big website or someone hammers them with a script.
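
For reference, here is the kind of ffmpeg re-encode that brings a lossless master down to tens of MB per minute; the CRF value and file names are illustrative, not the settings actually used for the videos above:

CODE
# Sketch: compress a lossless master with x264 at a near-lossless CRF.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "lossless_master.mov",   # hypothetical lossless render
    "-c:v", "libx264",
    "-preset", "slow",             # better compression at the cost of encode time
    "-crf", "18",                  # visually near-lossless for most content
    "-pix_fmt", "yuv420p",
    "compressed.mp4",
], check=True)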

For now I have been putting files on Mega, which gives me 50 GB to play with. It seems to work OK, but it's not ideal, and I'm not sure if I will be able to upload there automatically using a script. Here's one I uploaded - Japan on 9/11/2015 (80 MB), one of my favorites (click "download through your browser"). More to come.

The most noticeable remaining artifact is that the color channels on some RGB frames are slightly misaligned, and the resulting color shift looks odd in the video. I'm not sure how to handle these frames. Detecting and realigning them automatically with a script seems difficult, or maybe impossible. I could just manually tag the bad frames, drop them, and interpolate around them, but somehow it feels wrong to throw away the data, especially since some frames are only misaligned by a bit.
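
One possible (untried here) approach to detecting and roughly correcting the misalignment is to estimate the red and blue channel offsets against green with phase correlation and shift them back; the 0.5-pixel threshold and file names below are arbitrary choices for the sketch:

CODE
# Sketch: per-channel realignment via phase correlation.
import numpy as np
from imageio import imread, imwrite                 # any image I/O would do
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

img = imread("frame.png").astype(float)             # hypothetical RGB frame
green = img[:, :, 1]

aligned = img.copy()
for ch in (0, 2):                                   # red and blue vs. green
    offset, error, _ = phase_cross_correlation(green, img[:, :, ch])
    if np.hypot(offset[0], offset[1]) > 0.5:        # treat >0.5 px as misaligned
        aligned[:, :, ch] = nd_shift(img[:, :, ch], offset, order=1)

imwrite("frame_aligned.png", aligned.clip(0, 255).astype(np.uint8))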

hendric - Sep 18 2015, 03:55 PM - Post #23

Thanks Dan, that's truly enchanting.


tanjent - Sep 28 2015, 05:31 PM - Post #24

I would love to see a similar animated view of Typhoon Dujuan hitting Taiwan and smashing itself to pieces against the central mountain range. Unfortunately, most of the action takes place (is taking place) at night. We will find out tomorrow what else got smashed to pieces.

http://www.cwb.gov.tw/V7e/observe/satellite/Sat_EA.htm

Dan Delany - Sep 29 2015, 08:44 PM - Post #25

Ask, and you shall receive :) I just uploaded some videos to this playlist for a cropped region that includes eastern China and Taiwan, and there are some pretty incredible views of Typhoon Dujuan taking aim at Taiwan over the past few days. I'll upload another one tonight with the most recent images. The usual disclaimer applies - make sure you set the quality to 1080p60. I've noticed that this quality setting seems to "stick" better in Chrome than in other browsers, FWIW.

I've gotten my Youtube upload script working fairly well, and now I'm in the process of rendering/uploading lots of daily videos to that one and these 3 other regional playlists: Japan/Korea/Beijing/Shanghai, Thailand/Philippines (+ others), and Indonesia/Malaysia/Singapore. They each have almost a month of data now. I have a few more regions to work on - namely Australia: since it doesn't fit natively in my 1920x1080 crop window, I'll have to render a larger view and then downscale before creating the video (either that, or split Australia across two regions). I'm also planning a couple of crops of different regions of the Pacific Ocean, since that's where a lot of the big storms initially form.
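
A sketch of that render-larger-then-downscale idea with ffmpeg: crop a wider window out of the full-disk frames and scale it to 1080p. The crop geometry here is a hypothetical window, not the actual region bounds:

CODE
# Sketch: wide crop of the full disk, then downscale to 1920x1080.
import subprocess

subprocess.run([
    "ffmpeg",
    "-framerate", "1",
    "-i", "fulldisk_%04d.png",                            # hypothetical full-disk frames
    "-vf", "crop=3200:1800:1200:2600,scale=1920:1080",    # crop=w:h:x:y, then downscale
    "-c:v", "libx264", "-pix_fmt", "yuv420p",
    "australia_daily.mp4",
], check=True)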

I'm still playing with the format of the Youtube channel, so let me know if you have feedback. Primarily I'm wondering whether the regional playlists should be in chronological order, so you can see the whole time span if you play from the beginning, or in reverse order so the newest videos are always at the top. I think I'm leaning towards reverse order; the oldest images in the RAMMB dataset have some weird processing applied, and I don't really want those at the top of the playlist. I suppose I could have two playlists per region, one for each order, but that may be confusing. I'm also considering rolling up the dailies into weekly videos for these playlists, so the videos are longer and you spend less time loading. However, I really like the immediacy of being able to see "yesterday from space" every day so I would probably upload them in addition to the dailies. This also confuses the playlist order issue a bit because, if the playlist uses weekly videos in reverse order, you'd see a week chronologically, followed by the previous week chronologically, etc...

Anyway, hope you enjoy! I haven't made progress yet on hosting the original video files somewhere, but I haven't forgotten about it, will look into it more soon.

tanjent - Sep 30 2015, 03:29 AM - Post #26

That's terrific, Dan. Much more fun to watch from orbit than from ground level.
I can see that as you experiment with different processing options, you would want to lead with the most recent results that incorporate the benefit of all your experience.
Maybe I can put together some kind of playlist to view them in natural order.

By the way, do you have any insights about what the Taiwan CWB (in my link) is doing in order to provide their continuous night-and-day coverage with seemingly consistent "illumination"? Are they presenting some kind of false-colored view as seen from non-visible wavelengths?

I have noticed that low cloud and rain don't always show up in their hourly posted views - day or night. Sometimes it can be raining on the ground while the satellite view appears perfectly clear. That may partially explain why the eye seems to be "wide open" in the posted photos, while in the animations there are many periods when it is clouded over. Did Dujuan have ...cataracts?

Dan Delany - Oct 4 2015, 11:03 PM - Post #27

@tanjent Yes, in addition to the 3 visible bands, Himawari has 13 infrared bands that work at night. I assume that their process is to use one of those bands with an alpha channel to get transparency, then overlay the transparent image on an existing, cloudless RGB image with even daytime illumination. That would also explain why they don't match up exactly with visible observations, since the IR bands are sensitive to different atmospheric phenomena than the visible bands are.
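
A toy version of that guessed-at compositing step: treat the IR brightness as both a white cloud layer and its alpha channel, and blend it over a static, cloud-free background. The grayscale-as-alpha mapping and file names are assumptions, not CWB's actual method:

CODE
# Sketch: blend an IR-derived cloud layer over a static cloud-free background.
import numpy as np
from imageio import imread, imwrite

background = imread("static_cloudfree_rgb.png").astype(float)   # H x W x 3
ir = imread("ir_band_grayscale.png").astype(float) / 255.0      # H x W, 0..1

alpha = ir[..., None]                # brighter (colder) cloud tops -> more opaque
composite = alpha * 255.0 + (1.0 - alpha) * background          # white clouds over background

imwrite("ir_rgb_composite.png", composite.clip(0, 255).astype(np.uint8))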

This week I have been trading e-mails with Steve Miller and Dan Lindsey at NOAA/CSU/CIRA, who are responsible for the great true-color images on the CIRA/RAMMB page that I'm using to create my videos. They have seen the Youtube channel, and Steve was kind of flabbergasted at first that someone else had managed to replicate their composite process, until I showed him the image credit (click 'see more' in the Youtube description) and explained that I was using their output :D But they both seem excited about the motion interpolation technique, its capabilities, and its applicability to other datasets. So I am in the process of writing some better documentation for my project (which can be found here), to help them get Butterflow + my scripts running on their machines.

They are continuing to work on their image process, and just pushed an update which includes better color correction near the terminator, so the images are continually improving! And they're working on a new product similar to the VIIRS day/night band that includes night-time coverage, which should be incredible.

I also tried interpolating some hourly NOAA FIM cloud forecast images from Science on a Sphere that Steve Albers (scalbers) sent me. These work pretty well too; you can see a downscaled version and a cropped version here:

https://www.youtube.com/watch?v=KPpq_XmwKkM
https://www.youtube.com/watch?v=Q9zHM8IFB_k

Finally, I've been working my way through the original paper on the optical flow algorithm underlying the interpolator, and it's given me lots of ideas (most of them likely well outside my coding abilities) about how the technique could be improved for the specific use case of satellite imagery. I'll write some of them up when I have time, in the "discussion" section of the documentation mentioned above. In particular, the bit about incorporating a priori knowledge seems useful, as we often have lots of a priori knowledge about the expected flow from one image to the next (e.g. most movement is constrained to the surface of the sphere, we know the timing of the terminator, etc.).

scalbers - Oct 16 2015, 07:03 PM - Post #28

QUOTE (Dan Delany @ Sep 29 2015, 08:44 PM)
Anyway, hope you enjoy! I haven't made progress yet on hosting the original video files somewhere, but I haven't forgotten about it, will look into it more soon.

I enjoy these so much that I miss them when they are missing or garbled, as they have been for the past couple of days. There's a typhoon setting up east of the Philippines.


Dan Delany - Nov 13 2015, 03:19 AM - Post #29

Thanks for saying so :) I went away on vacation, and when I got back, I realized my scripts had gone haywire while I was gone. Due to a combination of a lack of spare time, code bugs, and what I eventually diagnosed as bad Linux support for my hard drive's file system format, I haven't quite gotten the whole pipeline fully running again yet. But I just got a new drive, which I'm reformatting, and I plan to try again tonight.

More excitingly, I've been in contact with Steve Miller at NOAA/CIRA, who just returned from Tokyo, where he presented their awesome new Himawari-8 Geocolor product (as well as one of my interpolated videos :D). Geocolor adds night coverage via a combination of infrared cloud imagery and a static city-lights background layer. It's similar to their GOES Geocolor product. From an email:

QUOTE
CIRA has been working on generating Steve Miller's Geocolor product from Himawari-8. There are three notable improvements over the GOES version:

1) The daytime background is no longer static - the daytime imagery is now the Hybrid Atmospherically Corrected True Color

2) The nighttime city lights are based on a new 1-km dataset built from 1 year of VIIRS Day Night Band data. The lights are still static, but the resolution is improved over the GOES version.

3) The product is higher resolution in both space (1 km) and time (10 minutes) for the full disk.

For the nighttime portion, white colors are high ice clouds, and the reddish colors represent lower liquid water clouds.

We of course plan to generate a similar product from the GOES-R ABI. The current version from Himawari is experimental and we'll be working on improvements over the next year.


This replaces their previous RGB daytime product, so as of 2015-11-11 20:40 all of the images in the RAMMB/CIRA archive have night-time Geocolor coverage. You'll see these showing up in my new videos on Youtube in a day or two. Very cool stuff.

-d

scalbers - Jan 16 2016, 03:14 PM - Post #30

I've been looking into where one can obtain the Himawari imagery, including more of the raw imagery prior to construction of a color image. One site (requiring free registration, where you explain the purpose of use) can be found here:

http://www.eorc.jaxa.jp/ptree/

In making visually realistic color images, the CIRA processing does an interesting correction for the fact that the green band sits at a shorter wavelength (0.51 microns) than the ideal of about 0.55 microns. This shortfall is why land areas otherwise look too brown, as Steve Miller presented at the American Meteorological Society conference this week (some other key Himawari people were also there). It turns out that chlorophyll has a rather narrow reflectance peak right around 0.55 microns, which Himawari misses, suppressing the green component. As a solution, another chlorophyll reflectance peak in the near-IR is used to help adjust the green color.

Even though this correction gives a better color balance, the colors are still exaggerated relative to a visually realistic view, since the atmosphere (the blue sky in the foreground) is subtracted out via a Rayleigh correction. Thus, to make visually realistic imagery, the raw images would be needed; the green correction could be applied while withholding the Rayleigh correction.
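
A sketch of the hybrid-green idea described above: blend a small amount of the near-IR band into the 0.51-micron green band to approximate what a 0.55-micron band would see over vegetation. The 0.07 weight is illustrative only; the published corrections use tuned coefficients:

CODE
# Sketch: "hybrid green" correction for the shifted green band.
import numpy as np

def hybrid_green(green_051, nir_086, w_nir=0.07):
    """Blend green and near-IR reflectances (arrays scaled 0..1)."""
    return (1.0 - w_nir) * green_051 + w_nir * nir_086

# Made-up reflectances for a vegetated pixel: green gets a small boost.
print(hybrid_green(np.array([0.05]), np.array([0.45])))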


