Here is a real-time full color view of the Earth from the Himawari satellite, updated every half hour or so. This geostationary weather satellite is stationed over the longitude of Japan. The view is complete with orange sunglint off the ocean. You can click on the link to see the latest update.
Wonderful! Thank you.
I've tried (but failed) to find a source for the full-res data - it's just so stunning.
Not the full res data, but I was able to locate four higher resolution partial disk images, which if pieced together make a nearly full disk image. These can be found by looking at the raw full disk URL, e.g. http://www.jma.go.jp/en/gms/imgs_c/6/visible/0/201507290500-00.png, then change the number after "/imgs_c/" to 1, 2, 3, or 4. A number 0 gets you what appears to be a zoomed in region near the east coast of China.
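For anyone who wants to script the downloads, the URL pattern can be generated with a few lines. This is a sketch only - the path layout and timestamp format are inferred from the example URL above, and the site may have changed since:

```python
# Build JMA image URLs for the four higher-resolution partial-disk quadrants.
# The path layout is taken from the example URL in this post; treat it as an
# assumption that may no longer match the live site.

def quadrant_urls(timestamp):
    """Return quadrant URLs 1-4 for a 'YYYYMMDDHHMM' UTC timestamp."""
    base = "http://www.jma.go.jp/en/gms/imgs_c/{quad}/visible/0/{ts}-00.png"
    return [base.format(quad=q, ts=timestamp) for q in range(1, 5)]

for url in quadrant_urls("201507290500"):
    print(url)
```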
You can clearly see on the images that the North Pole is in constant daylight - pretty neat.
Good detective work and mosaic johnmerrit! Here is a (low-res) full-disk current day movie with a 10-minute cadence. Interesting to see how the sun glint changes as it moves across the ocean.
http://himawari8.nict.go.jp/himawari8-movie.htm#
And a saved animation is shown here on YouTube:
https://www.youtube.com/watch?v=SSqgMF596MI
The full disk images are now at full resolution. Incredible view of Super Typhoon SOUDELOR
Sounds intriguing. Is there a link where the full-disk full-resolution images are now regularly available, or at least for this typhoon?
Visible (Color) imagery, including animations, here:
http://www.jma.go.jp/en/gms/
Midday is about 0300 UTC.
Make sure "region" is Full Disk if you want that.
err, wait, though... Is that full resolution?
I got them here http://rammb.cira.colostate.edu/ramsdis/online/himawari-8.asp
Nice - however the full color ones look to be processed to remove the original brightness variations relating to lighting geometry and atmospheric scattering. The full disk images are also "only" half resolution, compared with the sample I posted at the top of the thread. The floaters do show the typhoon better, and go to 500m resolution in monochromatic imagery. This compares to 1km resolution for the global views, which have 11000x11000 pixels.
Here is a bit of info on a planned mechanism to make the Himawari-8 data available:
https://www.ssec.wisc.edu/mcidas/news/himawari_plans.html
Hi all! Recently I have been experimenting with applying optical-flow-based motion interpolation to earth-observing satellite images - ie. guessing intermediate frames to make smoother videos with artificially higher framerates. I found two open source software libraries which estimate optical flow using different algorithms, and both seem to perform fairly well - https://github.com/slowmoVideo/slowmoVideo/wiki which implements the Kanade-Lucas algorithm, and https://github.com/dthpham/butterflow which implements the Farneback algorithm.
I downloaded all of the high-resolution full-disk images from http://rammb.cira.colostate.edu/ramsdis/online/himawari-8.asp and have been testing the two libraries on cropped regions and short timespans from this dataset. For each method, I started with a 6 FPS video generated from the underlying images. Then I applied a 0.5x speed stretch and motion-interpolated the results to 60 FPS, resulting in a "20x interpolation" - ie. 1 out of every 20 frames is a real image, the other 19 are interpolated.
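To make the "20x interpolation" explicit, here is the frame-count bookkeeping as a tiny sketch (numbers only - the actual interpolation is done by the libraries above):

```python
# The "20x interpolation" arithmetic: a 6 FPS source video, slowed to 0.5x
# speed and motion-interpolated up to 60 FPS, carries one real frame per 20
# output frames.

def frames_per_real_frame(source_fps, speed_factor, target_fps):
    """Output frames (real + interpolated) for each real source frame."""
    effective_real_fps = source_fps * speed_factor  # real frames per output second
    return target_fps / effective_real_fps

factor = frames_per_real_frame(6, 0.5, 60)
print(factor)  # -> 20.0, i.e. 1 real frame + 19 interpolated frames
```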
Here is a comparison of the two methods, using one day of data which shows the twin typhoons Goni and Atsani in the Pacific. The images used are 1920x1080 pixels, so make sure you force the quality to "1080p60" for best viewing: https://www.youtube.com/watch?v=Skk58D3waQg
As you can see, the results seem to be pretty comparable during the daytime, when lighting conditions are fairly consistent from frame to frame. However, the Butterflow (Farneback) method appears to handle the dynamic lighting conditions near the terminator around the time of local twilight much better than the slowmoVideo (Kanade-Lucas) method.
Note that the speed of rotation appears to "jump" a few times - this is because most original frames are taken once-per-ten-minutes but there are a few frames missing, and I'm not correcting for this. Next on my to-do list is to handle this more gracefully by generating interpolated frames for the missing frames before performing the full 60 fps interpolation.
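Finding those missing frames in a nominal 10-minute cadence can be sketched like this (illustrative only, not the actual pipeline code):

```python
# Detect gaps in a sequence of image timestamps that should arrive every
# 10 minutes, so extra interpolated frames can be generated to fill them.

from datetime import datetime, timedelta

def find_gaps(timestamps, cadence=timedelta(minutes=10)):
    """Return (before, after) timestamp pairs that bracket missing frames."""
    ts = sorted(timestamps)
    return [(a, b) for a, b in zip(ts, ts[1:]) if b - a > cadence]

times = [datetime(2015, 8, 20, 0, m) for m in (0, 10, 30, 40)]  # 00:20 missing
print(find_gaps(times))  # one gap, between 00:10 and 00:30
```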
Overall I'm quite happy with the results - they may not serve any scientific purposes but they sure are pretty. I think I will produce some more videos using the Butterflow library once I handle the missing-frames problem. I may also integrate it with typhoon track data to produce "typhoon tracker" videos centered on the typhoons as they move around, but this would require a method for getting pixel location from lat/long, which would require reprojection, which I'm not sure if I have enough data for. This is my first attempt at motion interpolation, so any feedback or suggestions would be greatly appreciated!
Very nice and very impressive Dan. I may have to give this a try at some point. For tracking I can suggest that the pixels should be mappable to lat/lon by assuming a vertical perspective map projection. The geosynchronous satellite is at a known distance from the Earth and we could probably figure out the sub-point. I might employ such a projection to create cylindrical map projection images from the satellite images.
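As a rough illustration of that idea, here is the forward (lat/lon to image-plane) direction of the vertical perspective projection for a spherical Earth and a geostationary satellite. The sub-satellite longitude and the spherical approximation are assumptions for the sketch; precise work would need the satellite's real navigation parameters:

```python
# Vertical (near-side) perspective projection for a geostationary satellite
# over the equator, assuming a spherical Earth. sub_lon=140.7 (roughly the
# Himawari-8 sub-satellite longitude) is an assumption for illustration.

import math

R = 6371.0      # mean Earth radius, km (spherical approximation)
SAT = 42164.0   # geostationary orbit radius from the Earth's centre, km
P = SAT / R     # satellite distance in Earth radii

def latlon_to_xy(lat, lon, sub_lon=140.7):
    """Project lat/lon (degrees) to (x, y) km on the image plane.

    Returns None for points beyond the visible horizon.
    """
    phi = math.radians(lat)
    dlam = math.radians(lon - sub_lon)
    cos_c = math.cos(phi) * math.cos(dlam)
    if cos_c < 1.0 / P:          # far side of the limb, not visible
        return None
    k = (P - 1.0) / (P - cos_c)
    x = R * k * math.cos(phi) * math.sin(dlam)
    y = R * k * math.sin(phi)
    return x, y

print(latlon_to_xy(0.0, 140.7))   # sub-satellite point -> (0.0, 0.0)
print(latlon_to_xy(0.0, -39.3))   # antipode -> None (not visible)
```

To go from (x, y) in km to pixels, you would still need the image centre and the km-per-pixel scale, which depend on the particular product.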
Once the HD fully loaded it was amazing. I agree the Butterflow interpolator performed much better. I assume you plan to use this on Jupiter/Saturn pics? Can't wait for Juno pics to start coming down!
Thanks, Steve and Richard, for the kind words and good advice. I've been continuing to work on this project as I have time. Progress has been somewhat slow, as I ran into a bug in the Butterflow library when I tried to correct for the missing frames (see https://github.com/dthpham/butterflow/issues/13).
Last night was the first time I got the new code running with missing-image correction, and I let it run all night to render five more videos, with one day of data each. Here are the results - again, make sure you have the Youtube quality setting set manually to "1080p60" for best viewing: https://www.youtube.com/watch?v=PXD9MvLSXCk&list=PLrmCQL5hELCy-Q1Rl4ICG0x6Vu3mV9Uma
These are even slower, with 1 real frame and 59 interpolated frames per second of video (the previous video was twice as fast). I think these may even be a little too slow for my taste; I will probably render to a playback speed ~halfway in between the two in the future. Each 90-second x 60-FPS video takes about an hour and a half to render on my Macbook Pro, which is another limiting factor. But I hope to get it running soon on my desktop machine with a fast CPU+GPU, which should improve the render time by quite a bit.
It's still in a bit of a messy state, but the pipeline code is here: https://github.com/dandelany/animate-earth/blob/master/pipeline/main.js.
@Steve thanks very much for the notes re: projection, I knew the basic concepts but didn't know what it was called. I have been doing some reading about it, and may try to estimate some of the remaining unknown parameters soon to see how closely I can map lat/long to pixel location in order to do typhoon tracking. However, for now I am just focused on setting up my existing script to be as repeatable and hands-off as possible. I would really like to be able to just run it once every day, and have the script take care of not only rendering the videos but also splicing them together and uploading them to Youtube. This way I could have a nearly-automated Youtube account which posts new "Earth from Space" videos daily.
@Richard I'm not quite sure what else I'll apply this to yet. I've tried a couple of other quick and dirty experiments, with some decent results and some not-so-decent. The images really need to be close enough together (temporally) for the algorithm to detect feature movement from one image to the next - if the features move too far, you end up with some really gnarly artifacts that look much worse than the original, low-framerate video. But some Jupiter photos would be a great application. I'm also looking forward to applying the technique to the DSCOVR EPIC images once they start getting released regularly later this year - hopefully the planned cadence of 1 image per 30 minutes will be enough to get good results.
Dan, this is just fantastic. I've wanted to watch something similar for decades! I do agree this may be a little on the slow side for viewing, and the processing still shows a few jumps here and there.
I wonder if this process would work with HiRISE dust storm captures?
Just took a quick look so far - very nice again with the missing image correction and such. I actually like the slower pace, as there is so much detail that one can gaze at in a leisurely fashion.
How big are your files, Dan? Is it possible to host them somewhere? Youtube keeps throttling the videos and I lose detail when that happens. Very awesome movies.
Thanks a lot for the encouragement, all. Most people I mention this to give me a funny look and say something like "wait, why are you doing this?". "Because it's cool!" should be a good enough answer!
The lossless files I make are BIG, around 1 GB or more per minute of video. But they seem to compress well down to ~50-80 MB per minute without any artifacts that I can see. I could probably bring it down a bit further if I tweak some ffmpeg settings. I will probably only keep these compressed copies, since my hard drive will fill up soon otherwise. I'll work on finding a way to host these somewhere accessible. I would put them on Amazon S3 or something, but I don't want to be on the hook for a big bandwidth bill if the links get posted on a big website or someone hammers them with a script or something.
For now I have been putting files on Mega, which gives me 50 GB to play with. It seems to work OK, but it's not ideal, and I'm not sure if I will be able to upload there automatically using a script. Here is one of my favorites (80 MB): https://mega.nz/#!AAM2UZza!RIRxNRKAI6_IrsxJSdl1_94viMYBHHDx15UcjnJR7Hc (click "download through your browser"). More to come.
The most noticeable remaining artifact is that the color channels on some RGB frames are misaligned by a bit, and the resulting color shift looks odd in the video. I'm not sure how to handle these frames. Detecting and realigning them automatically with a script seems difficult or maybe impossible. I could just manually tag the bad frames, drop them, and interpolate around them, but somehow it feels wrong to just throw away the data, especially since some are only misaligned slightly.
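One possible approach to the misalignment (just an idea, not something in the pipeline): estimate the offset by brute-force search over small integer shifts, scoring each candidate with a correlation measure. Real frames would likely need subpixel refinement on top of this:

```python
# Estimate the horizontal offset between two colour channels by trying small
# integer shifts and keeping the one with the highest correlation score.

import numpy as np

def estimate_x_shift(ref, moved, max_shift=5):
    """Return the integer x-offset of `moved` relative to `ref`."""
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(moved, -s, axis=1)    # undo a candidate shift
        score = float(np.sum(ref * shifted))    # unnormalised correlation
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

rng = np.random.default_rng(0)
red = rng.random((64, 64))
green = np.roll(red, 3, axis=1)      # simulate a channel shifted by 3 pixels
print(estimate_x_shift(red, green))  # -> 3
```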
Thanks Dan, that's truly enchanting.
I would love to see a similar animated view of Typhoon Dujuan hitting Taiwan and smashing itself to pieces against the central mountain range. Unfortunately most of the action takes place (is taking place) at night. We will find out tomorrow what else got smashed to pieces.
http://www.cwb.gov.tw/V7e/observe/satellite/Sat_EA.htm
Ask, and you shall receive! I just uploaded https://www.youtube.com/playlist?list=PLrmCQL5hELCycTLYZ1f4IGbrft4mQU0JJ, and there are some pretty incredible views of Typhoon Dujuan taking aim at Taiwan from the past few days. I'll upload another one tonight with the most recent images. Usual disclaimer applies - make sure you set the quality to 1080p60. I've noticed that this quality setting seems to "stick" better in Chrome than in other browsers, FWIW.
I've gotten my Youtube upload script working fairly well, and now I'm in the process of rendering/uploading lots of daily videos to that one and these 3 other regional playlists: https://www.youtube.com/playlist?list=PLrmCQL5hELCzlmcwAGiHg740CCs6KA9y9, https://www.youtube.com/playlist?list=PLrmCQL5hELCy4wOoVZKLsplyFuby_aiur, and https://www.youtube.com/playlist?list=PLrmCQL5hELCxmcXaoEXVKyDE8CrIN1t-x. They each have almost a month of data now. I have a few more regions to work on - namely Australia; since it doesn't fit natively in my 1920x1080 crop window, I'll have to do a larger view and then downscale before creating the video (either that or split Australia across two regions). I'm also planning a couple of crops of different regions of the Pacific Ocean, since that's where a lot of the big storms initially form.
I'm still playing with the format of the Youtube channel, so let me know if you have feedback. Primarily I'm wondering whether the regional playlists should be in chronological order, so you can see the whole time span if you play from the beginning, or in reverse order so the newest videos are always at the top. I think I'm leaning towards reverse order; the oldest images in the RAMMB dataset have some weird processing applied, and I don't really want those at the top of the playlist. I suppose I could have two playlists per region, one for each order, but that may be confusing. I'm also considering rolling up the dailies into weekly videos for these playlists, so the videos are longer and you spend less time loading. However, I really like the immediacy of being able to see "yesterday from space" every day so I would probably upload them in addition to the dailies. This also confuses the playlist order issue a bit because, if the playlist uses weekly videos in reverse order, you'd see a week chronologically, followed by the previous week chronologically, etc...
Anyway, hope you enjoy! I haven't made progress yet on hosting the original video files somewhere, but I haven't forgotten about it, will look into it more soon.
That's terrific, Dan. Much more fun to watch from orbit than from ground level.
I can see that as you experiment with different processing options, you would want to lead with the most recent results that incorporate the benefit of all your experience.
Maybe I can put together some kind of playlist to view them in natural order.
By the way, do you have any insights about what the Taiwan CWB (in my link) is doing in order to provide their continuous night-and-day coverage with seemingly consistent "illumination"? Are they presenting some kind of false-colored view as seen from non-visible wavelengths?
I have noticed that low cloudiness and rain doesn't always show up in their hourly posted views - day or night. Sometimes it can be raining on the ground but appears perfectly clear in the satellite view. That may partially explain why the eye seems to be "wide open" in the posted photos, while in the animations there are many periods when it is clouded over. Did Dujuan have ...cataracts?
@tanjent Yes, in addition to the 3 visible bands, Himawari has 13 infrared bands which work at night. I assume that their process is to use one of those bands with an alpha channel to get transparency, then overlay the transparent image on an existing, cloudless RGB image with even daytime illumination. This also explains why they don't match up exactly with visible observations, since the IR bands are sensitive to different atmospheric phenomena than the visible.
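If that guess is right, the compositing step might look something like this in NumPy (purely illustrative - the CWB's actual process is unknown):

```python
# Blend IR-derived clouds over a static daytime RGB background, using the IR
# brightness (scaled to [0, 1]) as the alpha channel.

import numpy as np

def overlay_ir_clouds(background_rgb, ir_alpha, cloud_color=(1.0, 1.0, 1.0)):
    """Alpha-blend white clouds over an H x W x 3 background image."""
    alpha = ir_alpha[..., np.newaxis]    # H x W x 1, broadcasts over RGB
    cloud = np.asarray(cloud_color)
    return alpha * cloud + (1.0 - alpha) * background_rgb

bg = np.zeros((2, 2, 3))                    # dark "night" background
ir = np.array([[1.0, 0.0], [0.5, 0.0]])    # bright = cloudy
composite = overlay_ir_clouds(bg, ir)
print(composite[0, 0])  # fully cloudy pixel -> [1. 1. 1.]
```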
This week I have been trading e-mails with Steve Miller and Dan Lindsey at NOAA/CSU/CIRA, who are responsible for the great true-color images on the CIRA/RAMMB page that I'm using to create my videos. They have seen the Youtube channel, and Steve was kind of flabbergasted at first that someone else had managed to replicate their composite process, until I showed him the image credit (click 'see more' in the Youtube description) and explained that I was using their output. But they both seem excited about the motion interpolation technique, its capabilities, and its applicability to other datasets. So I am in the process of writing some better documentation for my project (https://github.com/dandelany/animate-earth/blob/master/README.asciidoc) to help them get Butterflow + my scripts running on their machines.
They are continuing to work on their image process, and just pushed an update which includes better color correction near the terminator, so the images are continually improving! And they're working on a new product similar to the VIIRS day/night band that includes night-time coverage, which should be incredible.
I also tried interpolating some hourly NOAA FIM cloud forecast images from Science on a Sphere that Steve Albers (scalbers) sent me. These work pretty well too, you can see a downscaled version and a cropped version here:
https://www.youtube.com/watch?v=KPpq_XmwKkM
https://www.youtube.com/watch?v=Q9zHM8IFB_k
Finally, I've been working my way through the paper at http://www.diva-portal.org/smash/get/diva2:273847/FULLTEXT01.pdf, and it's given me lots of ideas (most of them likely well outside my coding abilities) about how the technique could be improved for the specific use case of satellite imagery. I'll write some of them up when I have time, in the "discussion" section of the documentation mentioned above. In particular, the bit about incorporating a priori knowledge seems useful, as we often have lots of a priori knowledge about the expected flow from one image to the next (ie. most movement is constrained to the surface of the sphere, we know the timing of the terminator, etc.).
Thanks for saying so! I went away on vacation, and when I got back, I realized my scripts had gone haywire while I was gone. Due to a combination of lack of spare time, code bugs, and what I eventually diagnosed as bad Linux support for my hard drive's file system format, I haven't quite gotten the whole pipeline fully running again yet. But I just got a new drive which I'm reformatting and plan to try again tonight.
More excitingly, I've been in contact with Steve Miller at NOAA/CIRA who just returned from Tokyo where he presented their awesome new Himawari 8 Geocolor product (as well as one of my interpolated videos). Geocolor adds night coverage via a combination of infrared cloud imagery and a static city lights background layer. It's similar to their GOES-R Proving Ground product (http://rammb.cira.colostate.edu/ramsdis/online/goes-r_proving_ground.asp). From an email:
I've been looking into where one can obtain the Himawari imagery, including more of the raw imagery prior to construction of a color image. One site (requiring free registration where you explain the purpose of use) can be found here:
http://www.eorc.jaxa.jp/ptree/
In making visually realistic color images, the CIRA imagery does an interesting correction for the fact that the green band is at a shorter wavelength (0.51 microns) than the ideal of about 0.55 microns. This is why land areas would otherwise look too brown, as presented by Steve Miller at the American Meteorological Society conference this week. Some other key Himawari people were also at this conference. It turns out that chlorophyll has a rather narrow reflectance peak right around 0.55 microns that is missed by Himawari, thus suppressing the green component. As a solution, another chlorophyll peak in the IR is used to help adjust the green color.
Even though this correction helps in getting a better color balance, the colors are still exaggerated compared with a visually realistic view, since the atmosphere (blue sky in the foreground) is subtracted out via a Rayleigh correction. Thus, to make visually realistic imagery, the raw images would be needed; the green correction could be applied while withholding the Rayleigh correction.
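The hybrid-green idea can be sketched as a simple blend; the blend weight below is an illustrative assumption, not the value CIRA actually uses:

```python
# Mix a small fraction of a vegetation-sensitive near-IR band into the
# 0.51-micron green band to compensate for the missed 0.55-micron
# chlorophyll peak.

import numpy as np

def hybrid_green(green, nir, weight=0.1):
    """Blend `weight` of the near-IR band into green, clipped to [0, 1]."""
    return np.clip((1.0 - weight) * green + weight * nir, 0.0, 1.0)

green = np.array([0.20, 0.30])   # raw green reflectance samples
nir = np.array([0.60, 0.10])     # near-IR (bright over vegetation)
print(hybrid_green(green, nir))  # the vegetation pixel gets a green boost
```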
I had to stitch this together from 16 thumbnails because I didn't want to wait for tomorrow to see the eclipse in high resolution.
Unfortunately there isn't an easy way to share a huge image without it being compressed. I think this Dropbox link works for people without accounts:
https://www.dropbox.com/s/3hwdapm064m7ug1/015000_0_0.png?dl=0
Credit: Japan Meteorological Agency
We can see the Himawari "Loop of the Day" here:
http://rammb.cira.colostate.edu/ramsdis/online/loop_of_the_day/
A month long animation is here:
https://www.youtube.com/watch?v=1nWpLZoGhVg
Earth [preview]
Click thru for a video...
https://flic.kr/p/WoaCa6
Stretched playback to 1600%
Very nice - interesting that you are able to show a smooth animation and preserve the view of the atmosphere along the limb. Is it possible to show longer intervals from the individual vantage points?
Yes I have some shots in the queue... I'll post when they are ready.
Here are some tests... click thru for video.
https://flic.kr/p/WpupeH
https://flic.kr/p/XyTKjN
https://flic.kr/p/XqU3UR
https://flic.kr/p/WnnjYq
Pretty neat Sean - I thought I was out there in space and wondered if I would fall toward the Earth.
The sun glint (and twilight effect) is really neat to see and a good test for sun glint modeling. On the limb itself we quickly transition from the orange glint color to the blue atmosphere. Perhaps that's the way it really looks. I would suspect the limb edge would be slightly muted and we might see a few clouds or haze layers silhouetted in front of the blue atmosphere? Hard to get a consensus considering other imagery, and LEO imagery might look somewhat different. Perhaps more resolution would be needed to tell for sure.
Another note is that Himawari wavelengths are a bit different from CIE peaks, helping to explain why Australia looks on the brown side.
Ah thanks... bear in mind that I'm applying a time-stretch function to the image sequence which invents tween frames. I don't touch the color.
Here is a montage along with some music by Andreas Vollenweider.
'Clouds'
https://flic.kr/p/X3Y7Tq
*update*
4k stills from today...
https://flic.kr/p/XAZbTw
https://flic.kr/p/XAZ9wN
https://flic.kr/p/Xqc4mq
*another update*
12.5 megapixel portrait with image processing...
https://flic.kr/p/X5viFj
Typhoon KILO
https://flic.kr/p/XGc9Lk
September 02 2015
92 source images. Playback stretched to 2000%
Spectacular!!!
'Musicaux'
https://flic.kr/p/WtEUJa
273 source images [ from 116 ] cropped & interpolated to 1000% with music by Schubert
and a bonus photobomb...
https://flic.kr/p/XHeeYp
This is a 4k upscale test from a 1080 source crop & stretched to 1000% [3 seconds to 30]
2017_Aug13
https://flic.kr/p/XFeLUW
Eclipse
https://flic.kr/p/WyRhWr
March 9th 2016
Music by Andreas Vollenweider. Source extended x30
Full 8k version in the works as well as an edit with crops & zooms.
*update*
Here is the 8k version...
https://flic.kr/p/XcyksY
https://youtu.be/s5B4BCUoOuw
Cropped from full res sequence & interpolated x10
ECLIPSE Closeup
https://flic.kr/p/XJWdJU
ECLIPSE Limb
https://flic.kr/p/Xy7JiW
ECLIPSE Sunset
https://flic.kr/p/XzKx5y
Here is the edit...
ECLIPSE
https://flic.kr/p/XQRNhu
Here is a processed image of yesterday's eclipse at full resolution
https://flic.kr/p/WGciUY
Video of Typhoon Noru from August 2nd 2017
https://flic.kr/p/XPexmm
Here is another clip showing an 'atmospheric river' from October 15th, 2017.
Click thru for a video...
https://flic.kr/p/Zs8x7Q
https://www.youtube.com/watch?v=HQacQpHhze4
Obvious artifacts to repair, but this served as a stress test on my new PC build... the 4k frame tracked across a 50 megapixel plate covering the entire upper Earth [20 GB of data to wade through]
Adam,
Sean posts his stuff as CC-NC-ND: https://creativecommons.org/licenses/by-nc-nd/2.0/
Typhoon Lan, October 21, 2017
60fps uptimed
https://flic.kr/p/23EyhWp
I figured out how to get full frame interpolated 121 megapixel Himawari time-lapse mapped to 3D to enable limited camera positioning.
I did it by slicing each full-resolution frame into nine 8k slices, tweening them & stitching back to full frame.
134 source frames were sliced, tweened & stitched into 1,600 frames of 121 megapixels each, adding up to 143 GB.
It's nice to get this data into a volume.
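The slice-and-stitch bookkeeping might look something like this, assuming a 3x3 grid (the actual slicing tool isn't named above):

```python
# Cut a frame into a 3x3 grid of tiles small enough for the interpolator to
# handle, then reassemble the processed tiles into a full frame.

import numpy as np

def slice_frame(frame, rows=3, cols=3):
    """Split an H x W array into rows*cols tiles, row-major order."""
    return [tile for band in np.array_split(frame, rows, axis=0)
            for tile in np.array_split(band, cols, axis=1)]

def stitch_frame(tiles, rows=3, cols=3):
    """Reassemble tiles produced by slice_frame into one frame."""
    return np.vstack([np.hstack(tiles[r * cols:(r + 1) * cols])
                      for r in range(rows)])

frame = (np.arange(90 * 120) % 251).reshape(90, 120)  # stand-in for an image
tiles = slice_frame(frame)
rebuilt = stitch_frame(tiles)
print(len(tiles), np.array_equal(rebuilt, frame))  # -> 9 True
```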
The following are initial test videos...
https://flic.kr/p/26E8p1C
https://youtu.be/PoQqnd-Cz_U
https://flic.kr/p/J8gXoV
https://youtu.be/fcfMcUc0jsU
...and some stills.
https://flic.kr/p/KDqHDf
https://flic.kr/p/26pCn2R
https://flic.kr/p/26FuEo3
https://flic.kr/p/26pCmAR
*update: fixed all links in the post above*
Earth - Solar Eclipse, March 9th 2016
https://flic.kr/p/2543ags
Made with interpolated time-lapse footage projected to 3D
https://www.youtube.com/watch?v=O0nfNCVcXcc
I need to figure out why the gamma is stretched on the 8k Youtube profile, so here are some 4k versions with correct gamma...
https://www.youtube.com/watch?v=bhijgkoTJWQ
https://www.youtube.com/watch?v=fXTY5Rnvl5w
These videos are absolutely awesome - thanks for posting them. Due to my area of residence I might be biased but the Earth is probably the most beautiful looking planet in the solar system when seen from space.
Thanks Bjorn... Earth is a beauty alright.
Pacific
https://flic.kr/p/JctKFi
https://www.youtube.com/watch?v=Xf2w5vpR9oM
Australia
https://flic.kr/p/256FM4J
https://www.youtube.com/watch?v=UJSj3KzW0RA
Last one for this batch...some vertical rotation.
https://flic.kr/p/257Yhpu
https://youtu.be/6Xy8JWYCjMc
Earth Portrait_March 9th 2016
https://flic.kr/p/27MSyyN
OK... this is really strange. The Flickr BBCode submitted to each post has quotation marks added to the Flickr URL portion, which then kills the link. I have been manually editing them to get the working links back.
eg:
this...
url=https://flic.kr/p/27MSyyN
becomes this...
url="https://flic.kr/p/27MSyyN"
The problem is that this doesn't appear to be consistent, and I now wonder how many of my posts suffer this error with the resulting broken link.
*the links with quotes now appear to be working so it could be a problem at the Flickr end*
Vietnam, Laos, Cambodia & Thailand...
https://flic.kr/p/KMQBeq
Added some masked limb processing and atmosphere.
https://www.youtube.com/watch?v=OYzy86p0qMs
Here is a sequence of hi res portraits [8k & 16k]
https://flic.kr/p/KKHzKj
https://flic.kr/p/26N1H3L
https://flic.kr/p/26wadWV
https://flic.kr/p/27PdePh
https://flic.kr/p/JeSMKk
https://flic.kr/p/26waicP
Himawari 8 capture of the Tonga eruption on January 15th 2022, 5:00 UTC
I was wondering if DSCOVR would have caught it too, and sure enough it did:
https://epic.gsfc.nasa.gov/archive/natural/2022/01/15/png/epic_1b_20220115042159.png
That image was taken at 04:21:59 GMT.