Himawari
scalbers
Here is a real-time full color view of the Earth from the Himawari satellite, updated every half hour or so. This geostationary weather satellite is stationed over the longitude of Japan. The view is complete with orange sunglint off the ocean. You can click on the link to see the latest update.

[image attachment]

http://www.jma.go.jp/en/gms/largec.html?ar...=1&mode=UTC

I previously posted this high-resolution sample image in the Whole Earth Images thread. We can look around for the full-resolution real-time data archive; the full-disk images would be about 11000 pixels wide.
lyford
Wonderful! Thank you.
scalbers
Another view with a gibbous Earth:

[image attachment]
djellison
I've tried ( but failed ) to find a source for the full res data - it's just so stunning.
johnmerritt
Not the full res data, but I was able to locate four higher resolution partial disk images, which if pieced together make a nearly full disk image. These can be found by looking at the raw full disk URL, e.g. http://www.jma.go.jp/en/gms/imgs_c/6/visib...07290500-00.png, then change the number after "/imgs_c/" to 1, 2, 3, or 4. A number 0 gets you what appears to be a zoomed in region near the east coast of China.
MahFL
You can clearly see on the images that the North Pole is in constant daylight, pretty neat.
scalbers
Good detective work and mosaic, johnmerritt! Here is a (low-res) full-disk current-day movie with a 10-minute cadence. Interesting to see how the sun glint changes as it moves across the ocean.

http://himawari8.nict.go.jp/himawari8-movie.htm#

And a saved animation is shown here on YouTube:

https://www.youtube.com/watch?v=SSqgMF596MI
stevesliva
QUOTE (scalbers @ Jul 29 2015, 02:42 PM) *
Interesting to see how the sun glint changes as it moves across the ocean.


Yeah, watch that area - it's a stark illustration of phase effects in some places, as areas rapidly go from brighter than their surroundings to darker.
xflare
The full disk images are now at full resolution. Incredible view of Super Typhoon SOUDELOR
scalbers
Sounds intriguing. Is there a link where the full-disk full-resolution images are now regularly available, or at least for this typhoon?
stevesliva
Visible (Color) imagery, including animations, here:
http://www.jma.go.jp/en/gms/

Midday is about 0300 UTC.

Make sure "region" is Full Disk if you want that.

err, wait, though... Is that full resolution?
xflare
I got them here http://rammb.cira.colostate.edu/ramsdis/on.../himawari-8.asp
scalbers
Nice - however, the full color ones look to be processed to remove the original brightness variations relating to lighting geometry and atmospheric scattering. The full disk images are also "only" half resolution compared with the sample I posted at the top of the thread. The floaters do show the typhoon better, and go to 500 m resolution in monochromatic imagery. This compares to 1 km resolution for the global views, which are 11000x11000 pixels.
scalbers
Here is a bit of info on a planned mechanism to make the Himawari-8 data available:

https://www.ssec.wisc.edu/mcidas/news/himawari_plans.html
Dan Delany
Hi all! Recently I have been experimenting with applying optical-flow-based motion interpolation to earth-observing satellite images, i.e. guessing intermediate frames to make smoother videos with artificially higher framerates. I found two open-source software libraries that estimate optical flow using different algorithms, and both seem to perform fairly well: slowmoVideo, which implements the Kanade-Lucas algorithm, and Butterflow, which implements the Farneback algorithm.

I downloaded all of the high-resolution full-disk images from the RAMMB link posted upthread and have been testing the two libraries on cropped regions and short timespans from this dataset. For each method, I started with a 6 FPS video generated from the underlying images. Then I applied a 0.5x speed stretch and motion-interpolated the result to 60 FPS, resulting in a "20x interpolation", i.e. 1 out of every 20 frames is a real image and the other 19 are interpolated.
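
For anyone curious how this kind of flow-based interpolation works under the hood, here is a minimal sketch of the warp-and-blend idea using OpenCV's Farneback implementation (this is not Butterflow's actual code, and the filenames are made up):

CODE
# Minimal sketch of flow-based frame interpolation: estimate dense Farneback
# optical flow between two consecutive frames, warp each frame toward the
# midpoint in time, and blend the two warped frames.
import cv2
import numpy as np

def interpolate_frame(frame_a, frame_b, t=0.5):
    """Synthesize a frame at fractional time t between frame_a and frame_b."""
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)
    # flow[y, x] = (dx, dy): how far each pixel of frame_a moves to reach frame_b
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 4, 25, 3, 7, 1.5, 0)
    h, w = gray_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Backward warping: sample frame_a at p - t*flow and frame_b at p + (1-t)*flow
    map_ax = (grid_x - t * flow[..., 0]).astype(np.float32)
    map_ay = (grid_y - t * flow[..., 1]).astype(np.float32)
    map_bx = (grid_x + (1 - t) * flow[..., 0]).astype(np.float32)
    map_by = (grid_y + (1 - t) * flow[..., 1]).astype(np.float32)
    warped_a = cv2.remap(frame_a, map_ax, map_ay, cv2.INTER_LINEAR)
    warped_b = cv2.remap(frame_b, map_bx, map_by, cv2.INTER_LINEAR)
    return cv2.addWeighted(warped_a, 1.0 - t, warped_b, t, 0)

a = cv2.imread("frame_0500.png")
b = cv2.imread("frame_0510.png")
cv2.imwrite("frame_0505_interpolated.png", interpolate_frame(a, b))

The real libraries do quite a bit more than this, but estimating a dense flow field and warping both neighbouring frames toward the intermediate time is the core of the technique.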

Here is a comparison of the two methods, using one day of data showing the twin typhoons Goni and Atsani in the Pacific. The images used are 1920x1080 pixels, so make sure you force the quality to "1080p60" for best viewing: https://www.youtube.com/watch?v=Skk58D3waQg

As you can see, the results seem to be pretty comparable during the daytime, when lighting conditions are fairly consistent from frame to frame. However, the Butterflow (Farneback) method appears to handle the dynamic lighting conditions near the terminator around the time of local twilight much better than the slowmoVideo (Kanade-Lucas) method.

Note that the speed of rotation appears to "jump" a few times - this is because most original frames are taken once per ten minutes but a few frames are missing, and I'm not correcting for this. Next on my to-do list is to handle this more gracefully by generating interpolated frames in place of the missing ones before performing the full 60 fps interpolation.
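
As a rough sketch of how the missing frames could be detected (the filename pattern below is hypothetical, not RAMMB's actual naming scheme), the bookkeeping boils down to parsing timestamps and looking for breaks in the 10-minute cadence:

CODE
# Parse timestamps out of the downloaded filenames and report gaps in the
# 10-minute cadence. "himawari_YYYYMMDDHHMM.png" is a made-up pattern.
from datetime import datetime, timedelta
from pathlib import Path

frames = sorted(Path("frames").glob("himawari_*.png"))
times = [datetime.strptime(p.stem.split("_")[1], "%Y%m%d%H%M") for p in frames]
step = timedelta(minutes=10)
for earlier, later in zip(times, times[1:]):
    missing = int((later - earlier) / step) - 1
    if missing > 0:
        print(f"{missing} frame(s) missing between {earlier} and {later}")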

Overall I'm quite happy with the results - they may not serve any scientific purpose, but they sure are pretty. I think I will produce some more videos using the Butterflow library once I handle the missing-frames problem. I may also integrate it with typhoon track data to produce "typhoon tracker" videos centered on the typhoons as they move around, but this would require a way to get pixel location from lat/long, which would require reprojection, and I'm not sure I have enough data for that. This is my first attempt at motion interpolation, so any feedback or suggestions would be greatly appreciated!
scalbers
Very nice and very impressive, Dan. I may have to give this a try at some point. For tracking, I can suggest that the pixels should be mappable to lat/lon by assuming a vertical perspective map projection. The geosynchronous satellite is at a known distance from the Earth, and we could probably figure out the sub-satellite point. I might employ such a projection to create cylindrical map projection images from the satellite images.
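
As a rough sketch of how this could look in practice, pyproj's geostationary ("geos") projection can map lat/lon to projection-plane coordinates and then to pixels. The sub-satellite longitude (about 140.7E for Himawari-8), satellite height, full-disk half-width and 11000-pixel image size below are assumed round numbers, not values read from the actual data files:

CODE
# Rough lat/lon -> full-disk pixel mapping using the geostationary projection.
from pyproj import Proj

geos = Proj(proj="geos", h=35785831.0, lon_0=140.7, sweep="y", ellps="WGS84")

def latlon_to_pixel(lat, lon, n_pix=11000, half_width_m=5500000.0):
    x, y = geos(lon, lat)                                   # metres in the projection plane
    col = (x + half_width_m) / (2 * half_width_m) * n_pix
    row = (half_width_m - y) / (2 * half_width_m) * n_pix   # row 0 at the top (north)
    return int(round(row)), int(round(col))

# e.g. a typhoon centre near 21N, 128E:
print(latlon_to_pixel(21.0, 128.0))

Going the other way (pixel to lat/lon) is just the inverse transform, which is what a reprojection to a cylindrical map would need.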
hendric
Once the HD version fully loaded it was amazing. I agree the Butterflow interpolator performed much better. I assume you plan to use this on Jupiter/Saturn pics? Can't wait for Juno pics to start coming down!
Dan Delany
Thanks, Steve and Richard, for the kind words and good advice. I've been continuing to work on this project as I have time. Progress has been somewhat slow, as I ran into a bug in the Butterflow library when I tried to correct for the missing frames - however, I reached out to the creator of the library and he has been kind enough to help me out and fix the issues.

Last night was the first time I got the new code running with missing-image correction, and I let it run all night to render five more videos, with one day of data each. Here are the results - again, make sure you have the Youtube quality setting set manually to "1080p60" for best viewing: https://www.youtube.com/watch?v=PXD9MvLSXCk...ICG0x6Vu3mV9Uma

These are even slower, with 1 real frame and 59 interpolated frames per second of video (the previous video was twice as fast). I think these may even be a little too slow for my taste; I will probably render at a playback speed about halfway in between the two in the future. Each 90-second, 60-FPS video takes about an hour and a half to render on my Macbook Pro, which is another limiting factor. But I hope to get it running soon on my desktop machine with a fast CPU+GPU, which should improve the render time by quite a bit.

It's still in a bit of a messy state, but my script is on Github in case it's helpful for anyone else.

@Steve thanks very much for the notes re: projection, I knew the basic concepts but didn't know what it was called. I have been doing some reading about it, and may try to estimate some of the remaining unknown parameters soon to see how closely I can map lat/long to pixel location in order to do typhoon tracking. However, for now I am just focused on setting up my existing script to be as repeatable and hands-off as possible. I would really like to be able to just run it once every day, and have the script take care of not only rendering the videos but also splicing them together and uploading them to Youtube. This way I could have a nearly-automated Youtube account which posts new "Earth from Space" videos daily.

@Richard I'm not quite sure what else I'll apply this to yet. I've tried a couple of other quick and dirty experiments, with some decent results and some not-so-decent. The images really need to be close enough together (temporally) for the algorithm to detect feature movement from one image to the next - if the features move too far, you end up with some really gnarly artifacts that look much worse than the original, low-framerate video. But some Jupiter photos would be a great application. I'm also looking forward to applying the technique to the DSCOVR EPIC images once they start being released regularly later this year - hopefully the planned cadence of 1 image per 30 minutes will be enough to get good results.
ZLD
Dan, this is just fantastic. I've wanted to watch something similar for decades! I do agree it may be a little on the slow side for viewing, and the processing still shows a few jumps here and there.

I wonder if this process would work with HiRISE dust storm captures?
scalbers
Just took a quick look so far - very nice again with the missing image correction and such. I actually like the slower pace, as there is so much detail that one can gaze at in a leisurely fashion.
hendric
How big are your files Dan? Is it possible to host them somewhere? Youtube keeps throttling the videos and I lose detail when that happens. Very awesome movies.
Dan Delany
Thanks a lot for the encouragement, all. Most people I mention this to give me a funny look and say something like "wait, why are you doing this?". "Because it's cool!" should be a good enough answer!

The lossless files I make are BIG, around 1 GB or more per minute of video. But they seem to compress well down to ~50-80 MB per minute without any artifacts that I can see. I could probably bring it down a bit further if I tweak some ffmpeg settings. I will probably only keep these compressed copies since my hard drive will fill up soon otherwise. :) I'll work on finding a way to host these somewhere accessible. I would put them on Amazon S3 or something, but I don't want to be on the hook for a big bandwidth bill if the links get posted on a big website or someone hammers them with a script or something.

For now I have been putting files on Mega, which gives me 50 GB to play with. It seems to work OK, but it's not ideal. Not sure if I will be able to upload there automatically using a script. Here's one I uploaded - Japan on 9/11/2015 (80MB), one of my favorites. (Click "download through your browser".) More to come.

The most noticeable remaining artifact is that the color channels on some of the RGB frames are misaligned by a bit, and the resulting color shift looks odd in the video. I'm not sure how to handle these frames. Detecting and realigning them automatically with a script seems difficult or maybe impossible. I could just manually tag the bad frames, drop them, and interpolate around them, but somehow it feels wrong to just throw away the data, especially since some are only misaligned by a bit.
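
One possible approach - offered only as an untested sketch, with made-up filenames - would be phase correlation of each colour channel against a reference channel, shifting a channel back only when the detected offset is small and flagging the frame otherwise:

CODE
# Untested sketch: re-align slightly shifted colour channels by phase correlation
# against the green channel, applying the shift only when the offset is small.
import numpy as np
from PIL import Image
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

def realign_channels(rgb, max_shift_px=5.0):
    ref = rgb[..., 1].astype(float)           # green as the reference channel
    out = rgb.astype(float)
    for c in (0, 2):                          # re-register red and blue against green
        offset = phase_cross_correlation(ref, out[..., c], upsample_factor=10)[0]
        if np.max(np.abs(offset)) > max_shift_px:
            return None                       # implausibly large shift: flag for manual review
        out[..., c] = nd_shift(out[..., c], offset)
    return np.clip(out, 0, 255).astype(np.uint8)

frame = np.asarray(Image.open("misaligned_frame.png").convert("RGB"))
fixed = realign_channels(frame)
if fixed is not None:
    Image.fromarray(fixed).save("realigned_frame.png")

Frames where the detected shift is implausibly large could then be dropped and interpolated across, as described above.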
hendric
Thanks Dan, that's truly enchanting.
tanjent
I would love to see a similar animated view of Typhoon Dujuan hitting Taiwan and smashing itself to pieces against the central mountain range. Unfortunately most of the action takes place (is taking place) at night. We will find out tomorrow what else got smashed to pieces.

http://www.cwb.gov.tw/V7e/observe/satellite/Sat_EA.htm
Dan Delany
Ask, and you shall receive. :) I just uploaded some videos to this playlist for a cropped region that includes Eastern China and Taiwan, and there are some pretty incredible views of Typhoon Dujuan taking aim at Taiwan from the past few days. I'll upload another one tonight with the most recent images. Usual disclaimer applies - make sure you set the quality to 1080p60. I've noticed that this quality setting seems to "stick" better in Chrome than in other browsers, FWIW.

I've gotten my Youtube upload script working fairly well, and now I'm in the process of rendering/uploading lots of daily videos to that one and these 3 other regional playlists: Japan/Korea/Beijing/Shanghai, Thailand/Philippines (+others), and Indonesia/Malaysia/Singapore. They each have almost a month of data now. I have a few more regions to work on - namely Australia; since it doesn't fit natively in my 1920x1080 crop window, I'll have to do a larger view and then downscale before creating the video (either that or split Australia across two regions). I'm also planning a couple of crops of different regions of the Pacific Ocean, since that's where a lot of the big storms initially form.

I'm still playing with the format of the Youtube channel, so let me know if you have feedback. Primarily I'm wondering whether the regional playlists should be in chronological order, so you can see the whole time span if you play from the beginning, or in reverse order so the newest videos are always at the top. I think I'm leaning towards reverse order; the oldest images in the RAMMB dataset have some weird processing applied, and I don't really want those at the top of the playlist. I suppose I could have two playlists per region, one for each order, but that may be confusing. I'm also considering rolling up the dailies into weekly videos for these playlists, so the videos are longer and you spend less time loading. However, I really like the immediacy of being able to see "yesterday from space" every day so I would probably upload them in addition to the dailies. This also confuses the playlist order issue a bit because, if the playlist uses weekly videos in reverse order, you'd see a week chronologically, followed by the previous week chronologically, etc...

Anyway, hope you enjoy! I haven't made progress yet on hosting the original video files somewhere, but I haven't forgotten about it, will look into it more soon.
tanjent
That's terrific, Dan. Much more fun to watch from orbit than from ground level.
I can see that as you experiment with different processing options, you would want to lead with the most recent results that incorporate the benefit of all your experience.
Maybe I can put together some kind of playlist to view them in natural order.

By the way, do you have any insights about what the Taiwan CWB (in my link) is doing in order to provide their continuous night-and-day coverage with seemingly consistent "illumination"? Are they presenting some kind of false-colored view as seen from non-visible wavelengths?

I have noticed that low cloudiness and rain don't always show up in their hourly posted views - day or night. Sometimes it can be raining on the ground but appear perfectly clear in the satellite view. That may partially explain why the eye seems to be "wide open" in the posted photos, while in the animations there are many periods when it is clouded over. Did Dujuan have ...cataracts?
Dan Delany
@tanjent Yes, in addition to the 3 visible bands, Himawari has 13 infrared bands which work at night. I assume that their process is to use one of those bands with an alpha channel to get transparency, then overlay the transparent image on an existing, cloudless RGB image with even daytime illumination. This also explains why they don't match up exactly with visible observations, since the IR bands are sensitive to different atmospheric phenomena than the visible.
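
As a toy illustration of that kind of compositing (purely a guess at the idea, not CWB's actual processing), one could rescale an IR brightness-temperature image into a cloud-opacity mask and blend white clouds over a static daytime background:

CODE
# Toy version of IR-over-background compositing, purely illustrative.
import numpy as np

def ir_overlay(background_rgb, ir_bt_kelvin, bt_clear=295.0, bt_opaque=220.0):
    """background_rgb: HxWx3 floats in 0-1; ir_bt_kelvin: HxW brightness temperatures."""
    # Colder cloud tops -> more opaque; warm (clear) scenes stay transparent.
    alpha = np.clip((bt_clear - ir_bt_kelvin) / (bt_clear - bt_opaque), 0.0, 1.0)
    clouds = np.ones_like(background_rgb)     # render the cloud layer as white
    return alpha[..., None] * clouds + (1.0 - alpha[..., None]) * background_rgb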

This week I have been trading e-mails with Steve Miller and Dan Lindsey at NOAA/CSU/CIRA, who are responsible for the great true-color images on the CIRA/RAMMB page that I'm using to create my videos. They have seen the Youtube channel, and Steve was kind of flabbergasted at first that someone else had managed to replicate their composite process, until I showed him the image credit (click 'see more' on the Youtube description) and explained that I was using their output. :D But they both seem excited about the motion interpolation technique, its capabilities, and its applicability to other datasets. So I am in the process of writing some better documentation for my project (which can be found here), to help them get Butterflow + my scripts running on their machines.

They are continuing to work on their image process, and just pushed an update which includes better color correction near the terminator, so the images are continually improving! And they're working on a new product similar to the VIIRS day/night band that includes night-time coverage, which should be incredible.

I also tried interpolating some hourly NOAA FIM cloud forecast images from Science on a Sphere that Steve Albers (scalbers) sent me. These work pretty well too, you can see a downscaled version and a cropped version here:

https://www.youtube.com/watch?v=KPpq_XmwKkM
https://www.youtube.com/watch?v=Q9zHM8IFB_k

Finally, I've been working my way through the original paper about the optical flow algorithm underlying the interpolator, and it's given me lots of ideas (most of them likely well outside my coding abilities) about how the technique could be improved for the specific use case of satellite imagery. I'll write some of them up when I have time, in the "discussion" section of the documentation mentioned above. In particular, the bit about incorporating a priori knowledge seems useful, as we often have lots of a priori knowledge about the expected flow from one image to the next (i.e. most movement is constrained to the surface of the sphere, we know the timing of the terminator, etc.).
scalbers
QUOTE (Dan Delany @ Sep 29 2015, 08:44 PM) *
Anyway, hope you enjoy! I haven't made progress yet on hosting the original video files somewhere, but I haven't forgotten about it, will look into it more soon.

I enjoy these so much that I miss them when they are missing or garbled, as they have been for the past couple of days. There's a typhoon setting up east of the Philippines.
Dan Delany
Thanks for saying so smile.gif I went away on vacation, and when I got back, I realized my scripts had gone haywire while I was gone. Due to a combination of lack of spare time, code bugs, and what I eventually diagnosed as bad Linux support for my hard drive's file system format, I haven't quite gotten the whole pipeline fully running again yet. But I just got a new drive which I'm reformatting and plan to try again tonight.

More excitingly, I've been in contact with Steve Miller at NOAA/CIRA who just returned from Tokyo where he presented their awesome new Himawari-8 Geocolor product (as well as one of my interpolated videos :D). Geocolor adds night coverage via a combination of infrared cloud imagery and a static city lights background layer. It's similar to their GOES Geocolor product. From an email:

QUOTE
CIRA has been working on generating Steve Miller's Geocolor product from Himawari-8. There are three notable improvements over the GOES version:

1) The daytime background is no longer static - the daytime imagery is now the Hybrid Atmospherically Corrected True Color

2) The nighttime city lights are based on a new 1-km dataset built from 1 year of VIIRS Day Night Band data. The lights are still static, but the resolution is improved over the GOES version.

3) The product is higher resolution in both space (1 km) and time (10 minutes) for the full disk.

For the nighttime portion, white colors are high ice clouds, and the reddish colors represent lower liquid water clouds.

We of course plan to generate a similar product from the GOES-R ABI. The current version from Himawari is experimental and we'll be working on improvements over the next year.


This replaces their previous RGB daytime product, so as of 2015-11-11 20:40 all of the images in the RAMMB/CIRA archive have night-time Geocolor coverage. You'll see these showing up in my new videos on Youtube in a day or two. Very cool stuff.

-d
scalbers
I've been looking into where one can obtain the Himawari imagery, including more of the raw imagery prior to construction of a color image. One site (requiring free registration, where you explain your purpose of use) can be found here.

http://www.eorc.jaxa.jp/ptree/

In making visually realistic color images, the CIRA processing does an interesting correction for the fact that Himawari's green band is at a shorter wavelength (0.51 microns) than the ideal of about 0.55 microns. This is why uncorrected land areas look too brown, as presented by Steve Miller at the American Meteorological Society conference this week. Some other key Himawari people were also at this conference. It turns out that chlorophyll has a rather narrow reflectance peak right around 0.55 microns that is missed by Himawari, thus suppressing the green component. As a solution, another chlorophyll reflectance peak in the near-infrared is used to help adjust the green color.

Even though this correction helps in getting a better color balance, the colors are still exaggerated compared with a visually realistic view, since the atmosphere (the blue sky in the foreground) is subtracted out via a Rayleigh correction. Thus, to make visually realistic imagery, the raw images would be needed; the green correction could be applied while withholding the Rayleigh correction.
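
As a rough illustration of the kind of green adjustment described above (the blend weight below is a ballpark guess, not CIRA's actual coefficient):

CODE
# Illustrative only: synthesize a more vegetation-sensitive green channel by
# blending the 0.51 micron green band with the 0.86 micron near-IR band, where
# vegetation is strongly reflective.
import numpy as np

def hybrid_green(green_051, nir_086, nir_fraction=0.07):
    """Blend reflectances (0-1 floats) from the green and near-IR bands."""
    return np.clip((1.0 - nir_fraction) * green_051 + nir_fraction * nir_086, 0.0, 1.0)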
anticitizen2
I had to stitch this together from 16 thumbnails because I didn't want to wait for tomorrow to see the eclipse in high resolution.
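
For reference, pasting a 4x4 grid of equally sized tiles into one mosaic takes only a few lines with PIL - a quick sketch with hypothetical tile filenames:

CODE
# Paste a 4x4 grid of equally sized tiles into one mosaic (made-up filenames).
from PIL import Image

tiles = [Image.open(f"tile_{row}_{col}.png") for row in range(4) for col in range(4)]
tile_w, tile_h = tiles[0].size
mosaic = Image.new("RGB", (4 * tile_w, 4 * tile_h))
for i, tile in enumerate(tiles):
    row, col = divmod(i, 4)
    mosaic.paste(tile, (col * tile_w, row * tile_h))
mosaic.save("full_disk_mosaic.png")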

Unfortunately there isn't an easy way to share a huge image without it being compressed. I think this dropbox link works for people without accounts

https://www.dropbox.com/s/3hwdapm064m7ug1/0...00_0_0.png?dl=0

Credit: Japan Meteorological Agency
scalbers
We can see the Himawari "Loop of the Day" here:

http://rammb.cira.colostate.edu/ramsdis/on...oop_of_the_day/

A month long animation is here:

https://www.youtube.com/watch?v=1nWpLZoGhVg
Sean
Earth [preview]

Click thru for a video...



Stretched playback to 1600%
scalbers
Very nice - interesting that you are able to show a smooth animation and preserve the view of the atmosphere along the limb. Is it possible to show longer intervals from the individual vantage points?
Sean
Yes I have some shots in the queue... I'll post when they are ready.
Sean
Here are some tests... click thru for video.

scalbers
Pretty neat Sean - I thought I was out there in space and wondered if I would fall toward the Earth.

The sun glint (and twilight effect) is really neat to see and a good test for sun glint modeling. On the limb itself we quickly transition from the orange glint color to the blue atmosphere. Perhaps that's the way it really looks. I would suspect the limb edge would be slightly muted and we might see a few clouds or haze layers silhouetted in front of the blue atmosphere? Hard to get a consensus considering other imagery, and LEO imagery might look somewhat different. Perhaps more resolution would be needed to tell for sure.

Another note is that Himawari wavelengths are a bit different from CIE peaks, helping to explain why Australia looks on the brown side.
Sean
Ah thanks... bear in mind that I'm applying a time-stretch function to the image sequence, which invents tween frames. I don't touch the color.

Here is a montage along with some music by Andreas Vollenweider.

'Clouds'


*update*
4k stills from today...

*another update*
12.5 megapixel portrait with image processing...

Sean
Typhoon KILO


September 02 2015

92 source images. Playback stretched to 2000%


nprev
Spectacular!!!
Sean
'Musicaux'


273 source images [ from 116 ] cropped & interpolated to 1000% with music by Schubert

and a bonus photobomb...

stevesliva
QUOTE (nprev @ Aug 13 2017, 09:42 PM) *
Spectacular!!!


Specular!!! :D

I love the convection in Musicaux... cloudtops that grow like a mushroom, with a nice shadow effect on the lower cloud layer that makes the mushrooming clear. Amazing.
Sean
This is a 4k upscale test from a 1080 source crop & stretched to 1000% [3 seconds to 30]

2017_Aug13



Sean
Eclipse


March 9th 2016

Music by Andreas Vollenweider. Source extended x30

Full 8k version in the works as well as an edit with crops & zooms.

*update*
Here is the 8k version...


Youtube


Sean
Cropped from full res sequence & interpolated x10

ECLIPSE Closeup

ECLIPSE Limb

ECLIPSE Sunset

Sean
Here is the edit...

ECLIPSE

Sean
Here is a processed image of yesterday's eclipse at full resolution



Sean
Video of Typhoon Noru from August 2nd 2017

Sean
Here is another clip showing an 'atmospheric river' from October 15th, 2017.

Click thru for a video...



4k version over on Youtube

Obvious artifacts to repair, but this served as a stress test on my new PC build... the 4k frame tracked across a 50-megapixel plate covering the entire upper Earth [ 20 GB of data to wade through ]


Adam Hurcewicz
QUOTE (Sean @ Aug 12 2017, 11:55 PM) *
12.5 megapixel portrait with image processing...


Hi Sean.

Can I use your photo in my kitchen? Here is a visualization of how it will look.

Adam

[image attachment]