Getting straight and smooth horizons in navcam panoramas |
Oct 4 2006, 02:55 PM
Post
#1
|
|
Member Group: Members Posts: 713 Joined: 30-March 05 Member No.: 223 |
Hi all,
I first thought that the problem of uneven horizons in navcam stitches must have been discussed a lot previously, but to my surprise a forum search of the Tech & Imagery board with the keyword "horizon" did not yield many relevant results.

So far, for Autostitch one "solution" seems to be to estimate the correct "greek" orientation parameters (phi, psi, ...) by trial and error. However, I would like to know if there is a better, more deterministic way. I also played with PTGui/Hugin and manual control points on the horizon, but this seemed even more cumbersome than getting the right orientation in Autostitch.

The other problem is the "edges"/"bumps" along the horizon, exactly at the transition zones between individual panorama frames. Now, I am not an expert in the stitching part of the image processing (I just use Autostitch, which does a sufficiently good job except for the horizon problem), so the question goes to the real stitching gurus (Nico, Tman and others?): how do you cope with the horizon? :-)

Any recommendations for other pano programs that do better at flattening the horizon? One idea would be to use some JPL metadata about the actual rover orientation. I understand that Michael's MMB program does something like this, but I was unable to exploit this horizon information for deriving orientation parameters for external panorama programs.

Also, does anyone know if there is already a specification of the lens geometry of the MER navcam available in a form usable by a panorama program like Hugin? This would certainly also help to yield undistorted horizons.

Maybe this thread could serve as a collection of useful tips & hints specifically with respect to the "horizon problem".
|
|
Oct 4 2006, 03:22 PM
Post
#2
|
|
Senior Member Group: Members Posts: 1619 Joined: 12-February 06 From: Bergerac - FR Member No.: 678 |
I just posted part of an answer in the "Cabo Verde" topic.
|
|
Oct 4 2006, 03:31 PM
Post
#3
|
|
Senior Member Group: Moderator Posts: 4279 Joined: 19-April 05 From: .br at .es Member No.: 253 |
In addition, I remember posting something similar in the Autostitch thread on this same sub-forum about a year ago.
Link: http://www.unmannedspaceflight.com/index.p...ost&p=26418 (but using the first image instead of the second as reference). |
|
|
Oct 4 2006, 03:38 PM
Post
#4
|
|
Member Group: Members Posts: 713 Joined: 30-March 05 Member No.: 223 |
QUOTE: "In addition, I remember to have posted something similar in the Autostitch thread on this same sub-forum. Link: http://www.unmannedspaceflight.com/index.p...ost&p=26418"

Thanks a lot, Tesheiner, very useful information! (Strange that this did not turn up when I searched for "horizon straight"...)
|
|
Oct 4 2006, 03:41 PM
Post
#5
|
|
Member Group: Members Posts: 713 Joined: 30-March 05 Member No.: 223 |
|
|
|
Oct 4 2006, 06:50 PM
Post
#6
|
|
Member Group: Members Posts: 656 Joined: 20-April 05 From: League City, Texas Member No.: 285 |
Do you think it's also possible to directly use the MMB metadata about rover orientation for the phi, psi... settings? (Michael?)

The difficulty that I see, at least if you're using Autostitch, is that you don't implicitly know where the 0-degree azimuth and elevation position is going to be in the resulting stitched image. Converting the rover orientation quaternion to Euler angles would be fairly simple (if you already happen to have the right code libraries), but then there's no nice way to apply it to the stitched image. This would be simpler in Panotools (PTGui), as you have more control over which image goes where in the resulting mosaic (but PTGui is not as nice to work with). Completely automating the process is non-trivial, something I've gone to a lot of trouble with in AlgorimancerPG, as have Indian3000 and mhoward and others.

Worse yet, the rover orientation quaternion is only accurate to within about 1.5 degrees. In many ways, you may well be better off doing an initial render in Autostitch and using the horizons in that image to estimate the angle corrections for a subsequent render.
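The quaternion-to-Euler conversion mentioned above can be sketched in a few lines. This assumes a unit quaternion in (w, x, y, z) order and the common aerospace Z-Y-X (yaw/pitch/roll) convention; the actual convention used in the MER telemetry would need to be checked against the PDS documentation.

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to (yaw, pitch, roll) in
    degrees, using the aerospace Z-Y-X rotation convention."""
    # roll: rotation about the x-axis
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # pitch: rotation about the y-axis; clamp to avoid asin domain errors
    s = max(-1.0, min(1.0, 2 * (w * y - z * x)))
    pitch = math.asin(s)
    # yaw: rotation about the z-axis
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (yaw, pitch, roll))

# Identity quaternion -> no rotation at all
print(quaternion_to_euler(1.0, 0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0)
```

The resulting yaw could then be used to anchor the 0-degree azimuth when laying out the mosaic in a tool that exposes per-image angles.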
|
|
Oct 4 2006, 07:08 PM
Post
#7
|
|
Member Group: Members Posts: 239 Joined: 20-April 05 From: Bruxelles, Belgium Member No.: 278 |
I have also had some problems with the telemetry of these last sols.
I think the estimate of the orientation quaternion has been bad these last sols. I checked my program for a bug by recreating old panoramas, and those came out correct; so perhaps it is still a bug in my program, I do not know yet. If it is the rover telemetry that is not good, I will need to find a way of levelling the horizon automatically.
|
|
Oct 4 2006, 07:42 PM
Post
#8
|
|
Senior Member Group: Moderator Posts: 3431 Joined: 11-August 04 From: USA Member No.: 98 |
QUOTE: "Ant, thank you very much for the hint with the panorama export feature ... Do you think it's also possible to directly use the MMB metadata about rover orientation for the phi, psi... settings? (Michael?)"

I've given up on Autostitch, so I'm not pursuing that direction. Of course, what tool you use is up to you, but Autostitch just doesn't allow enough control for my taste - that's why I added the PTGui/Hugin export to the pan export feature in MMB. In MMB 1.5, the exported mmb.pts file has the camera pointings from tracking via the MMB metadata. Sometimes, as Indian3000 has discovered, the rover quaternion is not perfect, so some manual tweaking of the pan may be needed; usually no more than a couple of degrees, though. Actually, now that Indian3000 is spitting out pans left and right, I probably won't bother so much.
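For manual tweaks of the kind described above, the per-image angles live on the `i` lines of a panotools-style script. As a hedged sketch (the exact field layout of MMB's exported mmb.pts should be checked against the file itself), a small helper could apply a uniform roll correction of a couple of degrees to every image line:

```python
import re

def add_roll_offset(script_text, delta_deg):
    """Add delta_deg to the r (roll) value of every panotools 'i' line.
    Assumes the common 'i ... y<yaw> p<pitch> r<roll> ...' script syntax."""
    def bump(match):
        return "r%g" % (float(match.group(1)) + delta_deg)
    out = []
    for line in script_text.splitlines():
        if line.startswith("i "):
            # only the first r<number> token on the line is the roll field
            line = re.sub(r"r(-?\d+(?:\.\d+)?)", bump, line, count=1)
        out.append(line)
    return "\n".join(out)

# hypothetical image line, roll tilted by the 3.3-degree correction
sample = 'i w1024 h1024 f0 v45 y0 p0 r0 n"navcam_01.png"'
print(add_roll_offset(sample, 3.3))
```

The filename and field values here are made up for illustration; the point is only that a systematic quaternion error can be compensated with one scripted offset rather than by re-dragging each frame.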
|
|
Oct 4 2006, 08:41 PM
Post
#9
|
||
Senior Member Group: Moderator Posts: 4279 Joined: 19-April 05 From: .br at .es Member No.: 253 |
I'm quite happy with Autostitch; sure, the lack of control is a negative point, but the results usually are very nice, especially for pancams. But let's do an example with a navcam mosaic.
Here is the 5x1 navcam mosaic taken after driving on sol 958. And this is the process I almost always follow to obtain such a mosaic:

1) Process all images with MichaelT's antivignetting tool.
2) Calculate the rotation angle (psi) to be applied to the first image in order to get a level horizon:
2.1) Use IrfanView (or any other tool) to measure the sides (dx, dy) of a rectangle whose opposite corners are coincident with the horizon. In the above example, dx=820, dy=47.
2.2) The rotation angle is psi=atan(dy/dx). Psi=3.3º (approx.)
3) Execute Autostitch with the following non-default parameters:
- Scale: 25%
- Gain compensation: yes
- SIFT Image Size: Scale: 100%
- RANSAC Parameters: Max Iterations: 1500
- Auto Straighten: no
- Phi: -15º (*)
- Psi: 3.3º
4) If the result is acceptable, change the scale to 100% and redo the stitch.

(*) Note: Use -15º for navcam mosaics and -3º for pancam.
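The rotation angle in step 2 follows directly from the measured rectangle; as a quick check of the worked numbers:

```python
import math

def horizon_tilt_deg(dx, dy):
    """Rotation angle psi needed to level the horizon, given the width (dx)
    and height (dy) of a rectangle spanning two points on the horizon."""
    return math.degrees(math.atan2(dy, dx))

# Values measured from the sol 958 navcam frame described above
psi = horizon_tilt_deg(820, 47)
print(round(psi, 1))  # 3.3
```

Using atan2 instead of atan(dy/dx) also keeps the sign right when the horizon slopes the other way (dy negative).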
|
|
||
Oct 4 2006, 09:04 PM
Post
#10
|
|
Member Group: Members Posts: 713 Joined: 30-March 05 Member No.: 223 |
Thanks for the cookbook recipe!
My Autostitch workflow is quite similar... it works almost perfectly with pancam mosaics and often reasonably well with navcam frames, especially if the SIFT matching algorithm has enough features on the horizon to catch on (as in this example). More problematic are featureless, flat horizons such as in many Oppy pans of the Meridiani plains. Another problem with Autostitch I noticed recently is that the more aggressive SIFT and RANSAC parameters introduce quite a bit of noticeable blurring in the image. So I am starting to wonder if it may be worth investing more time in working with "advanced" pano programs like PTAssembler, especially if it would be possible to use additional metadata like the exact MER camera lens data and special matching methods for flat, featureless horizons...
|
|
Oct 4 2006, 09:59 PM
Post
#11
|
|
Member Group: Admin Posts: 468 Joined: 11-February 04 From: USA Member No.: 21 |
Good thread!
I've been using PTAssembler since the beginning of the missions, and I've been fairly happy with how it handles pancam images. Its integration with autopano and enblend is particularly handy. But I've never been happy with how it handles navcams, mainly because of the lens distortion. Optimizing the lens characteristics (the a, b and c parameters of panotools and its various front ends) does an awful job of actually characterizing the lens. It usually settles into some strange distortion of the images which just happens to line up the control points, but which is clearly not the actual lens characteristic. So... for those out there using PTGui, Hugin or PTAssembler: which a, b, c values have you found to work best for the navcam?
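For reference, the a, b, c parameters being asked about enter panotools' radial correction polynomial. A minimal sketch of the model, with the fourth coefficient d chosen as 1 - (a + b + c) so that the image edge maps to itself, per the usual panotools convention:

```python
def radial_correction(r, a, b, c):
    """Panotools-style radial lens correction.
    r is the radius normalized so that r = 1.0 at the smaller half-dimension
    of the image; the source radius is
        r_src = (a*r^3 + b*r^2 + c*r + d) * r,   with d = 1 - (a + b + c).
    """
    d = 1.0 - (a + b + c)
    return (a * r**3 + b * r**2 + c * r + d) * r

# With a = b = c = 0 the mapping is the identity (no distortion correction)
print(radial_correction(0.5, 0.0, 0.0, 0.0))  # 0.5
```

Because the optimizer is fitting a cubic polynomial to whatever control points it is given, it can easily find an a, b, c combination that lines up the points without resembling the true navcam distortion, which matches the behavior described above.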
|
|