Fr. 77.00

Textured 3D Models - Automatic Reconstruction of 3D Models from a Sequence of Calibrated Images

English, German · Paperback / Softback

Shipping usually within 2 to 3 weeks (title will be printed to order)

Description


A vision-based 3D scene analysis system is described that can automatically model complex real-world scenes, such as old buildings, bridges, and monuments, from a sequence of calibrated images. Input to the system is a sequence of calibrated stereoscopic images, which can be taken with a hand-held camera: the camera is moved throughout the scene and a long sequence of closely spaced views is recorded. A multi-view algorithm links corresponding points across the image sequence. A 3D model is then reconstructed by triangulation directly from the image sequence, and 3D surface measurements from different viewpoints are fused into a consistent 3D scene model using a Kalman filter. The surface geometry of each scene object is approximated by a triangular mesh, and the surface texture is stored in a texture map. From the textured 3D models, realistic-looking image sequences from arbitrary viewpoints can be rendered for many applications. We demonstrate the successful application of the approach to several outdoor image sequences of famous Egyptian monuments, within a framework that aims to electronically document Egypt's cultural and natural heritage.
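The Kalman-filter fusion step mentioned above can be illustrated in miniature. The sketch below (not the book's actual implementation; function name, isotropic noise model, and identity measurement matrix are simplifying assumptions) fuses repeated triangulated 3D measurements of a single surface point, observed from different viewpoints, into one estimate with a shrinking covariance:

```python
import numpy as np

def fuse_point_kalman(measurements, meas_vars):
    """Fuse noisy 3D measurements of one static surface point.

    measurements : list of 3-vectors (triangulated positions per view)
    meas_vars    : list of scalar measurement variances (isotropic noise,
                   an assumption made here for brevity)
    Returns the fused position estimate and its 3x3 covariance.
    """
    x = np.asarray(measurements[0], dtype=float)  # state: point position
    P = np.eye(3) * meas_vars[0]                  # initial covariance
    for z, r in zip(measurements[1:], meas_vars[1:]):
        R = np.eye(3) * r                         # measurement noise cov.
        K = P @ np.linalg.inv(P + R)              # Kalman gain (H = I)
        x = x + K @ (np.asarray(z, dtype=float) - x)  # update estimate
        P = (np.eye(3) - K) @ P                   # update covariance
    return x, P

# Two equally reliable observations of the same point average out:
x, P = fuse_point_kalman([[0, 0, 0], [2, 2, 2]], [1.0, 1.0])
```

Because the point is static, there is no prediction step; each new view only tightens the estimate, which is why measurements from many closely spaced views converge to a consistent model.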

Product details

Authors Hatem Ibrahim Mahmoud Rashwan
Publisher LAP Lambert Academic Publishing
 
Languages English, German
Product format Paperback / Softback
Released 17.11.2011
 
EAN 9783846545676
ISBN 978-3-8465-4567-6
No. of pages 132
Subjects Guides
Natural sciences, medicine, IT, technology > IT, data processing > Miscellaneous
