Fr. 165.60

Bayesian Modeling of Uncertainty in Low-Level Vision

English · Hardback

Shipping usually within 3 to 5 weeks (title will be specially ordered)

Description


Vision has to deal with uncertainty. The sensors are noisy, the prior knowledge is uncertain or inaccurate, and the problems of recovering scene information from images are often ill-posed or underconstrained. This research monograph, based on Richard Szeliski's Ph.D. dissertation at Carnegie Mellon University, presents a Bayesian model for representing and processing uncertainty in low-level vision. Probabilistic models have recently been proposed and used in vision, and Szeliski's method has several distinguishing features that make this monograph important and attractive. First, he presents a systematic Bayesian probabilistic estimation framework in which the prior model, the sensor model, and the posterior model can each be defined and computed. Second, his method explicitly represents and computes not only the best estimates but also the level of uncertainty of those estimates, using second-order statistics, i.e., the variance and covariance. Third, the algorithms developed are computationally tractable for dense fields, such as depth maps constructed from stereo or range-finder data, rather than just sparse data sets. Finally, Szeliski demonstrates successful applications of the method to several real-world problems, including the generation of fractal surfaces, motion estimation without correspondence using sparse range data, and incremental depth from motion.
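To make the framework the description sketches concrete: in the simplest scalar Gaussian case, fusing a prior estimate with a noisy measurement yields a posterior whose mean is the MAP estimate and whose variance quantifies the remaining uncertainty. The Python sketch below is illustrative only and is not taken from the book; the function name and the numbers are assumptions.

# A minimal sketch (not from the book) of Bayesian fusion:
# a Gaussian prior combined with a noisy Gaussian measurement
# gives a posterior whose mean is the MAP estimate and whose
# variance is the remaining uncertainty. Names and values are
# illustrative assumptions.

def fuse_gaussian(prior_mean, prior_var, measurement, sensor_var):
    """Posterior of a scalar state under Gaussian prior and sensor models."""
    gain = prior_var / (prior_var + sensor_var)   # Kalman-style gain
    post_mean = prior_mean + gain * (measurement - prior_mean)
    post_var = (1.0 - gain) * prior_var           # uncertainty shrinks
    return post_mean, post_var

# Example: an uncertain prior depth of 2.0 m fused with a stereo
# reading of 2.3 m; the posterior leans toward the more certain source.
mean, var = fuse_gaussian(prior_mean=2.0, prior_var=0.5,
                          measurement=2.3, sensor_var=0.1)
print(mean, var)  # ~2.25, ~0.083

The same update, applied recursively over an image sequence, is the idea behind the incremental (Kalman-filter-based) depth-from-motion algorithms covered in chapter 7 of the table of contents below.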

List of contents

1 Introduction
1.1 Modeling uncertainty in low-level vision
1.2 Previous work
1.3 Overview of results
1.4 Organization
2 Representations for low-level vision
2.1 Visible surface representations
2.2 Visible surface algorithms
2.3 Multiresolution representations
2.4 Discontinuities
2.5 Alternative representations
3 Bayesian models and Markov Random Fields
3.1 Bayesian models
3.2 Markov Random Fields
3.3 Using probabilistic models
4 Prior models
4.1 Regularization and fractal priors
4.2 Generating constrained fractals
4.3 Relative depth representations (reprise)
4.4 Mechanical vs. probabilistic models
5 Sensor models
5.1 Sparse data: spring models
5.2 Sparse data: force field models
5.3 Dense data: optical flow
5.4 Dense data: image intensities
6 Posterior estimates
6.1 MAP estimation
6.2 Uncertainty estimation
6.3 Regularization parameter estimation
6.4 Motion estimation without correspondence
7 Incremental algorithms for depth-from-motion
7.1 Kalman filtering
7.2 Incremental iconic depth-from-motion
7.3 Joint modeling of depth and intensity
8 Conclusions
8.1 Summary
8.2 Future research
A Finite element implementation
B Fourier analysis
B.1 Filtering behavior of regularization
B.2 Fourier analysis of the posterior distribution
B.3 Analysis of gradient descent
B.4 Finite element solution
B.5 Fourier analysis of multigrid relaxation
C Analysis of optical flow computation
D Analysis of parameter estimation
D.1 Computing marginal distributions
D.2 Bayesian estimation equations
D.3 Likelihood of observations
Table of symbols

Product details

Authors Richard Szeliski
Publisher Springer, Berlin
 
Languages English
Product format Hardback
Released 04.03.2011
 
EAN 9780792390398
ISBN 978-0-7923-9039-8
No. of pages 198
Weight 1080 g
Illustrations XX, 198 p.
Series The Springer International Series in Engineering and Computer Science
Subject Natural sciences, medicine, IT, technology > IT, data processing > IT
