Fr. 130.00

Numerical Optimization with Computational Errors

English · Hardback

Shipping usually within 6 to 7 weeks

Description

This book studies approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space, where the algorithms are examined taking computational errors into account. The author shows that the algorithms generate a good approximate solution if the computational errors are bounded from above by a small positive constant. Known computational errors are examined with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative.

This monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods, and Newton's method.
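
To illustrate the flavor of the phenomenon the book studies (this sketch is not from the book, and all names and constants here are chosen for illustration only), consider gradient descent on a simple quadratic in which every gradient evaluation is corrupted by an error of norm at most delta. The iterates then settle into a neighborhood of the minimizer whose radius is proportional to delta, rather than converging exactly. A minimal Python sketch:

    import numpy as np

    # Minimize f(x) = 0.5 * ||x - x_star||^2, whose exact gradient is x - x_star.
    # Each gradient evaluation is perturbed by an error of norm at most delta.
    rng = np.random.default_rng(0)
    x_star = np.array([1.0, -2.0, 0.5])   # minimizer (known here only for illustration)
    delta = 1e-3                          # upper bound on the computational error
    step = 0.1                            # constant step size

    x = np.zeros(3)
    for _ in range(500):
        grad = x - x_star                     # exact gradient
        err = rng.normal(size=3)
        err *= delta / np.linalg.norm(err)    # error of norm exactly delta
        x = x - step * (grad + err)          # inexact gradient step

    # The final iterate lies within O(delta) of x_star, not at x_star itself:
    print(np.linalg.norm(x - x_star))        # roughly 1e-3, not 0

The point, developed in the book for far more general algorithms and spaces, is that the achievable accuracy is governed by the size of the computational errors.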


List of contents

1. Introduction.- 2. Subgradient Projection Algorithm.- 3. The Mirror Descent Algorithm.- 4. Gradient Algorithm with a Smooth Objective Function.- 5. An Extension of the Gradient Algorithm.- 6. Weiszfeld's Method.- 7. The Extragradient Method for Convex Optimization.- 8. A Projected Subgradient Method for Nonsmooth Problems.- 9. Proximal Point Method in Hilbert Spaces.- 10. Proximal Point Methods in Metric Spaces.- 11. Maximal Monotone Operators and the Proximal Point Algorithm.- 12. The Extragradient Method for Solving Variational Inequalities.- 13. A Common Solution of a Family of Variational Inequalities.- 14. Continuous Subgradient Method.- 15. Penalty Methods.- 16. Newton's Method.- References.- Index.

Additional text

“The author studies the approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space. Researchers and students will find this book instructive and informative. The book contains 16 chapters … .” (Hans Benker, zbMATH 1347.65112, 2016)
