Fr. 188.00

Support Vector Machines for Pattern Classification

English · Paperback / Softback

Shipping usually within 1 to 2 weeks (title will be printed to order)

Description

A guide to the use of SVMs in pattern classification, including a rigorous performance comparison of classifiers and regressors. The book presents architectures for multiclass classification and function approximation problems, as well as evaluation criteria for classifiers and regressors.

Features:
- Clarifies the characteristics of two-class SVMs
- Discusses kernel methods for improving the generalization ability of neural networks and fuzzy systems
- Contains ample illustrations and examples
- Includes performance evaluation using publicly available data sets
- Examines Mahalanobis kernels, the empirical feature space, and the effect of model selection by cross-validation
- Covers sparse SVMs, learning using privileged information, semi-supervised learning, multiple classifier systems, and multiple kernel learning
- Explores incremental training based on batch training, active-set training methods, and decomposition techniques for linear programming SVMs
- Discusses variable selection for support vector regressors
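To give a flavour of the two-class SVMs whose characteristics the book clarifies: a linear SVM finds the weight vector w and bias b minimizing the primal hinge-loss objective 0.5·||w||² + C·Σ max(0, 1 − yᵢ(w·xᵢ + b)). The pure-Python routine below is a minimal illustrative sketch of that objective trained by sub-gradient descent; it is not code from the book, and all function names and parameter defaults are our own.

```python
import random

def train_linear_svm(X, y, C=1.0, lr=0.01, epochs=200, seed=0):
    """Minimal two-class linear SVM: sub-gradient descent on the primal
    hinge-loss objective 0.5*||w||^2 + C * sum(max(0, 1 - y*(w.x + b)))."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    w = [0.0] * d
    b = 0.0
    for _ in range(epochs):
        order = list(range(n))
        rng.shuffle(order)
        for i in order:
            margin = y[i] * (sum(w[j] * X[i][j] for j in range(d)) + b)
            # sub-gradient of the regularization term (scaled per sample)
            for j in range(d):
                w[j] -= lr * w[j] / n
            if margin < 1:  # point violates the margin: hinge term is active
                for j in range(d):
                    w[j] += lr * C * y[i] * X[i][j]
                b += lr * C * y[i]
    return w, b

def predict(w, b, x):
    """Classify x by the sign of the decision function w.x + b."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1
```

For example, training on two linearly separable clusters labelled +1 and −1 yields a separating hyperplane whose sign recovers the cluster of a new point. Kernel SVMs, which the book treats in depth, replace the inner product w·x with a kernel evaluation over support vectors.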

List of contents

Two-Class Support Vector Machines.- Multiclass Support Vector Machines.- Variants of Support Vector Machines.- Training Methods.- Kernel-Based Methods.- Feature Selection and Extraction.- Clustering.- Maximum-Margin Multilayer Neural Networks.- Maximum-Margin Fuzzy Classifiers.- Function Approximation.

Additional text

From the reviews:

"This broad and deep … book is organized around the highly significant concept of pattern recognition by support vector machines (SVMs). … The book is praxis and application oriented but with strong theoretical backing and support. Many … details are presented and discussed, thereby making the SVM both an easy-to-understand learning machine and a more likable data modeling (mining) tool. Shigeo Abe has produced the book that will become the standard … . I like it and therefore highly recommend this book … ." (Vojislav Kecman, SIAM Review, Vol. 48 (2), 2006)
