Fr. 199.00

LEARNING FROM DATA - CONCEPTS, THEORY, AND METHODS

English · Hardback

Shipping usually within 3 to 5 weeks (title will be specially ordered)

Description


About the Author

Vladimir Cherkassky, PhD, is Professor of Electrical and Computer Engineering at the University of Minnesota. He is internationally known for his research on neural networks and statistical learning. Filip Mulier, PhD, has worked in the software field for the last twelve years, part of which has been spent researching, developing, and applying advanced statistical and machine learning methods. He currently holds a project management position.

Back Cover

An interdisciplinary framework for learning methodologies, now revised and updated.

Learning from Data provides a unified treatment of the principles and methods for learning dependencies from data. It establishes a general conceptual framework in which various learning methods from statistics, neural networks, and pattern recognition can be applied, showing that a few fundamental principles underlie most new methods being proposed today in statistics, engineering, and computer science.

Since the first edition was published, the field of data-driven learning has experienced rapid growth. This Second Edition covers these developments with a completely revised chapter on support vector machines, a new chapter on noninductive inference and alternative learning formulations, and an in-depth discussion of the VC theoretical approach as it relates to other paradigms.

Complete with over one hundred illustrations, case studies, examples, and chapter summaries, Learning from Data accommodates both beginning and advanced graduate students in engineering, computer science, and statistics. It is also indispensable for researchers and practitioners in these areas who must understand the principles and methods for learning dependencies from data.

Summary

An interdisciplinary framework for learning methodologies, covering statistics, neural networks, and fuzzy logic, Learning from Data provides a unified treatment of the principles and methods for learning dependencies from data.

Table of Contents

PREFACE
NOTATION
1 Introduction
1.1 Learning and Statistical Estimation
1.2 Statistical Dependency and Causality
1.3 Characterization of Variables
1.4 Characterization of Uncertainty
1.5 Predictive Learning versus Other Data Analytical Methodologies
2 Problem Statement, Classical Approaches, and Adaptive Learning
2.1 Formulation of the Learning Problem
2.1.1 Objective of Learning
2.1.2 Common Learning Tasks
2.1.3 Scope of the Learning Problem Formulation
2.2 Classical Approaches
2.2.1 Density Estimation
2.2.2 Classification
2.2.3 Regression
2.2.4 Solving Problems with Finite Data
2.2.5 Nonparametric Methods
2.2.6 Stochastic Approximation
2.3 Adaptive Learning: Concepts and Inductive Principles
2.3.1 Philosophy, Major Concepts, and Issues
2.3.2 A Priori Knowledge and Model Complexity
2.3.3 Inductive Principles
2.3.4 Alternative Learning Formulations
2.4 Summary
3 Regularization Framework
3.1 Curse and Complexity of Dimensionality
3.2 Function Approximation and Characterization of Complexity
3.3 Penalization
3.3.1 Parametric Penalties
3.3.2 Nonparametric Penalties
3.4 Model Selection (Complexity Control)
3.4.1 Analytical Model Selection Criteria
3.4.2 Model Selection via Resampling
3.4.3 Bias-Variance Tradeoff
3.4.4 Example of Model Selection
3.4.5 Function Approximation versus Predictive Learning
3.5 Summary
4 Statistical Learning Theory
4.1 Conditions for Consistency and Convergence of ERM
4.2 Growth Function and VC Dimension
4.2.1 VC Dimension for Classification and Regression Problems
4.2.2 Examples of Calculating VC Dimension
4.3 Bounds on t...

Customer reviews

No reviews have been written for this item yet.
