Fr. 169.00

Information Theory and Statistical Learning

English · Paperback

Usually ships within 6 to 7 weeks

Description

"Information Theory and Statistical Learning" presents theoretical and practical results about information theoretic methods used in the context of statistical learning.
The book will present a comprehensive overview of the large range of different methods that have been developed in a multitude of contexts. Each chapter is written by an expert in the field. The book is intended for an interdisciplinary readership working in machine learning, applied statistics, artificial intelligence, biostatistics, computational biology, bioinformatics, web mining or related disciplines.
Advance Praise for "Information Theory and Statistical Learning":
"A new epoch has arrived for information sciences to integrate various disciplines such as information theory, machine learning, statistical inference, data mining, model selection etc. I am enthusiastic about recommending the present book to researchers and students, because it summarizes most of these new emerging subjects and methods, which are otherwise scattered in many places." Shun-ichi Amari, RIKEN Brain Science Institute, Professor-Emeritus at the University of Tokyo

Table of Contents

Algorithmic Probability: Theory and Applications
Model Selection and Testing by the MDL Principle
Normalized Information Distance
The Application of Data Compression-Based Distances to Biological Sequences
MIC: Mutual Information Based Hierarchical Clustering
A Hybrid Genetic Algorithm for Feature Selection Based on Mutual Information
Information Approach to Blind Source Separation and Deconvolution
Causality in Time Series: Its Detection and Quantification by Means of Information Theory
Information Theoretic Learning and Kernel Methods
Information-Theoretic Causal Power
Information Flows in Complex Networks
Models of Information Processing in the Sensorimotor Loop
Information Divergence Geometry and the Application to Statistical Machine Learning
Model Selection and Information Criterion
Extreme Physical Information as a Principle of Universal Stability
Entropy and Cloning Methods for Combinatorial Optimization, Sampling and Counting Using the Gibbs Sampler

About the Authors

Frank Emmert-Streib studied physics at the University of Siegen (Germany) and received his Ph.D. in Theoretical Physics from the University of Bremen (Germany). He was a postdoctoral research associate in the Department of Bioinformatics at the Stowers Institute for Medical Research (Kansas City, USA) and a Senior Fellow in the Department of Biostatistics and the Department of Genome Sciences at the University of Washington (Seattle, USA). Currently, he is a Lecturer/Assistant Professor at Queen's University Belfast, where he leads the Computational Biology and Machine Learning Lab at the Center for Cancer Research and Cell Biology (CCRCB). His research interests lie in computational biology, machine learning, and biostatistics, in particular the development and application of statistical and machine learning methods for the analysis of high-throughput data from genomics and genetics experiments.

Matthias Dehmer studied mathematics at the University of Siegen (Germany) and received his Ph.D. in computer science from the Technical University of Darmstadt (Germany). Afterwards, he was a research fellow at the Vienna Bio Center (Austria), the Vienna University of Technology, and the University of Coimbra (Portugal). Currently, he is a Professor at UMIT - The Health and Life Sciences University (Austria). His research interests are in bioinformatics, cancer analysis, chemical graph theory, systems biology, complex networks, complexity, statistics, and information theory. In particular, he works on machine learning-based approaches for designing new data analysis methods to solve problems in computational biology and medicinal chemistry.
