Fr. 109.00

Estimation of Mutual Information

English · Hardcover

Publication date: 01.01.2025

Description

This book presents the mutual information (MI) estimation methods recently proposed by the author and published in a number of major journals. It covers two types of applications: learning a forest structure from multivariate data and identifying independent variables (independent component analysis). MI between a pair of random variables is mathematically defined in information theory: it measures how dependent the two variables are, takes nonnegative values, and is zero if and only if the variables are independent. Knowing the value of MI between two variables is often necessary in machine learning, statistical data analysis, and various sciences, including physics, psychology, and economics. However, the true value of MI is not directly available and can only be estimated from data. The essential difference between the estimators in this book and others is that both consistency and independence testing are proved for the estimators proposed by the author: an estimator satisfies consistency when the estimate converges to the true value as the sample size grows, and satisfies independence testing when, under independence, the MI estimate is zero with probability one as the sample size grows. Thus far, no other MI estimators have satisfied both properties at once.
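To make the estimation problem concrete: the simplest baseline for discrete variables is the plug-in estimator, which substitutes empirical frequencies into the MI formula. This is a minimal sketch of that standard baseline, not the author's proposed estimators (which additionally guarantee consistency and independence testing); the function name `plugin_mi` is our own.

```python
import math
from collections import Counter

def plugin_mi(xs, ys):
    """Plug-in estimate of mutual information (in nats) between two
    paired discrete samples, using empirical joint and marginal
    frequencies in place of the true probabilities."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # joint counts
    px = Counter(xs)             # marginal counts of X
    py = Counter(ys)             # marginal counts of Y
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint * log( p(x,y) / (p(x) p(y)) ), rewritten with counts
        mi += p_joint * math.log(p_joint * n * n / (px[x] * py[y]))
    return mi

# Perfectly dependent pair: MI equals the entropy of X, here log 2.
dependent = plugin_mi([0, 0, 1, 1], [0, 0, 1, 1])
# Independent-looking pair: the empirical joint factorizes, so MI is 0.
independent = plugin_mi([0, 0, 1, 1], [0, 1, 0, 1])
```

Note the well-known drawback that motivates the book's theme: on finite samples from truly independent variables, the plug-in estimate is generally positive rather than exactly zero, so it does not satisfy the independence-testing property described above.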

Table of Contents

Chapter 1: Introduction
Chapter 2: Estimation of Mutual Information for Discrete Variables
Chapter 3: Estimation of Mutual Information for Continuous Variables
Chapter 4: Estimation of Mutual Information for High-dimensional Variables
Chapter 5: Application to Causal Discovery: LiNGAM and ICA
Chapter 6: Concluding Remarks

About the Author

Joe Suzuki, Osaka University

