Fr. 116.00

Information Theory - From Coding to Learning

English · Hardback

Shipping usually within 1 to 3 weeks (not available at short notice)

Description

This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning. It includes over 210 student exercises, emphasising practical applications in statistics, machine learning and modern communication theory, and is accompanied by online instructor solutions.

List of contents

Part I. Information Measures: 1. Entropy; 2. Divergence; 3. Mutual information; 4. Variational characterizations and continuity of information measures; 5. Extremization of mutual information: capacity saddle point; 6. Tensorization and information rates; 7. f-divergences; 8. Entropy method in combinatorics and geometry; 9. Random number generators
Part II. Lossless Data Compression: 10. Variable-length compression; 11. Fixed-length compression and Slepian-Wolf theorem; 12. Entropy of ergodic processes; 13. Universal compression
Part III. Hypothesis Testing and Large Deviations: 14. Neyman-Pearson lemma; 15. Information projection and large deviations; 16. Hypothesis testing: error exponents
Part IV. Channel Coding: 17. Error correcting codes; 18. Random and maximal coding; 19. Channel capacity; 20. Channels with input constraints. Gaussian channels; 21. Capacity per unit cost; 22. Strong converse. Channel dispersion. Error exponents. Finite blocklength; 23. Channel coding with feedback
Part V. Rate-Distortion Theory and Metric Entropy: 24. Rate-distortion theory; 25. Rate distortion: achievability bounds; 26. Evaluating rate-distortion function. Lossy source-channel separation; 27. Metric entropy
Part VI: 28. Basics of statistical decision theory; 29. Classical large-sample asymptotics; 30. Mutual information method; 31. Lower bounds via reduction to hypothesis testing; 32. Entropic bounds for statistical estimation; 33. Strong data processing inequality.
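As a taste of the Part I material, here is a minimal sketch of the three core information measures (entropy, KL divergence, mutual information) using the standard textbook definitions. This is illustrative only, not code from the book; the function names are my own.

```python
import math

def entropy(p):
    # Shannon entropy H(X) in bits for a discrete probability vector p
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    # Relative entropy D(P||Q) in bits; assumes supp(P) is contained in supp(Q)
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

def mutual_information(joint):
    # I(X;Y) = D(P_XY || P_X x P_Y) for a joint pmf given as a 2D list
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# A fair coin carries exactly 1 bit of entropy
print(entropy([0.5, 0.5]))                               # 1.0
# Independent variables have zero mutual information
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

Note that mutual information is just a KL divergence between the joint distribution and the product of its marginals, which is how the book's Part I organises these quantities.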

About the author

Yury Polyanskiy is a Professor of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, with a focus on information theory, statistical machine learning, error-correcting codes, wireless communication, and fault tolerance. He is the recipient of the 2020 IEEE Information Theory Society James Massey Award for outstanding achievement in research and teaching in Information Theory. Yihong Wu is a Professor of Statistics and Data Science at Yale University, focusing on the theoretical and algorithmic aspects of high-dimensional statistics, information theory, and optimization. He is the recipient of the 2018 Sloan Research Fellowship in Mathematics.
