Fr. 140.00

Information-Theoretic Methods in Data Science

English · Hardback

Shipping usually within 3 to 5 weeks

Description

The first unified treatment of the interface between information theory and emerging topics in data science, written in a clear, tutorial style. Covering topics such as data acquisition, representation, analysis, and communication, it is ideal for graduate students and researchers in information theory, signal processing, and machine learning.

List of contents

1. Introduction (Miguel Rodrigues, Stark Draper, Waheed Bajwa and Yonina Eldar)
2. An information-theoretic approach to analog-to-digital compression (Alon Kipnis, Yonina Eldar and Andrea Goldsmith)
3. Compressed sensing via compression codes (Shirin Jalali and Vincent Poor)
4. Information-theoretic bounds on sketching (Mert Pilanci)
5. Sample complexity bounds for dictionary learning from vector- and tensor-valued data (Zahra Shakeri, Anand Sarwate and Waheed Bajwa)
6. Uncertainty relations and sparse signal recovery (Erwin Riegler and Helmut Bölcskei)
7. Understanding phase transitions via mutual information and MMSE (Galen Reeves and Henry Pfister)
8. Computing choice: learning distributions over permutations (Devavrat Shah)
9. Universal clustering (Ravi Raman and Lav Varshney)
10. Information-theoretic stability and generalization (Maxim Raginsky, Alexander Rakhlin and Aolin Xu)
11. Information bottleneck and representation learning (Pablo Piantanida and Leonardo Rey Vega)
12. Fundamental limits in model selection for modern data analysis (Jie Ding, Yuhong Yang and Vahid Tarokh)
13. Statistical problems with planted structures: information-theoretic and computational limits (Yihong Wu and Jiaming Xu)
14. Distributed statistical inference with compressed data (Wenwen Zhao and Lifeng Lai)
15. Network functional compression (Soheil Feizi and Muriel Médard)
16. An introductory guide to Fano's inequality with applications in statistical estimation (Jonathan Scarlett and Volkan Cevher)

About the authors

Miguel R. D. Rodrigues is a Reader in Information Theory and Processing in the Department of Electronic and Electrical Engineering, University College London, and a Faculty Fellow at the Turing Institute, London. Yonina C. Eldar is a Professor in the Faculty of Mathematics and Computer Science at the Weizmann Institute of Science, a Fellow of the IEEE and EURASIP, and a member of the Israel Academy of Sciences and Humanities. She is the author of Sampling Theory (Cambridge, 2015) and co-editor of Convex Optimization in Signal Processing and Communications (Cambridge, 2009) and Compressed Sensing (Cambridge, 2012).
