Fr. 215.00

A First Course in Information Theory

English · Paperback / Softback

Shipping usually within 6 to 7 weeks

Description

A First Course in Information Theory is an up-to-date introduction to information theory. In addition to the classical topics, it provides the first comprehensive treatment of the theory of the I-Measure, network coding theory, Shannon- and non-Shannon-type information inequalities, and a relation between entropy and group theory. ITIP, a software package for proving information inequalities, is also included. With a large number of examples, illustrations, and original problems, the book is well suited as a textbook for a senior- or graduate-level course on the subject, and as a reference for researchers in related fields.

List of contents

1. The Science of Information.
2. Information Measures.- 2.1 Independence and Markov Chains.- 2.2 Shannon's Information Measures.- 2.3 Continuity of Shannon's Information Measures.- 2.4 Chain Rules.- 2.5 Informational Divergence.- 2.6 The Basic Inequalities.- 2.7 Some Useful Information Inequalities.- 2.8 Fano's Inequality.- 2.9 Entropy Rate of Stationary Source.- Problems.- Historical Notes.
3. Zero-Error Data Compression.- 3.1 The Entropy Bound.- 3.2 Prefix Codes.- 3.3 Redundancy of Prefix Codes.- Problems.- Historical Notes.
4. Weak Typicality.- 4.1 The Weak AEP.- 4.2 The Source Coding Theorem.- 4.3 Efficient Source Coding.- 4.4 The Shannon-McMillan-Breiman Theorem.- Problems.- Historical Notes.
5. Strong Typicality.- 5.1 Strong AEP.- 5.2 Strong Typicality Versus Weak Typicality.- 5.3 Joint Typicality.- 5.4 An Interpretation of the Basic Inequalities.- Problems.- Historical Notes.
6. The I-Measure.- 6.1 Preliminaries.- 6.2 The I-Measure for Two Random Variables.- 6.3 Construction of the I-Measure μ*.- 6.4 μ* Can Be Negative.- 6.5 Information Diagrams.- 6.6 Examples of Applications.- Appendix 6.A: A Variation of the Inclusion-Exclusion Formula.- Problems.- Historical Notes.
7. Markov Structures.- 7.1 Conditional Mutual Independence.- 7.2 Full Conditional Mutual Independence.- 7.3 Markov Random Field.- 7.4 Markov Chain.- Problems.- Historical Notes.
8. Channel Capacity.- 8.1 Discrete Memoryless Channels.- 8.2 The Channel Coding Theorem.- 8.3 The Converse.- 8.4 Achievability of the Channel Capacity.- 8.5 A Discussion.- 8.6 Feedback Capacity.- 8.7 Separation of Source and Channel Coding.- Problems.- Historical Notes.
9. Rate-Distortion Theory.- 9.1 Single-Letter Distortion Measures.- 9.2 The Rate-Distortion Function R(D).- 9.3 The Rate-Distortion Theorem.- 9.4 The Converse.- 9.5 Achievability of RI(D).- Problems.- Historical Notes.
10. The Blahut-Arimoto Algorithms.- 10.1 Alternating Optimization.- 10.2 The Algorithms.- 10.3 Convergence.- Problems.- Historical Notes.
11. Single-Source Network Coding.- 11.1 A Point-to-Point Network.- 11.2 What Is Network Coding?- 11.3 A Network Code.- 11.4 The Max-Flow Bound.- 11.5 Achievability of the Max-Flow Bound.- Problems.- Historical Notes.
12. Information Inequalities.- 12.1 The Region Γ*n.- 12.2 Information Expressions in Canonical Form.- 12.3 A Geometrical Framework.- 12.4 Equivalence of Constrained Inequalities.- 12.5 The Implication Problem of Conditional Independence.- Problems.- Historical Notes.
13. Shannon-Type Inequalities.- 13.1 The Elemental Inequalities.- 13.2 A Linear Programming Approach.- 13.3 A Duality.- 13.4 Machine Proving.- 13.5 Tackling the Implication Problem.- 13.6 Minimality of the Elemental Inequalities.- Appendix 13.A: The Basic Inequalities and the Polymatroidal Axioms.- Problems.- Historical Notes.
14. Beyond Shannon-Type Inequalities.- 14.1 Characterizations of Γ*2, Γ*3, and Γ*n.- 14.2 A Non-Shannon-Type Unconstrained Inequality.- 14.3 A Non-Shannon-Type Constrained Inequality.- 14.4 Applications.- Problems.- Historical Notes.
15. Multi-Source Network Coding.- 15.1 Two Characteristics.- 15.2 Examples of Application.- 15.3 A Network Code for Acyclic Networks.- 15.4 An Inner Bound.- 15.5 An Outer Bound.- 15.6 The LP Bound and Its Tightness.- 15.7 Achievability of Rin.- Appendix 15.A: Approximation of Random Variables with Infinite Alphabets.- Problems.- Historical Notes.
16. Entropy and Groups.- 16.1 Group Preliminaries.- 16.2 Group-Characterizable Entropy Functions.- 16.3 A Group Characterization of Γ*n.- 16.4 Information Inequalities and Group Inequalities.- Problems.- Historical Notes.
