Fr. 44.50

Principles of Neural Information Theory - Computational Neuroscience and Metabolic Efficiency

English · Paperback / Softback

Shipping usually within 2 to 3 weeks (title will be printed to order)

Description

The brain is the most complex computational machine known to science, even though its components (neurons) are slow and unreliable compared with those of a laptop computer. In this richly illustrated book, Shannon's mathematical theory of information is used to explore the metabolic efficiency of neurons, with special reference to visual perception. Evidence from a diverse range of research papers is used to show how information theory defines absolute limits on neural efficiency, limits which ultimately determine the neuroanatomical microstructure of the eye and brain. Written in an informal style, with a comprehensive glossary, tutorial appendices, explainer boxes, and a list of annotated further readings, this book is an ideal introduction to cutting-edge research in neural information theory.

About the author

James V. Stone is a Visiting Professor at the University of Sheffield, UK.

Product details

Author James V. Stone
Publisher Tutorial Introductions
Languages English
Product format Paperback / Softback
Released 15.05.2018
EAN 9780993367922
ISBN 978-0-9933679-2-2
No. of pages 214
Dimensions 152 mm x 229 mm x 12 mm
Weight 318 g
Series Tutorial Introductions
Subjects Guides; Natural sciences, medicine, IT, technology > IT, data processing > IT

Information theory, entropy, Shannon
