Fr. 70.00

Memory and the Computational Brain - Why Cognitive Science Will Transform Neuroscience

English · Paperback

Usually ships within 3 to 5 weeks

Description

Further Information

About the Author

C. R. Gallistel is Co-Director of the Rutgers Center for Cognitive Science. He is one of the foremost psychologists working on the foundations of cognitive neuroscience. His publications include The Symbolic Foundations of Conditioned Behavior (2002) and The Organization of Learning (1990). Adam Philip King is Assistant Professor of Mathematics at Fairfield University.

Back Cover

Memory and the Computational Brain offers a provocative argument that goes to the heart of neuroscience, proposing that the field can and should benefit from the recent advances of cognitive science and the development of information theory over the course of the last several decades.

* A provocative argument that cuts across the fields of linguistics, cognitive science, and neuroscience, suggesting new perspectives on learning mechanisms in the brain
* Proposes that the field of neuroscience can and should benefit from the recent advances of cognitive science and the development of information theory
* Suggests that the architecture of the brain is structured precisely for learning and for memory, and integrates the concept of an addressable read/write memory mechanism into the foundations of neuroscience
* Based on lectures in the prestigious Blackwell-Maryland Lectures in Language and Cognition, and now significantly reworked and expanded to make it ideal for students and faculty

Table of Contents

Preface.
1. Information.
Shannon's Theory of Communication.
Measuring Information.
Efficient Coding.
Information and the Brain.
Digital and Analog Signals.
Appendix: The Information Content of Rare Versus Common Events and Signals.
2. Bayesian Updating.
Bayes' Theorem and Our Intuitions About Evidence.
Using Bayes' Rule.
Summary.
3. Functions.
Functions of One Argument.
Composition and Decomposition of Functions.
Functions of More than One Argument.
The Limits to Functional Decomposition.
Functions Can Map to Multi-Part Outputs.
Mapping to Multiple-Element Outputs Does Not Increase Expressive Power.
Defining Particular Functions.
Summary: Physical/Neurobiological Implications of Facts about Functions.
4. Representations.
Some Simple Examples.
Notation.
The Algebraic Representation of Geometry.
5. Symbols.
Physical Properties of Good Symbols.
Symbol Taxonomy.
Summary.
6. Procedures.
Algorithms.
Procedures, Computation, and Symbols.
Coding and Procedures.
Two Senses of Knowing.
A Geometric Example.
7. Computation.
Formalizing Procedures.
The Turing Machine.
Turing Machine for the Successor Function.
Turing Machines for f_is_even.
Turing Machines for f_+.
Minimal Memory Structure.
General Purpose Computer.
Summary.
8. Architectures.
One-Dimensional Look-Up Tables (If-Then Implementation).
Adding State Memory: Finite-State Machines.
Adding Register Memory.
Summary.
9. Data Structures.
Finding Information in Memory.
An Illustrative Example.
Procedures and the Coding of Data Structures.
The Structure of the Read-Only Biological Memory.
10. Computing with Neurons.
Transducers and Conductors.
Synapses and the Logic Gates.
The Slowness of It All.
The Time-Scale Problem.
Synaptic Plasticity.
Recurrent Loops in Which Activity Reverberates.
11. The Nature of Learning.
Learning As Rewiring.
Synaptic Plasticity and the Associative Theory of Learning.
Why Associations Are Not Symbols.
Distributed Coding.
Learning As the Extraction and Preservation of Useful Information.
Updating an Estimate of One's Location.
12. Learning Time and Space.
Computational Accessibility.
Learning the Time of Day.
Learning Durations.
Episodic Memory.
13. The Modularity of Learning.
Example 1: Path Integration.
Example 2: Learning the Solar Ephemeris.
Example 3: "Associative" Learning.
Summary.
14. Dead Reckoning in a Neural Network.
Reverberating Circuits as Read/Write Memory Mechanisms.
Implementing Combinatorial Operations by Table-Look-Up.
The Full Model.
The Ontogeny of the Connections?
How Realistic is the Model?
Lessons to be Drawn.
Summary.
15. Neural Models of Interval Timing.
Timing an Interval on First Encounter.
Dworkin's Paradox.
Neurally Inspired Models.
The Deeper Problems.
16. The Molecular Basis of Memory.
The Need to Separate Theory of Memory from Theory of Learning.
The Coding Question.
A Cautionary Tale.
Why Not Synaptic Conductance?
A Molecular or Sub-Molecular Mechanism?
Bringing the Data to the Computational Machinery.
Is It Universal?
References.
Glossary.
Index.

Reviews

"The book covers wide-ranging ground--indeed, it passes for a computer science or philosophy textbook in places--but it does so in a consistently lucid and engaging fashion." ( CHOICE , December 2009)

"The authors provide a cogent set of ideas regarding a kind of brain functional architecture that could serve as a thought-provoking alternative to that envisioned by current dogma. If one is seriously concerned with understanding and investigating the brain and how it operates, taking the time to absorb the ideas conveyed in this book is likely to be time well spent." ( PsycCRITIQUES , November 2009)
"Along with a light complement of fascinating psychological case studies of representations of space and time, and a heavy set of polemical sideswipes at neuroscientists and their hapless computational fellow travelers, this book has the simple goal of persuading us of the importance of a particular information processing mechanism that it claims does not currently occupy center stage." ( Nature Neuroscience , October 2009)
