Fr. 188.00

Statistical Mechanics of Neural Networks

English · Paperback / Softback

Shipping usually within 1 to 2 weeks (title will be printed to order)

Description

This book provides a comprehensive introduction to the fundamental statistical mechanics underlying the inner workings of neural networks. It discusses in detail important concepts and techniques, including the cavity method, mean-field theory, replica techniques, the Nishimori condition, variational methods, dynamical mean-field theory, unsupervised learning, associative memory models, perceptron models, the chaos theory of recurrent neural networks, and the eigenspectra of neural networks, guiding new learners through the theory and the essential skills needed to understand and use neural networks. The book focuses on quantitative frameworks of neural network models in which the underlying mechanisms can be precisely isolated through mathematically elegant physics and theoretical predictions. It is a good reference for students, researchers, and practitioners in the area of neural networks.

List of contents

Introduction
Spin glass models and cavity method
Variational mean-field theory and belief propagation
Monte Carlo simulation methods
High-temperature expansion
Nishimori line
Random energy model
Statistical mechanical theory of Hopfield model
Replica symmetry and replica symmetry breaking
Statistical mechanics of restricted Boltzmann machine
Simplest model of unsupervised learning with binary synapses
Inherent-symmetry breaking in unsupervised learning
Mean-field theory of Ising Perceptron
Mean-field model of multi-layered Perceptron
Mean-field theory of dimension reduction
Chaos theory of random recurrent neural networks
Statistical mechanics of random matrices
Perspectives

About the author

Haiping Huang

Dr. Haiping Huang received his Ph.D. degree in theoretical physics from the Institute of Theoretical Physics, Chinese Academy of Sciences. He works as an associate professor at the School of Physics, Sun Yat-sen University, China. His research interests include the origin of the computational hardness of the binary perceptron model, the theory of dimension reduction in deep neural networks, and inherent symmetry breaking in unsupervised learning. In 2021, he was awarded the Excellent Young Scientists Fund by the National Natural Science Foundation of China.


Product details

Authors Haiping Huang
Publisher Springer, Berlin
 
Languages English
Product format Paperback / Softback
Released 01.01.2023
 
EAN 9789811675720
ISBN 978-981-16-7572-0
No. of pages 296
Dimensions 155 mm x 17 mm x 235 mm
Illustrations XVIII, 296 p. 62 illus., 40 illus. in color.
Subject Natural sciences, medicine, IT, technology > Physics, astronomy > Theoretical physics

Customer reviews

No reviews have been written for this item yet.