Fr. 126.00

Analogue Imprecision in MLP Training (Progress in Neural Processing, Vol. 4)

English · Hardback

Shipping usually takes at least 4 weeks (title will be specially ordered)

Description

Hardware inaccuracy and imprecision are important considerations when implementing neural algorithms. This book presents a study of synaptic weight noise as a typical fault model for analogue VLSI realisations of MLP neural networks and examines the implications for learning and network performance. It shows how including an imprecision model in the learning scheme, as a "fault tolerance hint", can aid understanding of the accuracy and precision requirements of a particular implementation. In addition, the study shows how such a scheme can give rise to significant performance enhancement.
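
The idea of training with an imprecision model can be illustrated with a minimal sketch: inject random perturbations into the synaptic weights during each forward/backward pass while updating the nominal weights. The snippet below is an illustrative Python/NumPy example only, not code from the book; the XOR task, network size, learning rate and 10% multiplicative Gaussian weight noise are assumptions chosen for brevity.

import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Small MLP: 2 -> 4 -> 1 with sigmoid activations (sizes are illustrative)
W1 = rng.normal(0, 0.5, (2, 4))
b1 = np.zeros(4)
W2 = rng.normal(0, 0.5, (4, 1))
b2 = np.zeros(1)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
noise_level = 0.1   # assumed relative amplitude of the weight noise
lr = 0.5

for epoch in range(5000):
    # Imprecision model: multiplicative Gaussian noise on the weights, w -> w * (1 + n)
    n1 = W1 * (1 + noise_level * rng.standard_normal(W1.shape))
    n2 = W2 * (1 + noise_level * rng.standard_normal(W2.shape))

    # Forward pass through the perturbed weights
    h = sigmoid(X @ n1 + b1)
    out = sigmoid(h @ n2 + b2)

    # Backward pass (squared error), gradients taken through the perturbed weights
    err = out - y
    d_out = err * out * (1 - out)
    d_h = (d_out @ n2.T) * h * (1 - h)

    # Update the nominal (noise-free) weights
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

# Evaluate with the noise-free weights
print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 3))

Because the gradients are computed with perturbed weights, the network is steered towards solutions that remain accurate when the weights are imprecise, which is the "fault tolerance hint" described above.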

Product details

Authors Peter J. Edwards, Alan F. Murray
Publisher World Scientific Publishing Company
 
Languages English
Product format Hardback
Released 01.08.1996
 
EAN 9789810227395
ISBN 978-981-02-2739-5
No. of pages 192
Series Progress in Neural Processing
Subject Natural sciences, medicine, IT, technology > IT, data processing > Data communication, networks
