
Item Details

An introduction to neural networks (checked out 7 times)

Material Type
Book
Personal Author
Anderson, James A.
Title / Statement of Responsibility
An introduction to neural networks / James A. Anderson.
Publication
Cambridge, Mass. : MIT Press, c1995.
Physical Description
xi, 650 p. : ill. ; 27 cm.
ISBN
0262011441
General Note
"A Bradford book."
Bibliography Note
Includes bibliographical references and index.
Subject Heading
Neural networks (Neurobiology)
000 00779camuu2200253 a 4500
001 000000805272
005 20030224115602
008 940801s1995 maua b 001 0 eng
010 ▼a 94030749
020 ▼a 0262011441
035 ▼a KRIC00261835
040 ▼a 211032 ▼c 211032 ▼d 211009
049 1 ▼l 111236523
050 0 0 ▼a QP363.3 ▼b .A534 1995
082 0 0 ▼a 612.8 ▼2 21
090 ▼a 612.8 ▼b A547i
100 1 ▼a Anderson, James A.
245 1 3 ▼a An introduction to neural networks / ▼c James A. Anderson.
260 ▼a Cambridge, Mass. : ▼b MIT Press, ▼c c1995.
300 ▼a xi, 650 p. : ▼b ill. ; ▼c 27 cm.
500 ▼a "A Bradford book."
504 ▼a Includes bibliographical references and index.
650 0 0 ▼a Neural networks (Neurobiology)

No. | Location | Call Number | Accession No. | Status | Due Date
1 | Main Library / Stacks, 7F | 612.8 A547i | 111236523 (checked out 3 times) | Available | -
2 | Science Library / Sci-Info (2F stacks) | 612.8 A547i | 121162332 (checked out 4 times) | Available | -

Content Information

Book Description

Choice Outstanding Academic Title, 1996.

An Introduction to Neural Networks falls into a new ecological niche for texts. Based on notes that have been class-tested for more than a decade, it is aimed at cognitive science and neuroscience students who need to understand brain function in terms of computational modeling, and at engineers who want to go beyond formal algorithms to applications and computing strategies. It is the only current text to approach networks from a broad neuroscience and cognitive science perspective, with an emphasis on the biology and psychology behind the assumptions of the models, as well as on what the models might be used for. It describes the mathematical and computational tools needed and provides an account of the author's own ideas.

Students learn how to teach arithmetic to a neural network and get a short course on linear associative memory and adaptive maps. They are introduced to the author's brain-state-in-a-box (BSB) model and are provided with some of the neurobiological background necessary for a firm grasp of the general subject.
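The "linear associative memory" mentioned above can be illustrated with a minimal sketch (not taken from the book; the function names and the two training pairs are illustrative assumptions): a Hebbian outer-product matrix that stores input-output vector pairs and recalls the stored output when probed with its input.

```python
# Minimal sketch of a linear associator (Hebbian outer-product memory).
# Names and example vectors are illustrative, not the book's own code.

def matvec(W, f):
    # Multiply matrix W by column vector f.
    return [sum(w * x for w, x in zip(row, f)) for row in W]

def train(pairs):
    # Build W as the sum over pairs (f, g) of the outer products g f^T
    # (the Hebbian learning rule for a linear associator).
    n_in, n_out = len(pairs[0][0]), len(pairs[0][1])
    W = [[0.0] * n_in for _ in range(n_out)]
    for f, g in pairs:
        for i in range(n_out):
            for j in range(n_in):
                W[i][j] += g[i] * f[j]
    return W

# With orthonormal input vectors, recall is exact: W f_k = g_k.
f1, f2 = [1.0, 0.0], [0.0, 1.0]
g1, g2 = [1.0, 2.0], [3.0, 4.0]
W = train([(f1, g1), (f2, g2)])
print(matvec(W, f1))  # -> [1.0, 2.0]
print(matvec(W, f2))  # -> [3.0, 4.0]
```

With non-orthogonal inputs the recalled vectors interfere with one another, which is one reason data representation matters so much in these models.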

The field now known as neural networks has split in recent years into two major groups, mirrored in the texts that are currently available: the engineers who are primarily interested in practical applications of the new adaptive, parallel computing technology, and the cognitive scientists and neuroscientists who are interested in scientific applications. As the gap between these two groups widens, Anderson notes that the academics have tended to drift off into irrelevant, often excessively abstract research while the engineers have lost contact with the source of ideas in the field. Neuroscience, he points out, provides a rich and valuable source of ideas about data representation, and setting up the data representation is the major part of neural network programming. Both cognitive science and neuroscience give insights into how this can be done effectively: cognitive science suggests what to compute and neuroscience suggests how to compute it.


Source: Aladin

Table of Contents


CONTENTS
Introduction = vii
Acknowledgments = xiii
1. Properties of Single Neurons = 1
2. Synaptic Integration and Neuron Models = 37
3. Essential Vector Operations = 63
4. Lateral Inhibition and Sensory Processing = 85
5. Simple Matrix Operations = 129
6. The Linear Associator : Background and Foundations = 143
7. The Linear Associator : Simulations = 175
8. Early Network Models : The Perceptron = 209
9. Gradient Descent Algorithms = 239
10. Representation of Information = 281
11. Applications of Simple Associators : Concept Formation and Object Motion = 351
12. Energy and Neural Networks : Hopfield Networks and Boltzmann Machines = 401
13. Nearest Neighbor Models = 433
14. Adaptive Maps = 463
15. The BSB Model : A Simple Nonlinear Autoassociative Neural Network = 493
16. Associative Computation = 545
17. Teaching Arithmetic to a Neural Network = 585
Afterword = 629
Index = 631


New Arrivals in Related Fields

Haier, Richard J (2025)