Detailed Information

Models of neural networks I, 2nd updated ed.

Material Type
Monograph
Personal Authors
Domany, E. (Eytan), 1947-. Hemmen, J. L. van (Jan Leonard), 1947-. Schulten, K. (Klaus).
Title / Statement of Responsibility
Models of neural networks I / E. Domany, J.L. van Hemmen, K. Schulten (eds.).
Edition
2nd updated ed.
Publication
New York : Springer Verlag, 1995.
Physical Description
xviii, 355 p. ; 24 cm.
Series
Physics of neural networks.
ISBN
3540594035 (hardcover : alk. paper)
Bibliographic Note
Includes bibliographical references and index.
Subject Heading
Neural networks (Computer science).
000 00913camuuu200277 a 4500
001 000000396098
005 19970910092626.0
008 960509s1995 nyu b 001 0 eng
010 ▼a 95031850
020 ▼a 3540594035 (hardcover : alk. paper)
040 ▼a DLC ▼c DLC
049 ▼a ACCL ▼l 111064319
050 0 0 ▼a QA76.87 ▼b .M57 1995
082 0 0 ▼a 006.3 ▼2 20
090 ▼a 006.3 ▼b M689-2
245 0 0 ▼a Models of neural networks I / ▼c E. Domany, J.L. van Hemmen, K. Schulten (eds.).
250 ▼a 2nd updated ed.
260 ▼a New York : ▼b Springer Verlag, ▼c 1995.
300 ▼a xviii, 355 p. ; ▼c 24 cm.
440 0 ▼a Physics of neural networks.
504 ▼a Includes bibliographical references and index.
650 0 ▼a Neural networks (Computer science).
700 1 0 ▼a Domany, E. ▼q (Eytan), ▼d 1947-.
700 1 0 ▼a Hemmen, J. L. van ▼q (Jan Leonard), ▼d 1947-.
700 1 0 ▼a Schulten, K. ▼q (Klaus).

Holdings
No. 1  Location: Centennial Digital Library (CDL) / B1 International Organizations Room (Stacks 8)  Call Number: 006.3 M689-2  Accession No.: 111064319  Status: Available for loan
No. 2  Location: Sejong Academic Information Center / Science & Technology Room (5F)  Call Number: 006.3 M689-2  Accession No.: 151031975  Status: Available for loan

Contents Information

Book Description

This collection of articles responds to the urgent need for timely and comprehensive reviews in a multidisciplinary, rapidly developing field of research. The book starts out with an extensive introduction to the ideas used in the subsequent chapters, which are all centered around the theme of collective phenomena in neural networks: dynamics and storage capacity of networks of formal neurons with symmetric or asymmetric couplings, learning algorithms, temporal association, structured data (software), and structured nets (hardware). The style and level of this book make it particularly useful for advanced students and researchers looking for an accessible survey of today's theory of neural networks.
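The collective phenomena described above — formal (±1) neurons, symmetric couplings, and pattern storage — are exemplified by the Hopfield model treated in Chapter 1. The following is a minimal illustrative sketch (not taken from the book, which treats these models analytically): binary patterns are stored with the Hebbian rule and one is retrieved from a corrupted cue by zero-temperature asynchronous dynamics. All names and parameter choices here are the example's own.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64                                        # number of formal (+/-1) neurons
patterns = rng.choice([-1, 1], size=(3, N))   # three random patterns to store

# Symmetric Hebbian couplings: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, J_ii = 0
J = (patterns.T @ patterns) / N
np.fill_diagonal(J, 0.0)

def retrieve(state, steps=10):
    """Zero-temperature asynchronous dynamics: s_i <- sign(sum_j J_ij s_j)."""
    s = state.copy()
    for _ in range(steps):
        for i in rng.permutation(N):
            h = J[i] @ s                      # local field at neuron i
            s[i] = 1 if h >= 0 else -1
    return s

# Corrupt 10 of the 64 bits of the first stored pattern, then retrieve it.
cue = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
cue[flip] *= -1
recovered = retrieve(cue)
overlap = (recovered @ patterns[0]) / N       # overlap 1.0 means perfect recall
print(overlap)
```

At this low storage load (3 patterns for 64 neurons, well below the model's capacity of roughly 0.138 N), the dynamics almost surely flows back to the stored pattern, which is the "content-addressable memory" property discussed in Chapters 1 and 3.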


Source: Aladin

Table of Contents


CONTENTS
1. Collective Phenomena in Neural Networks / By J.L. van Hemmen ; R. Kühn (With 15 Figures) = 1
  1.1 Introduction and Overview = 1
    1.1.1 Collective Phenomena in a Historical Perspective = 1
    1.1.2 The Role of Dynamics = 3
    1.1.3 Universality, Locality, and Learning = 5
    1.1.4 Outline of this Paper = 6
  1.2 Prerequisites = 8
    1.2.1 Large Deviations : A Case Study = 8
    1.2.2 Large Deviations : General Principles = 11
    1.2.3 A Mathematical Detour = 14
    1.2.4 Sublattice Magnetizations = 16
    1.2.5 The Replica Method = 17
  1.3 The Hopfield Model = 20
    1.3.1 The Hopfield Model with Finitely Many Patterns = 20
    1.3.2 Stability = 23
    1.3.3 The Hopfield Model with Extensively Many (Weighted) Patterns = 23
    1.3.4 The Phase Diagram of the Hopfield Model = 31
    1.3.5 Discussion = 33
    1.3.6 Parallel Dynamics (The Little Model) = 36
    1.3.7 Continuous-Time Dynamics and Graded-Response Neurons = 40
  1.4 Nonlinear Neural Networks = 48
    1.4.1 Arbitrary Synaptic Kernel and Finitely Many Patterns = 49
    1.4.2 Spectral Theory = 51
    1.4.3 Extensively Many Patterns = 56
  1.5 Learning, Unlearning, and Forgetting = 62
    1.5.1 Introduction = 62
    1.5.2 The Pseudoinverse Learning Rule = 63
    1.5.3 The Perceptron Convergence Theorem = 68
    1.5.4 Hebbian Learning = 72
    1.5.5 Intermezzo = 81
    1.5.6 Hebbian Unlearning = 84
    1.5.7 Forgetting = 88
  1.6 Hierarchically Structured Information = 92
    1.6.1 Structured Information, Markov Chains, and Martingales = 92
    1.6.2 Signal-to-Noise-Ratio Analysis = 96
    1.6.3 Equivalence to the Hopfield Model = 98
    1.6.4 Weighted Hierarchies = 103
    1.6.5 Low-Activity Patterns = 105
    1.6.6 Discussion = 106
  1.7 Outlook = 108
  References = 108
2. Information from Structure : A Sketch of Neuroanatomy / By V. Braitenberg (With 6 Figures) = 115
  2.1 Development of the Brain = 115
  2.2 Neuroanatomy Related to Information Handling in the Brain = 116
  2.3 The Idea of Electronic Circuitry = 117
  2.4 The Projection from the Compound Eye onto the First Ganglion (Lamina) of the Fly = 118
  2.5 Statistical Wiring = 119
  2.6 Symmetry of Neural Nets = 121
  2.7 The Cerebellum = 122
  2.8 Variations in Size of the Elements = 125
  2.9 The Cerebral Cortex = 127
  2.10 Inborn Knowledge = 127
  References = 128
3. Storage Capacity and Learning in Ising-Spin Neural Networks / By B. M. Forrest ; D. J. Wallace (With 5 Figures) = 129
  3.1 Introduction = 129
    3.1.1 The Model = 129
    3.1.2 Content-addressable Memory = 130
    3.1.3 The Hopfield Model = 131
    3.1.4 The Spin-glass Analogy = 133
    3.1.5 Finite Temperature = 134
  3.2 Content-addressability : A Dynamics Problem = 136
    3.2.1 Numerical Tests = 138
  3.3 Learning = 140
    3.3.1 Learning Perfect Storage = 140
    3.3.2 Enforcing Content-addressability = 141
    3.3.3 Optimal Learning = 144
    3.3.4 Training with Noise = 148
    3.3.5 Storing Correlated Patterns = 148
  3.4 Discussion = 151
  Appendix = 152
  References = 155
4. Dynamics of Learning / By W. Kinzel ; M. Opper (With 8 Figures) = 157
  4.1 Introduction = 157
  4.2 Definition of Supervised Learning = 159
  4.3 Adaline Learning = 162
  4.4 Perceptron Learning = 165
  4.5 Binary Synapses = 170
  4.6 Basins of Attraction = 172
  4.7 Forgetting = 175
  4.8 Outlook = 177
  References = 178
5. Hierarchical Organization of Memory / By M.V. Feigel'man ; L.B. Ioffe (With 2 Figures) = 181
  5.1 Introduction = 181
  5.2 Models : The Problem = 182
  5.3 A Toy Problem : Patterns with Low Activity = 183
  5.4 Models with Hierarchically Structured Information = 188
  5.5 Extensions = 195
  5.6 The Enhancement of Storage Capacity : Multineuron Interactions = 196
  5.7 Conclusion = 199
  References = 199
6. Asymmetrically Diluted Neural Networks / By R. Kree ; A. Zippelius = 201
  6.1 Introduction = 201
  6.2 Solvability and Retrieval Properties = 203
  6.3 Exact Solution with Dynamic Functionals = 208
  6.4 Extensions and Related Work = 216
  Appendix A = 217
  Appendix B = 218
  Appendix C = 219
  References = 219
7. Temporal Association / By R. Kühn ; J.L. van Hemmen (With 22 Figures) = 221
  7.1 Introduction = 221
  7.2 Fast Synaptic Plasticity = 226
    7.2.1 Synaptic Plasticity in Hopfield-Type Networks = 226
    7.2.2 Sequence Generation by Selection = 229
  7.3 Noise-Driven Sequences of Biased Patterns = 236
  7.4 Stabilizing Sequences by Delays = 243
    7.4.1 Transition Mechanism and Persistence Times = 244
    7.4.2 Analytic Description of the Dynamics = 250
    7.4.3 Extreme Dilution of Synapses = 256
  7.5 Applications : Sequence Recognition, Counting, and the Generation of Complex Sequences = 260
    7.5.1 Sequence Recognition and Counting = 261
    7.5.2 Complex Sequences = 263
  7.6 Hebbian Learning with Delays = 269
  7.7 Epilogue = 281
  References = 285
8. Self-organizing Maps and Adaptive Filters / By H. Ritter ; K. Obermayer ; K. Schulten ; J. Rubner (With 13 Figures) = 289
  8.1 Introduction = 289
  8.2 Self-organizing Maps and Optimal Representation of Data = 291
  8.3 Learning Dynamics in the Vicinity of a Stationary State = 293
  8.4 Relation to Brain Modeling = 299
  8.5 Formation of a "Somatotopic Map" = 301
  8.6 Adaptive Orientation and Spatial Frequency Filters = 306
  8.7 Conclusion = 311
  References = 313
9. Layered Neural Networks / By E. Domany ; R. Meir (With 8 Figures) = 317
  9.1 Introduction = 317
  9.2 Dynamics of Feed-Forward Networks = 318
    9.2.1 General Overview = 318
    9.2.2 The Model : Its Definition and Solution by Gaussian Transforms = 320
    9.2.3 Generalization for Other Couplings = 327
  9.3 Unsupervised Learning in Layered Networks = 332
    9.3.1 Hebbian Learning and the Development of Orientation-Sensitive Cells and Columns = 333
    9.3.2 Information-Theoretic Principles Guiding the Development of the Perceptual System = 335
    9.3.3 Iterated Hebbian Learning in a Layered Network = 336
  9.4 Supervised Learning in Layered Networks = 337
  9.5 Summary and Discussion = 341
  References = 342
Elizabeth Gardner - An Appreciation = 345
Subject Index = 349

New Arrivals in Related Fields

Negro, Alessandro (2026)
Dyer-Witheford, Nick (2026)