CONTENTS
1. Collective Phenomena in Neural Networks / By J.L. van Hemmen and R. Kühn (With 15 Figures) = 1
1.1 Introduction and Overview = 1
1.1.1 Collective Phenomena in a Historical Perspective = 1
1.1.2 The Role of Dynamics = 3
1.1.3 Universality, Locality, and Learning = 5
1.1.4 Outline of This Paper = 6
1.2 Prerequisites = 8
1.2.1 Large Deviations: A Case Study = 8
1.2.2 Large Deviations: General Principles = 11
1.2.3 A Mathematical Detour = 14
1.2.4 Sublattice Magnetizations = 16
1.2.5 The Replica Method = 17
1.3 The Hopfield Model = 20
1.3.1 The Hopfield Model with Finitely Many Patterns = 20
1.3.2 Stability = 23
1.3.3 The Hopfield Model with Extensively Many (Weighted) Patterns = 23
1.3.4 The Phase Diagram of the Hopfield Model = 31
1.3.5 Discussion = 33
1.3.6 Parallel Dynamics (The Little Model) = 36
1.3.7 Continuous-Time Dynamics and Graded-Response Neurons = 40
1.4 Nonlinear Neural Networks = 48
1.4.1 Arbitrary Synaptic Kernel and Finitely Many Patterns = 49
1.4.2 Spectral Theory = 51
1.4.3 Extensively Many Patterns = 56
1.5 Learning, Unlearning, and Forgetting = 62
1.5.1 Introduction = 62
1.5.2 The Pseudoinverse Learning Rule = 63
1.5.3 The Perceptron Convergence Theorem = 68
1.5.4 Hebbian Learning = 72
1.5.5 Intermezzo = 81
1.5.6 Hebbian Unlearning = 84
1.5.7 Forgetting = 88
1.6 Hierarchically Structured Information = 92
1.6.1 Structured Information, Markov Chains, and Martingales = 92
1.6.2 Signal-to-Noise-Ratio Analysis = 96
1.6.3 Equivalence to the Hopfield Model = 98
1.6.4 Weighted Hierarchies = 103
1.6.5 Low-Activity Patterns = 105
1.6.6 Discussion = 106
1.7 Outlook = 108
References = 108
2. Information from Structure: A Sketch of Neuroanatomy / By V. Braitenberg (With 6 Figures) = 115
2.1 Development of the Brain = 115
2.2 Neuroanatomy Related to Information Handling in the Brain = 116
2.3 The Idea of Electronic Circuitry = 117
2.4 The Projection from the Compound Eye onto the First Ganglion (Lamina) of the Fly = 118
2.5 Statistical Wiring = 119
2.6 Symmetry of Neural Nets = 121
2.7 The Cerebellum = 122
2.8 Variations in Size of the Elements = 125
2.9 The Cerebral Cortex = 127
2.10 Inborn Knowledge = 127
References = 128
3. Storage Capacity and Learning in Ising-Spin Neural Networks / By B.M. Forrest and D.J. Wallace (With 5 Figures) = 129
3.1 Introduction = 129
3.1.1 The Model = 129
3.1.2 Content-Addressable Memory = 130
3.1.3 The Hopfield Model = 131
3.1.4 The Spin-Glass Analogy = 133
3.1.5 Finite Temperature = 134
3.2 Content-Addressability: A Dynamics Problem = 136
3.2.1 Numerical Tests = 138
3.3 Learning = 140
3.3.1 Learning Perfect Storage = 140
3.3.2 Enforcing Content-Addressability = 141
3.3.3 Optimal Learning = 144
3.3.4 Training with Noise = 148
3.3.5 Storing Correlated Patterns = 148
3.4 Discussion = 151
Appendix = 152
References = 155
4. Dynamics of Learning / By W. Kinzel and M. Opper (With 8 Figures) = 157
4.1 Introduction = 157
4.2 Definition of Supervised Learning = 159
4.3 Adaline Learning = 162
4.4 Perceptron Learning = 165
4.5 Binary Synapses = 170
4.6 Basins of Attraction = 172
4.7 Forgetting = 175
4.8 Outlook = 177
References = 178
5. Hierarchical Organization of Memory / By M.V. Feigel'man and L.B. Ioffe (With 2 Figures) = 181
5.1 Introduction = 181
5.2 Models: The Problem = 182
5.3 A Toy Problem: Patterns with Low Activity = 183
5.4 Models with Hierarchically Structured Information = 188
5.5 Extensions = 195
5.6 The Enhancement of Storage Capacity: Multineuron Interactions = 196
5.7 Conclusion = 199
References = 199
6. Asymmetrically Diluted Neural Networks / By R. Kree and A. Zippelius = 201
6.1 Introduction = 201
6.2 Solvability and Retrieval Properties = 203
6.3 Exact Solution with Dynamic Functionals = 208
6.4 Extensions and Related Work = 216
Appendix A = 217
Appendix B = 218
Appendix C = 219
References = 219
7. Temporal Association / By R. Kühn and J.L. van Hemmen (With 22 Figures) = 221
7.1 Introduction = 221
7.2 Fast Synaptic Plasticity = 226
7.2.1 Synaptic Plasticity in Hopfield-Type Networks = 226
7.2.2 Sequence Generation by Selection = 229
7.3 Noise-Driven Sequences of Biased Patterns = 236
7.4 Stabilizing Sequences by Delays = 243
7.4.1 Transition Mechanism and Persistence Times = 244
7.4.2 Analytic Description of the Dynamics = 250
7.4.3 Extreme Dilution of Synapses = 256
7.5 Applications: Sequence Recognition, Counting, and the Generation of Complex Sequences = 260
7.5.1 Sequence Recognition and Counting = 261
7.5.2 Complex Sequences = 263
7.6 Hebbian Learning with Delays = 269
7.7 Epilogue = 281
References = 285
8. Self-organizing Maps and Adaptive Filters / By H. Ritter, K. Obermayer, K. Schulten, and J. Rubner (With 13 Figures) = 289
8.1 Introduction = 289
8.2 Self-organizing Maps and Optimal Representation of Data = 291
8.3 Learning Dynamics in the Vicinity of a Stationary State = 293
8.4 Relation to Brain Modeling = 299
8.5 Formation of a "Somatotopic Map" = 301
8.6 Adaptive Orientation and Spatial Frequency Filters = 306
8.7 Conclusion = 311
References = 313
9. Layered Neural Networks / By E. Domany and R. Meir (With 8 Figures) = 317
9.1 Introduction = 317
9.2 Dynamics of Feed-Forward Networks = 318
9.2.1 General Overview = 318
9.2.2 The Model: Its Definition and Solution by Gaussian Transforms = 320
9.2.3 Generalization for Other Couplings = 327
9.3 Unsupervised Learning in Layered Networks = 332
9.3.1 Hebbian Learning and the Development of Orientation-Sensitive Cells and Columns = 333
9.3.2 Information-Theoretic Principles Guiding the Development of the Perceptual System = 335
9.3.3 Iterated Hebbian Learning in a Layered Network = 336
9.4 Supervised Learning in Layered Networks = 337
9.5 Summary and Discussion = 341
References = 342
Elizabeth Gardner – An Appreciation = 345
Subject Index = 349