
Detailed Information

Subsymbolic natural language processing : an integrated model of scripts, lexicon, and memory


Material Type
Monograph
Personal Author
Miikkulainen, Risto.
Title / Statement of Responsibility
Subsymbolic natural language processing : an integrated model of scripts, lexicon, and memory / Risto Miikkulainen.
Publication
Cambridge, Mass. : MIT Press, c1993.
Physical Description
xii, 391 p. : ill. ; 24 cm.
Series
Neural network modeling and connectionism.
ISBN
0262132907
General Note
"A Bradford book."
Bibliography Note
Includes bibliographical references (p. [347]-374) and indexes.
Subject Headings
Neural networks (Computer science). Natural language processing (Computer science).
Uncontrolled Subject Terms
Computers, Use of, Natural language
000 00970camuuu200277 a 4500
001 000000241380
005 19980601113320.0
008 921006s1993 maua b 001 0 eng
010 ▼a 92037285
020 ▼a 0262132907
040 ▼a DLC ▼c DLC ▼d UKM
049 1 ▼l 111055035
050 0 0 ▼a QA76.87 ▼b .M54 1993
082 0 0 ▼a 006.3/5 ▼2 20
090 ▼a 006.35 ▼b M636s
100 1 ▼a Miikkulainen, Risto.
245 1 0 ▼a Subsymbolic natural language processing : ▼b an integrated model of scripts, lexicon, and memory / ▼c Risto Miikkulainen.
260 ▼a Cambridge, Mass. : ▼b MIT Press, ▼c c1993.
300 ▼a xii, 391 p. : ▼b ill. ; ▼c 24 cm.
440 0 ▼a Neural network modeling and connectionism.
500 ▼a "A Bradford book."
504 ▼a Includes bibliographical references (p. [347]-374) and indexes.
650 0 ▼a Neural networks (Computer science).
650 0 ▼a Natural language processing (Computer science).
653 0 ▼a Computers ▼a Use of ▼a Natural language

Holdings Information

No. 1
Location: Academic Information Center (CDL) / B1 International Organizations Room (Stacks 8)
Call Number: 006.35 M636s
Registration No.: 111055035
Status: Available for loan
Due Date: -

Contents Information

Book Description

Risto Miikkulainen draws on recent connectionist work in language comprehension to create a model that can understand natural language. Using the DISCERN system as an example, he describes a general approach to building high-level cognitive models from distributed neural networks and shows how the special properties of such networks are useful in modeling human performance. In this approach connectionist networks are not only plausible models of isolated cognitive phenomena, but also sufficient constituents for complete artificial intelligence systems.

Distributed neural networks have been very successful in modeling isolated cognitive phenomena, but complex high-level behavior has been tractable only with symbolic artificial intelligence techniques. Aiming to bridge this gap, Miikkulainen describes DISCERN, a complete natural language processing system implemented entirely at the subsymbolic level. In DISCERN, distributed neural network models of parsing, generating, reasoning, lexical processing, and episodic memory are integrated into a single system that learns to read, paraphrase, and answer questions about stereotypical narratives.

Miikkulainen's work, which includes a comprehensive survey of the connectionist literature related to natural language processing, will prove especially valuable to researchers interested in practical techniques for high-level representation, inferencing, memory modeling, and modular connectionist architectures.

Risto Miikkulainen is an Assistant Professor in the Department of Computer Sciences at The University of Texas at Austin.


Description provided by: Aladin

Table of Contents


CONTENTS
Preface = xi
PART Ⅰ Overview
 Chapter 1 Introduction = 3
  1.1 Task : Processing Script-Based Narratives = 3
  1.2 Motivation and Goals = 5
  1.3 Approach = 7
  1.4 Guide to the Reader = 10
 Chapter 2 Background = 13
  2.1 Scripts = 13
  2.2 Parallel Distributed Processing = 17
 Chapter 3 Overview of DISCERN = 23
  3.1 System Architecture = 23
  3.2 I/O Example = 28
  3.3 Training and Performance = 30
PART Ⅱ Processing Mechanisms
 Chapter 4 Backpropagation Networks = 37
  4.1 The Basic Idea = 37
  4.2 Details of the Algorithm = 39
  4.3 Variations = 41
  4.4 Application Considerations = 44
 Chapter 5 Developing Representations in FGREP Modules = 47
  5.1 The Basic FGREP Mechanism = 47
  5.2 Subtask : Assigning Case Roles to Sentence Constituents = 50
  5.3 Properties of FGREP Representations = 53
  5.4 Cloning Synonymous Word Instances : The ID+content Technique = 69
  5.5 Processing Sequential Input and Output : The Recurrent FGREP Module = 77
  5.6 Limitations of FGREP = 82
 Chapter 6 Building from FGREP Modules = 85
  6.1 Performance Phase = 85
  6.2 Training Phase = 85
  6.3 Processing Modules in DISCERN = 90
  6.4 Limitations of the Modular FGREP Approach = 99
PART Ⅲ Memory Mechanisms
 Chapter 7 Self-Organizing Feature Maps = 105
  7.1 Topological Feature Maps = 105
  7.2 Self-Organization = 109
  7.3 Biological Feature Maps = 114
  7.4 Feature Maps as Memory Models = 117
 Chapter 8 Episodic Memory Organization : Hierarchical Feature Maps = 119
  8.1 The General Hierarchical Feature Map Architecture = 119
  8.2 Hierarchical Feature Maps in DISCERN = 122
  8.3 Memory Organization Properties = 133
  8.4 Self-Organization Properties = 137
 Chapter 9 Episodic Memory Storage and Retrieval : Trace Feature Maps = 141
  9.1 A General Model of Trace Feature Maps = 141
  9.2 Trace Feature Maps in DISCERN = 150
  9.3 Storage and Retrieval from Episodic Memory = 155
  9.4 Modeling Human Memory : Interpretation and Limitations = 159
 Chapter 10 Lexicon = 163
  10.1 Overview of the Architecture = 163
  10.2 Representation of Lexical Symbols = 165
  10.3 Properties of the Lexicon Model = 165
  10.4 The Lexicon in DISCERN = 178
  10.5 Modeling the Human Lexical System = 185
  10.6 Limitations = 190
PART Ⅳ Evaluation
 Chapter 11 Behavior of the Complete Model = 197
  11.1 Connecting the Modules = 197
  11.2 Example Run = 204
  11.3 Cleaning Up Errors = 219
  11.4 Error Behavior = 224
  11.5 Conclusion = 233
 Chapter 12 Discussion = 235
  12.1 DISCERN as a Physical Model = 235
  12.2 DISCERN as a Cognitive Model = 237
  12.3 DISCERN as a Developmental Model = 239
  12.4 Making Use of Modularity = 242
  12.5 The Role of the Central Lexicon = 245
  12.6 Robustness and Stability = 247
  12.7 Generalization in Question Answering = 248
  12.8 Exceptions and Novel Situations = 249
 Chapter 13 Comparison to Related Work = 251
  13.1 Symbolic Models of Natural Language Processing = 251
  13.2 Parallel Distributed Models of Natural Language Processing = 253
  13.3 Localist Models = 261
  13.4 Hybrid Models = 264
  13.5 Models of the Lexicon = 272
  13.6 Models of Episodic Memory = 275
  13.7 Issues in Subsymbolic Cognitive Modeling = 279
 Chapter 14 Extensions and Future Work = 301
  14.1 Sentence Processing = 301
  14.2 Script Processing = 304
  14.3 Concept Representations = 307
  14.4 Lexicon = 309
  14.5 Episodic Memory = 313
  14.6 Question Answering = 319
  14.7 Parallel Distributed Control = 320
  14.8 Processing Multiple Languages = 322
  14.9 Representing and Learning Knowledge Structures = 326
 Chapter 15 Conclusions = 331
  15.1 Summary of the DISCERN Model = 331
  15.2 Conclusion = 335
Appendix A Story Data = 337
Appendix B Implementation Details = 343
Appendix C Instructions for Obtaining the DISCERN Software = 345
Bibliography = 347
Author Index = 375
Subject Index = 381


New Arrivals in Related Fields

Dyer-Witheford, Nick (2026)
양성봉 (2025)