| 000 | 00912camuu22002774a 4500 | |
| 001 | 000000822210 | |
| 005 | 20030722132857 | |
| 008 | 990715s2000 nyua b 001 0 eng | |
| 010 | ▼a 99039803 | |
| 020 | ▼a 0387987800 (hc. : acid-free paper) | |
| 040 | ▼a DLC ▼c DLC ▼d 211009 | |
| 042 | ▼a pcc | |
| 049 | 1 | ▼l 121081570 ▼f 과학 |
| 050 | 0 0 | ▼a Q325.7 ▼b .V37 2000 |
| 082 | 0 0 | ▼a 006.3/1/015195 ▼2 21 |
| 090 | ▼a 006.31 ▼b V286n2 | |
| 100 | 1 | ▼a Vapnik, Vladimir Naumovich. |
| 245 | 1 4 | ▼a The nature of statistical learning theory / ▼c Vladimir N. Vapnik. |
| 250 | ▼a 2nd ed. | |
| 260 | ▼a New York : ▼b Springer, ▼c c2000. | |
| 300 | ▼a xix, 314 p. : ▼b ill. ; ▼c 24 cm. | |
| 440 | 0 | ▼a Statistics for engineering and information science |
| 504 | ▼a Includes bibliographical references (p. [301]-309) and index. | |
| 650 | 0 | ▼a Computational learning theory. |
| 650 | 0 | ▼a Reasoning. |
Holdings Information
| No. | Location | Call Number | Accession No. | Status | Due Date | Reservation | Service |
|---|---|---|---|---|---|---|---|
| 1 | Science Library/Sci-Info (2nd-floor stacks) | 006.31 V286n2 | 121081570 (28 loans) | Available | | | |
Contents Information
Book Description
The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on discussing the main results of learning theory and their connections to fundamental problems in statistics. These include:
* the setting of learning problems based on the model of minimizing the risk functional from empirical data
* a comprehensive analysis of the empirical risk minimization principle including necessary and sufficient conditions for its consistency
* non-asymptotic bounds for the risk achieved using the empirical risk minimization principle
* principles for controlling the generalization ability of learning machines using small sample sizes based on these bounds
* the Support Vector methods that control the generalization ability when estimating functions from small sample sizes.
The second edition of the book contains three new chapters devoted to further development of the learning theory and SVM techniques. These include:
* the theory of direct methods of learning based on solving multidimensional integral equations for density, conditional probability, and conditional density estimation
* a new inductive principle of learning.
Written in a readable and concise style, the book is intended for statisticians, mathematicians, physicists, and computer scientists.
Vladimir N. Vapnik is Technology Leader at AT&T Labs-Research and Professor at London University. He is one of the founders of statistical learning theory, and the author of seven books published in English, Russian, German, and Chinese.
Table of Contents
* Informal Reasoning and Comments
* Consistency of Learning Processes
* Bounds on the Rate of Convergence of Learning Processes
* Controlling the Generalization Ability of Learning Processes
* Methods of Pattern Recognition
* Methods of Function Estimation
* Direct Methods in Statistical Learning Theory
* The Vicinal Risk Minimization Principle and the SVMs
