
Detailed Information

Feature extraction : foundations and applications


Material Type
Monograph
Personal Author
Guyon, Isabelle.
Title / Statement of Responsibility
Feature extraction : foundations and applications / Isabelle Guyon ... [et al.] (eds.).
Publication
Berlin ; New York : Springer-Verlag, c2006.
Physical Description
xxiv, 778 p. : ill. ; 25 cm.
Series
Studies in fuzziness and soft computing, 1434-9922 ; v. 207
ISBN
9783540354871, 3540354875
Bibliography Note
Includes bibliographical references and index.
General Subject
Database management.
000 00000cam u2200205 a 4500
001 000045945160
005 20180621143802
008 180621s2006 gw a b 001 0 eng d
010 ▼a 2006928001
015 ▼a GBA669189 ▼2 bnb
020 ▼a 9783540354871
020 ▼a 3540354875
035 ▼a (KERIS)REF000012912676
040 ▼a OHX ▼c OHX ▼d UKM ▼d BAKER ▼d HNK ▼d DLC ▼d 211009
050 0 0 ▼a QA76.9.D3 ▼b F433 2006
082 0 4 ▼a 006.3 ▼2 23
084 ▼a 006.3 ▼2 DDCK
090 ▼a 006.3 ▼b F2882
245 0 0 ▼a Feature extraction : ▼b foundations and applications / ▼c Isabelle Guyon ... [et al.] (eds.).
260 ▼a Berlin ; ▼a New York : ▼b Springer-Verlag, ▼c c2006.
300 ▼a xxiv, 778 p. : ▼b ill. ; ▼c 25 cm.
490 1 ▼a Studies in fuzziness and soft computing, ▼x 1434-9922 ; ▼v v. 207
504 ▼a Includes bibliographical references and index.
538 ▼a System requirements for disc: IBM PC or compatible; Windows; Adobe Reader, CD-ROM drive.
650 0 ▼a Database management.
700 1 ▼a Guyon, Isabelle.
830 0 ▼a Studies in fuzziness and soft computing ; ▼v v. 207.
945 ▼a KLPA
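The ▼-prefixed codes in the record above are MARC subfield delimiters as rendered by this catalog's display. A minimal sketch of splitting one such display line into its tag and subfield pairs — not a full MARC-21 parser; the function name and the (code, value) tuple output are illustrative choices, not part of any standard library:

```python
def parse_display_field(line: str):
    """Split a catalog display line such as
    '260 ▼a Berlin ; ▼a New York : ▼b Springer-Verlag, ▼c c2006.'
    into (tag, [(subfield_code, value), ...]).

    A list of pairs is used rather than a dict because subfield
    codes can repeat (e.g. the two ▼a place names in field 260).
    """
    head, _, rest = line.partition("▼")
    tag = head.split()[0]          # field tag; indicators are ignored here
    pairs = []
    for chunk in rest.split("▼"):  # each chunk starts with its subfield code
        chunk = chunk.strip()
        if chunk:
            pairs.append((chunk[0], chunk[1:].strip()))
    return tag, pairs

tag, pairs = parse_display_field("020 ▼a 9783540354871")
# tag is "020"; pairs is [("a", "9783540354871")]
```

Repeated subfields come back in order, so the two places of publication in field 260 yield two separate ("a", ...) entries.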

Holdings Information

No. | Location | Call Number | Registration No. | Status | Due Date
1 | Science Library / Sci-Info (2F Stacks) | 006.3 F2882 | 121245057 | Available | -

Contents Information

Book Description

This book is both a reference for engineers and scientists and a teaching resource, featuring tutorial chapters and research papers on feature extraction. Until now there has been insufficient consideration of feature selection algorithms, no unified presentation of leading methods, and no systematic comparisons.



Everyone loves a good competition. As I write this, two billion fans are eagerly anticipating the 2006 World Cup. Meanwhile, a fan base that is somewhat smaller (but presumably includes you, dear reader) is equally eager to read all about the results of the NIPS 2003 Feature Selection Challenge, contained herein. Fans of Radford Neal and Jianguo Zhang (or of Bayesian neural networks and Dirichlet diffusion trees) are gloating "I told you so" and looking for proof that their win was not a fluke. But the matter is by no means settled, and fans of SVMs are shouting "wait 'til next year!" You know this book is a bit more edgy than your standard academic treatise as soon as you see the dedication: "To our friends and foes."

Competition breeds improvement. Fifty years ago, the champion in 100m butterfly swimming was 22 percent slower than today's champion; the women's marathon champion from just 30 years ago was 26 percent slower. Who knows how much better our machine learning algorithms would be today if Turing in 1950 had proposed an effective competition rather than his elusive Test?

But what makes an effective competition? The field of Speech Recognition has had NIST-run competitions since 1988; error rates have been reduced by a factor of three or more, but the field has not yet had the impact expected of it. Information Retrieval has had its TREC competition since 1992; progress has been steady and refugees from the competition have played important roles in the hundred-billion-dollar search industry. Robotics has had the DARPA Grand Challenge for only two years, but in that time we have seen the results go from complete failure to resounding success (although it may have helped that the second year's course was somewhat easier than the first's).

New feature

This book is both a reference for engineers and scientists and a teaching resource, featuring tutorial chapters and research papers on feature extraction.
"This book compiles some very promising techniques, coming from an extremely smart collection of researchers, delivering their best ideas in a competitive environment."
Trevor Hastie, Stanford University
"Feature selection is a key technology for making sense of the high dimensional data. Isabelle Guyon et al. have done a splendid job in designing a challenging competition, and collecting the lessons learned."
Bernhard Schoelkopf, Max Planck Institute
"There has been until now insufficient consideration of feature selection algorithms, no unified presentation of leading methods, and no systematic comparisons. This volume is noteworthy for the breadth of methods covered, the clarity of presentations, the unity in notation and the helpful statistical appendices."
David G. Stork, Ricoh Innovations
"Feature extraction finds application in biotechnology, industrial inspection, the Internet, radar, sonar, and speech recognition. This book will make a difference to the literature on machine learning."
Simon Haykin, McMaster University
"This book sets a high standard as the public record of an interesting and effective competition."
Peter Norvig, Google Inc.




Information provided by: Aladin

Table of Contents

An Introduction to Feature Extraction
- An Introduction to Feature Extraction
- Feature Extraction Fundamentals
- Learning Machines
- Assessment Methods
- Filter Methods
- Search Strategies
- Embedded Methods
- Information-Theoretic Methods
- Ensemble Learning
- Fuzzy Neural Networks
Feature Selection Challenge
- Design and Analysis of the NIPS2003 Challenge
- High Dimensional Classification with Bayesian Neural Networks and Dirichlet Diffusion Trees
- Ensembles of Regularized Least Squares Classifiers for High-Dimensional Problems
- Combining SVMs with Various Feature Selection Strategies
- Feature Selection with Transductive Support Vector Machines
- Variable Selection using Correlation and Single Variable Classifier Methods: Applications
- Tree-Based Ensembles with Dynamic Soft Feature Selection
- Sparse, Flexible and Efficient Modeling using L1 Regularization
- Margin Based Feature Selection and Infogain with Standard Classifiers
- Bayesian Support Vector Machines for Feature Ranking and Selection
- Nonlinear Feature Selection with the Potential Support Vector Machine
- Combining a Filter Method with SVMs
- Feature Selection via Sensitivity Analysis with Direct Kernel PLS
- Information Gain, Correlation and Support Vector Machines
- Mining for Complex Models Comprising Feature Selection and Classification
- Combining Information-Based Supervised and Unsupervised Feature Selection
- An Enhanced Selective Naive Bayes Method with Optimal Discretization
- An Input Variable Importance Definition based on Empirical Data Probability Distribution
New Perspectives in Feature Extraction
- Spectral Dimensionality Reduction
- Constructing Orthogonal Latent Features for Arbitrary Loss
- Large Margin Principles for Feature Selection
- Feature Extraction for Classification of Proteomic Mass Spectra: A Comparative Study
- Sequence Motifs: Highly Predictive Features of Protein Function



New Arrivals in Related Fields

Negro, Alessandro (2026)
Dyer-Witheford, Nick (2026)