Feature Extraction: Foundations and Applications

Isabelle Guyon, Steve Gunn, Masoud Nikravesh, Lotfi A. Zadeh
Springer Science & Business Media, 20/07/2006 - 778 pages
Everyone loves a good competition. As I write this, two billion fans are eagerly anticipating the 2006 World Cup. Meanwhile, a fan base that is somewhat smaller (but presumably includes you, dear reader) is equally eager to read all about the results of the NIPS 2003 Feature Selection Challenge, contained herein. Fans of Radford Neal and Jianguo Zhang (or of Bayesian neural networks and Dirichlet diffusion trees) are gloating “I told you so” and looking for proof that their win was not a fluke. But the matter is by no means settled, and fans of SVMs are shouting “wait ’til next year!” You know this book is a bit more edgy than your standard academic treatise as soon as you see the dedication: “To our friends and foes.”

Competition breeds improvement. Fifty years ago, the champion in 100m butterfly swimming was 22 percent slower than today’s champion; the women’s marathon champion from just 30 years ago was 26 percent slower. Who knows how much better our machine learning algorithms would be today if Turing in 1950 had proposed an effective competition rather than his elusive Test?

But what makes an effective competition? The field of Speech Recognition has had NIST-run competitions since 1988; error rates have been reduced by a factor of three or more, but the field has not yet had the impact expected of it. Information Retrieval has had its TREC competition since 1992; progress has been steady and refugees from the competition have played important roles in the hundred-billion-dollar search industry. Robotics has had the DARPA Grand Challenge for only two years, but in that time we have seen the results go from complete failure to resounding success (although it may have helped that the second year’s course was somewhat easier than the first’s).
 

Contents

An Introduction to Feature Extraction  1
References  22
References  58
Assessment Methods  65
Filter Methods  89
References  114
References  135
References  162
References  182
Ensemble Learning  187
References  203
References  231
References  260
References  295
Ensembles of Regularized Least Squares Classifiers  297
References  313
Combining SVMs with Various Feature Selection  315
Variable Selection using Correlation and Single Variable  342
References  357
Tree-Based Ensembles with Dynamic Soft Feature  359
References  374
Sparse Flexible and Efficient Modeling  375
References  393
Margin Based Feature Selection and Infogain  395
Nonlinear Feature Selection with the Potential Support  419
Combining a Filter Method with SVMs  439
References  445
References  461
Information Gain Correlation and Support Vector  463
References  470
References  487
Combining Information-Based Supervised  489
An Input Variable Importance Definition  509
References  547
Constructing Orthogonal Latent Features  551
References  582
References  604
Highly Predictive Features  625
Elementary Statistics  649
Confidence Intervals  655
References  662
ARCENE  669
GISETTE  677
DOROTHEA  687
MATLAB Code of the Lambda Method  697
High Dimensional Classification with Bayesian Neural  707
Krzysztof Grabczewski, Norbert Jankowski  735
Lemaire, F. Clérot  743
Index  771