Features
- Supplies the basics for readers unfamiliar with machine learning and pattern recognition
- Introduces the use of ensemble methods in computer vision, computer security, medical imaging, and famous data mining competitions, such as the KDD-Cup and Netflix Prize
- Presents the theoretical foundations and extensions of many ensemble methods, including Boosting, Bagging, Random Trees, and Stacking
- Covers nearly all aspects of ensemble techniques such as combination methods and diversity generation methods
- Highlights future research directions
- Provides additional reading sections in each chapter and references at the back of the book
Summary
An up-to-date, self-contained introduction to a state-of-the-art machine learning approach, Ensemble Methods: Foundations and Algorithms shows how these accurate methods are used in real-world tasks. It gives you the necessary groundwork to carry out further research in this evolving field.
After presenting background and terminology, the book covers the main algorithms and theories, including Boosting, Bagging, Random Forest, averaging and voting schemes, the Stacking method, mixture of experts, and diversity measures. It also discusses multiclass extension, noise tolerance, error-ambiguity and bias-variance decompositions, and recent progress in information theoretic diversity.
Moving on to more advanced topics, the author explains how to achieve better performance through ensemble pruning and how to generate better clustering results by combining multiple clusterings. In addition, he describes developments of ensemble methods in semi-supervised learning, active learning, cost-sensitive learning, class-imbalance learning, and comprehensibility enhancement.
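To make the core idea concrete, here is a minimal sketch of Bagging with majority voting, two of the techniques the book covers. It is an illustration only, not the book's own code: the decision-stump base learner, the toy 1-D dataset, and all function names (`train_stump`, `bagging`) are assumptions chosen for brevity.

```python
import random
from collections import Counter

def train_stump(sample):
    """Fit a 1-D threshold stump by minimizing training error on the sample.

    sample: list of (x, y) pairs with labels y in {-1, +1}.
    """
    best = None
    xs = sorted(x for x, _ in sample)
    thresholds = [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    for t in thresholds:
        for sign in (1, -1):
            err = sum(1 for x, y in sample
                      if sign * (1 if x > t else -1) != y)
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda x: sign * (1 if x > t else -1)

def bagging(data, n_learners=11, seed=0):
    """Train n_learners stumps on bootstrap samples; combine by plurality vote."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_learners):
        # Bootstrap sample: draw with replacement, same size as the data.
        sample = [rng.choice(data) for _ in data]
        stumps.append(train_stump(sample))
    # The ensemble predicts the majority vote of its base learners.
    return lambda x: Counter(s(x) for s in stumps).most_common(1)[0][0]

# Toy separable data: negative class below 0, positive class above.
data = [(-3, -1), (-2, -1), (-1, -1), (1, 1), (2, 1), (3, 1)]
predict = bagging(data)
```

The bootstrap resampling perturbs the training set so that the base learners differ, and the plurality vote averages out their individual errors, which is the diversity-plus-combination recipe the book's Bagging and Combination Methods chapters develop in full.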
Table of Contents
Introduction
Basic Concepts
Popular Learning Algorithms
Evaluation and Comparison
Ensemble Methods
Applications of Ensemble Methods
Boosting
A General Boosting Procedure
The AdaBoost Algorithm
Illustrative Examples
Theoretical Issues
Multiclass Extension
Noise Tolerance
Bagging
Two Ensemble Paradigms
The Bagging Algorithm
Illustrative Examples
Theoretical Issues
Random Tree Ensembles
Combination Methods
Benefits of Combination
Averaging
Voting
Combining by Learning
Other Combination Methods
Relevant Methods
Diversity
Ensemble Diversity
Error Decomposition
Diversity Measures
Information Theoretic Diversity
Diversity Generation
Ensemble Pruning
What Is Ensemble Pruning
Many Could Be Better Than All
Categorization of Pruning Methods
Ordering-Based Pruning
Clustering-Based Pruning
Optimization-Based Pruning
Clustering Ensembles
Clustering
Categorization of Clustering Ensemble Methods
Similarity-Based Methods
Graph-Based Methods
Relabeling-Based Methods
Transformation-Based Methods
Advanced Topics
Semi-Supervised Learning
Active Learning
Cost-Sensitive Learning
Class-Imbalance Learning
Improving Comprehensibility
Future Directions of Ensembles
References
Index
Further Readings appear at the end of each chapter.