Modern Multivariate Statistical Techniques: Regression, Classification, and Manifold Learning (Springer Texts in Statistics)
by: Alan Julian Izenman
Publisher: Springer
Number Of Pages: 734
ISBN-10 / ASIN: 0387781889
ISBN-13 / EAN: 9780387781884
Published: 2008
Product Description:
Remarkable advances in computation and data storage and the ready availability of huge data sets have been the keys to the growth of the new disciplines of data mining and machine learning, while the enormous success of the Human Genome Project has opened up the field of bioinformatics.
These exciting developments, which led to the introduction of many innovative statistical tools for high-dimensional data analysis, are described here in detail. The author takes a broad perspective; for the first time in a book on multivariate analysis, nonlinear methods are discussed in detail as well as linear methods. Techniques covered range from traditional multivariate methods, such as multiple regression, principal components, canonical variates, linear discriminant analysis, factor analysis, clustering, multidimensional scaling, and correspondence analysis, to the newer methods of density estimation, projection pursuit, neural networks, multivariate reduced-rank regression, nonlinear manifold learning, bagging, boosting, random forests, independent component analysis, support vector machines, and classification and regression trees. Another unique feature of this book is the discussion of database management systems.
This book is appropriate for advanced undergraduate students, graduate students, and researchers in statistics, computer science, artificial intelligence, psychology, cognitive sciences, business, medicine, bioinformatics, and engineering. Familiarity with multivariable calculus, linear algebra, and probability and statistics is required. The book presents a carefully integrated mixture of theory and applications, and of classical and modern multivariate statistical techniques, including Bayesian methods. There are over 60 interesting data sets used as examples in the book, over 200 exercises, and many color illustrations and photographs.
Summary: A great step forward in the way we look at multivariate data
Rating: 5
This book surprised me. I was expecting a book filled with a discussion of mostly traditional multivariate techniques supplemented by a few chapters of more recent developments. Instead, I found a completely new and refreshing approach to statistics and data exploration that framed the classical regression approach to most issues as a special, limiting case of a broader view of data exploration and analysis.
Sections on random vectors and matrices, nonparametric density estimation, tree methods, artificial neural networks, support vector machines, random forests, bagging and boosting, latent variables, manifold learning, and other topics are discussed and explored in adequate depth for an introductory text. The book assumes you know matrix algebra and have had some exposure to probability distributions and common multivariate methods, but it extends the discussion into areas that are usually covered only in separate advanced texts and research papers.
The book is a little light on Bayesian methods, but some compromises had to be made given the sheer breadth of new material covered. I especially liked the broad array of examples from genetics, medicine, physics, and other application areas, and the nice color graphs where needed. The references to Matlab, R, S-Plus, and other standard math packages were much appreciated, although I would have liked Mathematica to have been included as well.
Overall, this is a wonderful survey of a wide range of multivariate techniques and methods. I hope it gets incorporated into graduate and undergraduate college courses.