OP: kxjs2007

[Download] Finite Mixture and Markov Switching Models ~ Sylvia Frühwirth-Schnatter


#1 (OP)
kxjs2007, posted 2010-5-14 22:12:27

Finite Mixture and Markov Switching Models (Springer Series in Statistics) (Hardcover)
Sylvia Frühwirth-Schnatter (Author)


Description from buecher.de
The prominence of finite mixture modelling is greater than ever. Many important statistical topics, like clustering data, outlier treatment, or dealing with unobserved heterogeneity, involve finite mixture models in some way or other. The area of potential applications goes beyond simple data analysis and extends to regression analysis and to non-linear time series analysis using Markov switching models.
In the more than one hundred years since Karl Pearson showed in 1894 how to estimate the five parameters of a mixture of two normal distributions using the method of moments, statistical inference for finite mixture models has been a challenge to everybody who deals with them. In the past ten years, very powerful computational tools have emerged for dealing with these models, combining a Bayesian approach with recent Monte Carlo simulation techniques based on Markov chains. This book reviews these techniques and covers the most recent advances in the field, among them bridge sampling techniques and reversible jump Markov chain Monte Carlo methods.
It is the first time that the Bayesian perspective on finite mixture modelling is systematically presented in book form. It is argued that the Bayesian approach provides much insight in this context and is easily implemented in practice. Although the main focus is on Bayesian inference, the author reviews several frequentist techniques, especially for selecting the number of components of a finite mixture model, and discusses some of their shortcomings compared to the Bayesian approach.
The aim of this book is to impart the finite mixture and Markov switching approach to statistical modelling to a wide-ranging community. This includes not only statisticians, but also biologists, economists, engineers, financial agents, market researchers, medical researchers, and any other frequent users of statistical models. This book should help newcomers to the field understand how finite mixture and Markov switching models are formulated, what structures they imply on the data, what they could be used for, and how they are estimated. Researchers familiar with the subject will also profit from reading this book. The presentation is rather informal without abandoning mathematical correctness. Previous knowledge of Bayesian inference and Monte Carlo simulation is useful but not required.
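Pearson's classic problem mentioned above — recovering the five parameters (one weight, two means, two variances) of a two-component normal mixture — is easy to sketch today. The snippet below is an illustration only: the book's companion bayesf package is MATLAB, while this is a plain-EM Python sketch with made-up parameter values, not code from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate n draws from a two-component normal mixture with the five
# parameters of Pearson's classic problem: a weight eta, two means,
# and two standard deviations (all values here are made up).
n, eta = 1000, 0.4
mu, sig = np.array([-2.0, 2.0]), np.array([1.0, 0.5])
z = rng.random(n) < eta                       # latent allocations
y = np.where(z, rng.normal(mu[0], sig[0], n), rng.normal(mu[1], sig[1], n))

def em_two_normals(y, iters=200):
    """Plain EM for a two-component univariate normal mixture."""
    w = np.array([0.5, 0.5])
    m = np.array([y.min(), y.max()])          # crude starting values
    s = np.array([y.std(), y.std()])
    for _ in range(iters):
        # E-step: posterior classification probabilities (Bayes' rule);
        # the 1/sqrt(2*pi) constant cancels in the normalization.
        d = w * np.exp(-0.5 * ((y[:, None] - m) / s) ** 2) / s
        tau = d / d.sum(axis=1, keepdims=True)
        # M-step: weighted sample moments
        nk = tau.sum(axis=0)
        w = nk / len(y)
        m = (tau * y[:, None]).sum(axis=0) / nk
        s = np.sqrt((tau * (y[:, None] - m) ** 2).sum(axis=0) / nk)
    return w, m, s

w_hat, m_hat, s_hat = em_two_normals(y)
print(w_hat, m_hat, s_hat)    # close to (0.4, 0.6), (-2, 2), (1, 0.5)
```

With well-separated components and reasonable starting values, EM recovers the simulated parameters; Chapters 2–3 of the book treat the Bayesian (MCMC) counterpart of this estimation problem.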


Editorial Reviews
Review
From the reviews:
"At first glance, the numerous equations and formulas may seem to be daunting for psychologists with limited statistical background; however, the descriptions and explanations of the various models are actually quite reader friendly (more so than many advanced statistical textbooks). The author has done an excellent job of inviting newcomers to enter the world of mixture models, more impressively, the author did so without sacrificing mathematical and statistical rigor. Mixture models are appealing in many applications in social and psychological studies. This book not only offers a gentle introduction to mixture models but also provides more in depth coverage for those who look beyond the surface. I believe that psychologists who are interested in related models (e.g., latent class models, latent Markov models, and latent class regression models) will benefit greatly from this book. I highly recommend this book to all psychologists who are interested in mixture models." (Hsiu-Ting Yu, PSYCHOMETRIKA—VOL. 74, NO. 3, 559–560 SEPTEMBER 2009)
"The book is impressive in its mathematical and formal correctness, in generality and in details. … [I]t would be helpful as an additional reference among a wider range of available textbooks in the area. [I]t will find many friends among experts and newcomers to the world of mixture models." (Atanu Biswas, Biometrics, Issue 63, September 2007)
"Finite mixture distributions are important for many models. Therefore they constitute a very active field of research. This book gives an up to date overview over the various models of this kind. … The aim of this book is to impart the finite mixture and Markov switching approach to statistical modeling to a wide-ranging community. … For the frequentists, it offers a good opportunity to explore the advantages of the Bayesian approach in the context of mixing models." (Gheorghe Pitis, Zentralblatt MATH, Vol. 1108 (10), 2007)
"Readership: Statisticians, biologists, economists, engineers, financial agents, market researchers, medical researchers or any other frequent user of statistical models. The first nine chapters of the book are concerned with static mixture models, and the last four with Markov switching models. … especially valuable for students, serving to demonstrate how different statistical techniques, which superficially appear to be unrelated, are in fact part of an integrated whole. This book struck me as being particularly clearly written – it is a pleasure to read." (David J. Hand, International Statistical Review, Vol. 75 (2), 2007)
"The book is excellent, giving a most readable overview of the topic of finite mixtures, aimed at a broad readership … . Students will like the text because of the pedagogical writing style; researchers will definitely welcome the broad treatment of the subject. Both will benefit from the extensive and up-to-date bibliography … as well as the well-organized index. No doubt, this book is a valuable addition to the field of statistics and will surely find its rightful place in many a statistician’s library." (Valerie Chavez-Demoulin, Journal of the American Statistical Association, Vol. 104 (485), March, 2009)


Product Description
WINNER OF THE 2007 DEGROOT PRIZE!





Product Details
  • Hardcover: 492 pages
  • Publisher: Springer; 1 edition (August 8, 2006)
  • Language: English
  • ISBN-10: 0387329099
  • ISBN-13: 978-0387329093




Contents

1 Finite Mixture Modeling 1
2 Statistical Inference for a Finite Mixture Model with Known Number of Components 25
3 Practical Bayesian Inference for a Finite Mixture Model with Known Number of Components 57
4 Statistical Inference for Finite Mixture Models Under Model Specification Uncertainty 99
5 Computational Tools for Bayesian Inference for Finite Mixture Models Under Model Specification Uncertainty 125
6 Finite Mixture Models with Normal Components 169
7 Data Analysis Based on Finite Mixtures 203
8 Finite Mixtures of Regression Models 241
9 Finite Mixture Models with Nonnormal Components 277
10 Finite Markov Mixture Modeling 301
11 Statistical Inference for Markov Switching Models 319
12 Nonlinear Time Series Analysis Based on Markov Switching Models 357
13 Switching State Space Models 389
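Chapters 10–13 turn to Markov mixture (Markov switching) models, where the latent allocations follow a hidden Markov chain instead of being i.i.d. A minimal Python sketch of such a process (the transition matrix and component parameters are illustrative, not taken from the book):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-state hidden Markov chain with Gaussian components; the
# transition matrix and component parameters are illustrative only.
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])                 # rows sum to one
mu, sig = np.array([0.0, 3.0]), np.array([1.0, 1.0])

# Simulate the hidden state path and the observed process
T = 5000
states = np.empty(T, dtype=int)
states[0] = 0
for t in range(1, T):
    states[t] = rng.choice(2, p=P[states[t - 1]])
y = rng.normal(mu[states], sig[states])      # Markov mixture of normals

# Ergodic (stationary) distribution: left eigenvector of P for
# eigenvalue 1; it plays the role of the mixture weights.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()
print(pi)                                    # [2/3, 1/3] for this P
```

Because the chain is persistent, the observed series shows regime-like runs; in the limit where the rows of P become identical, the process reduces to the standard finite mixture, as Section 10.2.6 discusses.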

Keywords: switching Mixture models Sylvia Markov models Markov switching finite Mixture

Cover.PNG (493.71 KB)

Finite Mixture and Markov Switching Models

book_matlab_version_2.0.pdf
Download link: https://bbs.pinggu.org/a-636553.html

907.92 KB

Code Book

bayesf_version_1.0.zip

181.28 KB

A manual describing the use of this package is included in v1.0.

This attachment includes:

  • start.m
  • start_book.m
  • SIMUNI.M
  • warn.m
  • simuniform.m
  • matlab12.m
  • marstat.m
  • matlab15.m
  • Countex.m
  • matlab11.m
  • matlab14.m
  • matlab95.m
  • matlab21.m
  • matlab51.m
  • matlab63.m
  • matlab22.m
  • statecount.m
  • moment_mix_poisson.m
  • mcmcstore.m
  • Istation.m
  • plot_biv_normal.m
  • Dirichlog_eye.m
  • Dirichlog_mbclust.m
  • plotsub.m
  • ranwi_eye.m
  • fish_data.m
  • matlab43.m
  • matlab13.m
  • eye_dat.mat
  • Likeli_normal_old.m
  • logprior_mixpoi.m
  • boxplotvar_old.m
  • mixturecdweight.m
  • Plotac.m
  • Plotac_discrete.m
  • Plotconverge.m
  • Likeli_normal.m
  • QINCOL.M
  • matlab_univortrag.m
  • boxplotvar.m
  • compute_em_poi.m
  • matlab_WII.m
  • mixturemomentsnor.m
  • mychol.m
  • plotclass.m
  • mcmcregression.m
  • plotdichte.m
  • Histneu.m
  • Prodgamlog.m
  • qincolmult.m
  • matlab35.m
  • lamb_dat.mat
  • Likeli_poisson.m
  • star_clust_dat.m
  • Raninvwi_neu.m
  • Invwisim.m
  • logprior_mixpoi_hpmarg.m
  • likeli_normult.m
  • mcmc_mix_poi_mean.m
  • matlab64.m
  • make_contan_neu.m
  • matlabxx.m
  • Likeli_poisson_old.m
  • matlab115.m
  • plot_point_process.m
  • qinmatr.M
  • matlab65.m
  • Simstate.m
  • pmultnormlog.m
  • qinmatrmult.M
  • Simstate_student.m
  • mcmc_sim_sst.m
  • mixturemar.m
  • autocovemp.m
  • matlab64all.m
  • matlab_poireg.m
  • mixpoiprior_old.m
  • Pwilog.m
  • Dirichsim.m
  • mcmc_sim_eta.m
  • Pinvwilog_neu.m
  • matlab96.m
  • matlab65all.m
  • Autocov_alt.m
  • Dirichpdflog.m
  • multimix_ipprior.m
  • mcmcmargmom.m
  • mcmcpredsam.m
  • designpoints.m
  • multimix_cdprior.m
  • mcmcbfplot.m
  • mixturepdf.m
  • gdp_us_dat.mat
  • mcmcpredmom.m
  • multimix_cdpost.m
  • multimix_ippost.m
  • prioreval.m
  • mixpoiprior.m
  • mcmcsubseq_alt.m
  • star_clust_dat.mat
  • mcmcput.m
  • matlab91.m
  • moments_test.m
  • mixturecdpar.m
  • contmix.m
  • proddirichpdflog.m
  • autocovneu.m
  • Prodgamsim.m
  • normalpdflog.m
  • mcmcestimate.m
  • Prodgampdflog.m
  • Prodinvgampdflog.m
  • prodnorpdflog.m
  • mixturediag.m
  • plotpred.m
  • matlab93.m
  • matlab81.m
  • mlcf_multimix.m
  • matlab97.m
  • Plottheta.m
  • matlab92.m
  • mcmcic.m
  • matlab61_old.m
  • surfcontmix.m
  • matlab102.m
  • matlab82.m
  • matlab111_noperm.m
  • matlab94.m
  • matlab111_ergodic.m
  • mixpoissonbf.m
  • likelihoodeval.m
  • matlab111.m
  • matlab98.m
  • fullperm.m
  • prodnormultsim.m
  • mcmcclustplot.m
  • matlab83.m
  • normultsim.m
  • simulatestart.m
  • mc.m
  • mlip_multimix.m
  • compute_posteriormode.m
  • mcmcextract.m
  • mcmcaverage.m
  • multimix_start.m
  • mcmcclust.m
  • simstate_ms_old.m
  • matlab103.m
  • compute_prior.m
  • dataget.m
  • matlab61.m
  • simstate_ms.m
  • datamoments.m
  • matlab62.m
  • Iris_data.dat
  • designar.m
  • mixnorprior_old.m
  • mixtureplot.m
  • matlab116.m
  • matlab101.m
  • mcmcsubseq.m
  • posteriorlog.m
  • dataplot.m
  • checkprior.m
  • moments.m
  • simstate_msmult.m
  • polio.html
  • mcmcpreddens.m
  • matlab114.m
  • mixnorprior.m
  • matlab113.m
  • matlab112.m
  • Mlbsall.m
  • mcmc_nor_musig.m
  • dataclass.m
  • mcmcstart.m
  • mcmcpermute.m
  • mixturepoint.m
  • GNP.DAT
  • run_mixture.m
  • simulate.m
  • mcmcsamrep.m
  • posterior.m
  • mcmcdiag.m
  • mcmcplot.m
  • prodnormultpdflog.m
  • polio.mat
  • compute_mixture_old.m
  • prodinvwipdflog.m
  • compute_mixture.m
  • mixnorbf.m
  • mixnorbf_new.m
  • priordefine.m
  • mcmcbf.m
  • highlight.m
  • mixturemcmc.m
  • matlab84.m
  • stabdiagramm.m
  • plotclasscross.m

bayesf_version_2.0.zip

167.21 KB

A manual describing the use of this package is included in v2.0.

This attachment includes:

  • mixturemomentsnor.m
  • iris_data.dat
  • pwilog.m
  • lamb_dat.mat
  • fish_data.m
  • eye_dat.mat
  • qincol.m
  • countex.m
  • simuniform.m
  • simuni.m
  • GNP.DAT
  • dataget.m
  • start_lamb.m
  • start_gdp.m
  • start_fabricfault_negbin.m
  • start_eye.m
  • mcmcplot.m
  • demo_mix_student_Kunknown.m
  • demo_msar_reg_mixeffects.m
  • demo_msar_reg.m
  • demo_msreg_mixeffects.m
  • demo_msreg.m
  • demo_regression_mix_binomial.m
  • demo_mix_binomial.m
  • demo_mix_student.m
  • demo_mix_normal.m
  • mcmcbf.m
  • mixturemcmc.m
  • mcmcstart.m
  • start_gdp_swi.m
  • demo_mixreg_mixeffects.m
  • demo_mixreg.m
  • start_fabricfault_mixed_effects.m
  • dataplot.m
  • start_iris.m
  • start_fishery.m
  • demo_mix_multivariate_student_Kunknown.m
  • demo_mix_multivariate_student.m
  • demo_mix_exponential.m
  • demo_mix_multivariate_normal_Kunknown.m
  • demo_mix_multivariate_normal.m
  • demo_mix_normal_Kunknown.m
  • start_fishery_K4.m
  • start_iris_K3.m
  • start_gdp_marmix.m
  • start_fabricfault.m
  • dataclass.m
  • demo_poisson_mix_reg_mixed_effects.m
  • simulate.m
  • priordefine.m
  • demo_regression_negbin.m
  • mcmcic.m
  • posterior.m
  • mcmcpermute.m
  • mixnorprior.m
  • mcmcestimate.m
  • demo_mix_normal_mu0_Kunknown.m
  • start_fishery_plot.m
  • demo_figure2_1.m
  • mcmcaverage.m
  • mcmcextract.m
  • mcmcclustplot.m
  • mcmcdiag.m
  • plotac_discrete.m
  • datamoments.m
  • demo_msm_multivariate_normal_Kunknown.m
  • plotac.m
  • mixstudprior.m
  • skewn_transform.m
  • likeli_skewstudmult.m
  • mcmcsamrep.m
  • demo_multivariate_skewnormal.m
  • ranpermute.m
  • mcmcput.m
  • mcmcsubseq.m
  • moments.m
  • mcmc_student_df.m
  • mcmc_negbin_df.m
  • stabdiagramm.m
  • prodstudmultpdflog.m
  • plotclasscross.m
  • likeli_multinomial.m
  • mcmcbfplot.m
  • demo_multinomial.m
  • mixglmprior.m
  • prioreval.m
  • mixturepdf.m
  • mixtureplot.m
  • mixturemar.m
  • momentest.m
  • start_eye_plot.m
  • autocov.m
  • mixturepoint.m
  • demo_figure2_2.m
  • mcmcclustsim.m
  • mcmcclust.m
  • mcmcpm.m
  • mc.m
  • invwisim.m
  • statecount.m
  • histneu.m
  • maketab.m
  • qinmatrmult.m
  • mixtureplot_biv.m
  • contmixskewstud.m
  • likeli_skewstudent.m
  • skewn_parameter.m
  • simulate_truncated_normal.m
  • eval_tcdf_skewt.m
  • likeli_skewstudmult_cd.m
  • demo_negbin.m
  • demo_binomial.m
  • prodnormultpdflog.m
  • prodstudmultsim.m
  • studmultsim.m
  • likeli_negbin.m
  • prodbetapdflog.m
  • likeli_binomial.m
  • likelihoodeval.m
  • prodbetasim.m
  • auxmix_initialize_binomial.m
  • auxmix_binomial.m
  • likeli_poisson.m
  • data_fabric_fault.m
  • auxmix_poisson.m
  • auxmix_initialize_poisson.m
  • likeli_skewnormal.m
  • likeli_stumult.m
  • likeli_student.m
  • contmixskewnormal.m
  • likeli_skewnormult.m
  • likeli_normal.m
  • plotclass.m
  • contmix.m
  • marginallikelihood_eval.m
  • contmixstud.m
  • likeli_expon.m
  • likeli_normult.m
  • mcmcpredmom.m
  • mixexpprior.m
  • bridgesampling_se.m
  • chib_se.m
  • mcmcpreddens.m
  • designar.m
  • demo_poisson_mix_reg.m
  • demo_msar.m
  • warn.m
  • mcmcstore.m
  • mcmcmargmom.m
  • compute_mixture.m
  • normultsim.m
  • simstate_ms.m
  • boxplotvar.m
  • proddirichpdflog.m
  • marstat.m
  • demo_poisson_reg_mixed_effects.m
  • demo_poisson_mix_reg_Kfix.m
  • prodnormultsim.m
  • normalpdflog.m
  • prodinvwipdflog.m
  • prodinvgampdflog.m
  • prodgamsim.m
  • dirichsim.m
  • prodnorpdflog.m
  • prodgampdflog.m
  • dirichpdflog.m
  • prodgamlog.m
  • qincolmult.m
  • logprior_mixpoi_hpmarg.m
  • qinmatr.m
  • plotsub.m

errata.pdf

59.79 KB

Errata

Finite Mixture and Markov Switching Models.pdf

5.63 MB

Requires: 1 forum coin  [Buy]

Book


Strive for happiness!

#2
gssdzc (unverified trading user, employment verified), posted 2010-5-14 22:16:07
Thanks a lot!

#3
kxjs2007 (unverified trading user), posted 2010-5-15 01:05:42

Contents

1 Finite Mixture Modeling 1

1.1 Introduction 1

1.2 Finite Mixture Distributions 3

1.2.1 Basic Definitions 3

1.2.2 Some Descriptive Features of Finite Mixture Distributions 5

1.2.3 Diagnosing Similarity of Mixture Components 9

1.2.4 Moments of a Finite Mixture Distribution 10

1.2.5 Statistical Modeling Based on Finite Mixture Distributions 11

1.3 Identifiability of a Finite Mixture Distribution 14

1.3.1 Nonidentifiability Due to Invariance to Relabeling the Components 15

1.3.2 Nonidentifiability Due to Potential Overfitting 17

1.3.3 Formal Identifiability Constraints 19

1.3.4 Generic Identifiability 21

2 Statistical Inference for a Finite Mixture Model with
Known Number of Components 25

2.1 Introduction 25

2.2 Classification for Known Component Parameters 26

2.2.1 Bayes Rule for Classifying a Single Observation 26

2.2.2 The Bayes Classifier for a Whole Data Set 27

2.3 Parameter Estimation for Known Allocation 29

2.3.1 The Complete-Data Likelihood Function 29

2.3.2 Complete-Data Maximum Likelihood Estimation 30

2.3.3 Complete-Data Bayesian Estimation of the Component Parameters 31

2.3.4 Complete-Data Bayesian Estimation of the Weights 35

2.4 Parameter Estimation When the Allocations Are Unknown 41

2.4.1 Method of Moments 42

2.4.2 The Mixture Likelihood Function 43

2.4.3 A Helicopter Tour of the Mixture Likelihood Surface for Two Examples 44

2.4.4 Maximum Likelihood Estimation 49

2.4.5 Bayesian Parameter Estimation 53

2.4.6 Distance-Based Methods 54

2.4.7 Comparing Various Estimation Methods 54


#4
kxjs2007 (unverified trading user), posted 2010-5-23 10:24:24

3 Practical Bayesian Inference for a Finite Mixture Model
with Known Number of Components 57

3.1 Introduction 57

3.2 Choosing the Prior for the Parameters of a Mixture Model 58

3.2.1 Objective and Subjective Priors 58

3.2.2 Improper Priors May Cause Improper Mixture Posteriors 59

3.2.3 Conditionally Conjugate Priors 60

3.2.4 Hierarchical Priors and Partially Proper Priors 61

3.2.5 Other Priors 62

3.2.6 Invariant Prior Distributions 62

3.3 Some Properties of the Mixture Posterior Density 63

3.3.1 Invariance of the Posterior Distribution 63

3.3.2 Invariance of Seemingly Component-Specific Functionals 64

3.3.3 The Marginal Posterior Distribution of the Allocations 65

3.3.4 Invariance of the Posterior Distribution of the Allocations 67

3.4 Classification Without Parameter Estimation 68

3.4.1 Single-Move Gibbs Sampling 69

3.4.2 The Metropolis–Hastings Algorithm 72

3.5 Parameter Estimation Through Data Augmentation and MCMC 73

3.5.1 Treating Mixture Models as a Missing Data Problem 73

3.5.2 Data Augmentation and MCMC for a Mixture of Poisson Distributions 74

3.5.3 Data Augmentation and MCMC for General Mixtures 76

3.5.4 MCMC Sampling Under Improper Priors 78

3.5.5 Label Switching 78

3.5.6 Permutation MCMC Sampling 81

3.6 Other Monte Carlo Methods Useful for Mixture Models 83

3.6.1 A Metropolis–Hastings Algorithm for the Parameters 83

3.6.2 Importance Sampling for the Allocations 84

3.6.3 Perfect Sampling 85

3.7 Bayesian Inference for Finite Mixture Models Using Posterior Draws 85

3.7.1 Sampling Representations of the Mixture Posterior Density 85

3.7.2 Using Posterior Draws for Bayesian Inference 87

3.7.3 Predictive Density Estimation 89

3.7.4 Individual Parameter Inference 91

3.7.5 Inference on the Hyperparameter of a Hierarchical Prior 92

3.7.6 Inference on Component Parameters 92

3.7.7 Model Identification 94


#5
kxjs2007 (unverified trading user), posted 2010-6-8 15:48:46

4 Statistical Inference for Finite Mixture Models Under
Model Specification Uncertainty 99

4.1 Introduction 99

4.2 Parameter Estimation Under Model Specification Uncertainty 100

4.2.1 Maximum Likelihood Estimation Under Model Specification Uncertainty 100

4.2.2 Practical Bayesian Parameter Estimation for Overfitting Finite Mixture Models 103

4.2.3 Potential Overfitting 105

4.3 Informal Methods for Identifying the Number of Components 107

4.3.1 Mode Hunting in the Mixture Posterior 108

4.3.2 Mode Hunting in the Sample Histogram 109

4.3.3 Diagnosing Mixtures Through the Method of Moments 110

4.3.4 Diagnosing Mixtures Through Predictive Methods 112

4.3.5 Further Approaches 114

4.4 Likelihood-Based Methods 114

4.4.1 The Likelihood Ratio Statistic 114

4.4.2 AIC, BIC, and the Schwarz Criterion 116

4.4.3 Further Approaches 117

4.5 Bayesian Inference Under Model Uncertainty 117

4.5.1 Trans-Dimensional Bayesian Inference 117

4.5.2 Marginal Likelihoods 118

4.5.3 Bayes Factors for Model Comparison 119

4.5.4 Formal Bayesian Model Selection 121

4.5.5 Choosing Priors for Model Selection 122

4.5.6 Further Approaches 123

5 Computational Tools for Bayesian Inference for Finite
Mixture Models Under Model Specification Uncertainty 125

5.1 Introduction 125

5.2 Trans-Dimensional Markov Chain Monte Carlo Methods 125

5.2.1 Product-Space MCMC 126

5.2.2 Reversible Jump MCMC 129

5.2.3 Birth and Death MCMC Methods 137

5.3 Marginal Likelihoods for Finite Mixture Models 139

5.3.1 Defining the Marginal Likelihood 139

5.3.2 Choosing Priors for Selecting the Number of Components 141

5.3.3 Computation of the Marginal Likelihood for Mixture Models 143

5.4 Simulation-Based Approximations of the Marginal Likelihood 143

5.4.1 Some Background on Monte Carlo Integration 143

5.4.2 Sampling-Based Approximations for Mixture Models 144

5.4.3 Importance Sampling 146

5.4.4 Reciprocal Importance Sampling 147

5.4.5 Harmonic Mean Estimator 148

5.4.6 Bridge Sampling Technique 150

5.4.7 Comparison of Different Simulation-Based Estimators 154

5.4.8 Dealing with Hierarchical Priors 159

5.5 Approximations to the Marginal Likelihood Based on Density Ratios 159

5.5.1 The Posterior Density Ratio 159

5.5.2 Chib’s Estimator 160

5.5.3 Laplace Approximation 164

5.6 Reversible Jump MCMC Versus Marginal Likelihoods? 165


#6
kxjs2007 (unverified trading user), posted 2010-6-8 15:49:03

6 Finite Mixture Models with Normal Components 169

6.1 Finite Mixtures of Normal Distributions 169

6.1.1 Model Formulation 169

6.1.2 Parameter Estimation for Mixtures of Normals 171

6.1.3 The Kiefer–Wolfowitz Example 174

6.1.4 Applications of Mixture of Normal Distributions 176

6.2 Bayesian Estimation of Univariate Mixtures of Normals 177

6.2.1 Bayesian Inference When the Allocations Are Known 177

6.2.2 Standard Prior Distributions 179

6.2.3 The Influence of the Prior on the Variance Ratio 179

6.2.4 Bayesian Estimation Using MCMC 180

6.2.5 MCMC Estimation Under Standard Improper Priors 182

6.2.6 Introducing Prior Dependence Among the Components 185

6.2.7 Further Sampling-Based Approaches 187

6.2.8 Application to the Fishery Data 188

6.3 Bayesian Estimation of Multivariate Mixtures of Normals 190

6.3.1 Bayesian Inference When the Allocations Are Known 190

6.3.2 Prior Distributions 192

6.3.3 Bayesian Parameter Estimation Using MCMC 193

6.3.4 Application to Fisher’s Iris Data 195

6.4 Further Issues 195

6.4.1 Parsimonious Finite Normal Mixtures 195

6.4.2 Model Selection Problems for Mixtures of Normals 199

7 Data Analysis Based on Finite Mixtures 203

7.1 Model-Based Clustering 203

7.1.1 Some Background on Cluster Analysis 203

7.1.2 Model-Based Clustering Using Finite Mixture Models 204

7.1.3 The Classification Likelihood and the Bayesian MAP Approach 207

7.1.4 Choosing Clustering Criteria and the Number of Components 210

7.1.5 Model Choice for the Fishery Data 216

7.1.6 Model Choice for Fisher’s Iris Data 218

7.1.7 Bayesian Clustering Based on Loss Functions 220

7.1.8 Clustering for Fisher’s Iris Data 224

7.2 Outlier Modeling 224

7.2.1 Outlier Modeling Using Finite Mixtures 224

7.2.2 Bayesian Inference for Outlier Models Based on Finite Mixtures 225

7.2.3 Outlier Modeling of Darwin’s Data 226

7.2.4 Clustering Under Outliers and Noise 227

7.3 Robust Finite Mixtures Based on the Student-t Distribution 230

7.3.1 Parameter Estimation 230

7.3.2 Dealing with Unknown Number of Components 233

7.4 Further Issues 233

7.4.1 Clustering High-Dimensional Data 233

7.4.2 Discriminant Analysis 235

7.4.3 Combining Classified and Unclassified Observations 236

7.4.4 Density Estimation Using Finite Mixtures 237

7.4.5 Finite Mixtures as an Auxiliary Computational Tool in Bayesian Analysis 238


#7
kxjs2007 (unverified trading user), posted 2010-6-8 15:49:38

8 Finite Mixtures of Regression Models 241

8.1 Introduction 241

8.2 Finite Mixture of Multiple Regression Models 242

8.2.1 Model Definition 242

8.2.2 Identifiability 243

8.2.3 Statistical Modeling Based on Finite Mixture of Regression Models 246

8.2.4 Outliers in a Regression Model 249

8.3 Statistical Inference for Finite Mixtures of Multiple Regression Models 249

8.3.1 Maximum Likelihood Estimation 249

8.3.2 Bayesian Inference When the Allocations Are Known 250

8.3.3 Choosing Prior Distributions 252

8.3.4 Bayesian Inference When the Allocations Are Unknown 253

8.3.5 Bayesian Inference Using Posterior Draws 254

8.3.6 Dealing with Model Specification Uncertainty 255

8.4 Mixed-Effects Finite Mixtures of Regression Models 256

8.4.1 Model Definition 256

8.4.2 Choosing Priors for Bayesian Estimation 256

8.4.3 Bayesian Parameter Estimation When the Allocations Are Known 257

8.4.4 Bayesian Parameter Estimation When the Allocations Are Unknown 258

8.5 Finite Mixture Models for Repeated Measurements 259

8.5.1 Pooling Information Across Similar Units 260

8.5.2 Finite Mixtures of Random-Effects Models 260

8.5.3 Choosing the Prior for Bayesian Estimation 265

8.5.4 Bayesian Parameter Estimation When the Allocations Are Known 265

8.5.5 Practical Bayesian Estimation Using MCMC 267

8.5.6 Dealing with Model Specification Uncertainty 269

8.5.7 Application to the Marketing Data 270

8.6 Further Issues 273

8.6.1 Regression Modeling Based on Multivariate Mixtures of Normals 273

8.6.2 Modeling the Weight Distribution 274

8.6.3 Mixtures-of-Experts Models 274

9 Finite Mixture Models with Nonnormal Components 277

9.1 Finite Mixtures of Exponential Distributions 277

9.1.1 Model Formulation and Parameter Estimation 277

9.1.2 Bayesian Inference 278

9.2 Finite Mixtures of Poisson Distributions 279

9.2.1 Model Formulation and Estimation 279

9.2.2 Capturing Overdispersion in Count Data 280

9.2.3 Modeling Excess Zeros 282

9.2.4 Application to the Eye Tracking Data 283

9.3 Finite Mixture Models for Binary and Categorical Data 286

9.3.1 Finite Mixtures of Binomial Distributions 286

9.3.2 Finite Mixtures of Multinomial Distributions 288

9.4 Finite Mixtures of Generalized Linear Models 289

9.4.1 Finite Mixture Regression Models for Count Data 290

9.4.2 Finite Mixtures of Logit and Probit Regression Models 292

9.4.3 Parameter Estimation for Finite Mixtures of GLMs 293

9.4.4 Model Selection for Finite Mixtures of GLMs 294

9.5 Finite Mixture Models for Multivariate Binary and Categorical Data 294

9.5.1 The Basic Latent Class Model 295

9.5.2 Identification and Parameter Estimation 296

9.5.3 Extensions of the Basic Latent Class Model 297

9.6 Further Issues 298

9.6.1 Finite Mixture Modeling of Mixed-Mode Data 298

9.6.2 Finite Mixtures of GLMs with Random Effects 299


#8
kxjs2007 (unverified trading user), posted 2010-6-8 15:50:25

10 Finite Markov Mixture Modeling 301

10.1 Introduction 301

10.2 Finite Markov Mixture Distributions 301

10.2.1 Basic Definitions 302

10.2.2 Irreducible Aperiodic Markov Chains 304

10.2.3 Moments of a Markov Mixture Distribution 308

10.2.4 The Autocorrelation Function of a Process Generated by a Markov Mixture Distribution 310

10.2.5 The Autocorrelation Function of the Squared Process 311

10.2.6 The Standard Finite Mixture Distribution as a Limiting Case 312

10.2.7 Identifiability of a Finite Markov Mixture Distribution 313

10.3 Statistical Modeling Based on Finite Markov Mixture Distributions 314

10.3.1 The Basic Markov Switching Model 314

10.3.2 The Markov Switching Regression Model 315

10.3.3 Nonergodic Markov Chains 316

10.3.4 Relaxing the Assumptions of the Basic Markov Switching Model 316

11 Statistical Inference for Markov Switching Models 319

11.1 Introduction 319

11.2 State Estimation for Known Parameters 319

11.2.1 Statistical Inference About the States 320

11.2.2 Filtered State Probabilities 320

11.2.3 Filtering for Special Cases 323

11.2.4 Smoothing the States 324

11.2.5 Filtering and Smoothing for More General Models 326

11.3 Parameter Estimation for Known States 327

11.3.1 The Complete-Data Likelihood Function 327

11.3.2 Complete-Data Bayesian Parameter Estimation 329

11.3.3 Complete-Data Bayesian Estimation of the Transition Matrix 329

11.4 Parameter Estimation When the States Are Unknown 330

11.4.1 The Markov Mixture Likelihood Function 330

11.4.2 Maximum Likelihood Estimation 333

11.4.3 Bayesian Estimation 334

11.4.4 Alternative Estimation Methods 334

11.5 Bayesian Parameter Estimation with Known Number of States 335

11.5.1 Choosing the Prior for the Parameters of a Markov Mixture Model 335

11.5.2 Some Properties of the Posterior Distribution of a Markov Switching Model 336

11.5.3 Parameter Estimation Through Data Augmentation and MCMC 337

11.5.4 Permutation MCMC Sampling 340

11.5.5 Sampling the Unknown Transition Matrix 340

11.5.6 Sampling Posterior Paths of the Hidden Markov Chain 342

11.5.7 Other Sampling-Based Approaches 345

11.5.8 Bayesian Inference Using Posterior Draws 345

11.6 Statistical Inference Under Model Specification Uncertainty 346

11.6.1 Diagnosing Markov Switching Models 346

11.6.2 Likelihood-Based Methods 346

11.6.3 Marginal Likelihoods for Markov Switching Models 347

11.6.4 Model Space MCMC 348

11.6.5 Further Issues 348

11.7 Modeling Overdispersion and Autocorrelation in Time Series of Count Data 348

11.7.1 Motivating Example 348

11.7.2 Capturing Overdispersion and Autocorrelation Using Poisson Markov Mixture Models 349

11.7.3 Application to the Lamb Data 351

#9
kxjs2007 (unverified trading user), posted 2010-6-8 15:50:49

12 Nonlinear Time Series Analysis Based on Markov
Switching Models 357

12.1 Introduction 357

12.2 The Markov Switching Autoregressive Model 358

12.2.1 Motivating Example 358

12.2.2 Model Definition 360

12.2.3 Features of the MSAR Model 362

12.2.4 Markov Switching Models for Nonstationary Time Series 363

12.2.5 Parameter Estimation and Model Selection 365

12.2.6 Application to Business Cycle Analysis of the U.S. GDP Data 365

12.3 Markov Switching Dynamic Regression Models 371

12.3.1 Model Definition 371

12.3.2 Bayesian Estimation 371

12.4 Prediction of Time Series Based on Markov Switching Models 372

12.4.1 Flexible Predictive Distributions 372

12.4.2 Forecasting of Markov Switching Models via Sampling-Based Methods 374

12.5 Markov Switching Conditional Heteroscedasticity 375

12.5.1 Motivating Example 375

12.5.2 Capturing Features of Financial Time Series Through Markov Switching Models 377

12.5.3 Switching ARCH Models 378

12.5.4 Statistical Inference for Switching ARCH Models 380

12.5.5 Switching GARCH Models 383

12.6 Some Extensions 384

12.6.1 Time-Varying Transition Matrices 384

12.6.2 Markov Switching Models for Longitudinal and Panel Data 385

12.6.3 Markov Switching Models for Multivariate Time Series 386

13 Switching State Space Models 389

13.1 State Space Modeling 389

13.1.1 The Local Level Model with and Without Switching 389

13.1.2 The Linear Gaussian State Space Form 391

13.1.3 Multiprocess Models 393

13.1.4 Switching Linear Gaussian State Space Models 393

13.1.5 The General State Space Form 394

13.2 Nonlinear Time Series Analysis Based on Switching State Space Models 396

13.2.1 ARMA Models with and Without Switching 396

13.2.2 Unobserved Component Time Series Models 397

13.2.3 Capturing Sudden Changes in Time Series 398

13.2.4 Switching Dynamic Factor Models 400

13.2.5 Switching State Space Models as a Semi-Parametric Smoothing Device 401

13.3 Filtering for Switching Linear Gaussian State Space Models 401

13.3.1 The Filtering Problem 402

13.3.2 Bayesian Inference for a General Linear Regression Model 402

13.3.3 Filtering for the Linear Gaussian State Space Model 404

13.3.4 Filtering for Multiprocess Models 406

13.3.5 Approximate Filtering for Switching Linear Gaussian State Space Models 406

13.4 Parameter Estimation for Switching State Space Models 410

13.4.1 The Likelihood Function of a State Space Model 411

13.4.2 Maximum Likelihood Estimation 412

13.4.3 Bayesian Inference 412

13.5 Practical Bayesian Estimation Using MCMC 415

13.5.1 Various Data Augmentation Schemes 416

13.5.2 Sampling the Continuous State Process from the Smoother Density 417

13.5.3 Sampling the Discrete States for a Switching State Space Model 420

13.6 Further Issues 421

13.6.1 Model Specification Uncertainty in Switching State Space Modeling 421

13.6.2 Auxiliary Mixture Sampling for Nonlinear and Nonnormal State Space Models 422

13.7 Illustrative Application to Modeling Exchange Rate Data 423


#10
kxjs2007 (unverified trading user), posted 2010-6-8 15:51:26

A Appendix 431

A.1 Summary of Probability Distributions 431

A.1.1 The Beta Distribution 431

A.1.2 The Binomial Distribution 432

A.1.3 The Dirichlet Distribution 432

A.1.4 The Exponential Distribution 433

A.1.5 The F-Distribution 433

A.1.6 The Gamma Distribution 434

A.1.7 The Geometric Distribution 435

A.1.8 The Multinomial Distribution 435

A.1.9 The Negative Binomial Distribution 435

A.1.10 The Normal Distribution 436

A.1.11 The Poisson Distribution 437

A.1.12 The Student-t Distribution 437

A.1.13 The Uniform Distribution 438

A.1.14 The Wishart Distribution 438

A.2 Software 439

References 441

Index 481
