Thread starter: kxjs2007
5098 views · 11 replies

[Download] Gaussian Processes for Machine Learning ~ Carl Edward Rasmussen, 2006


Gaussian Processes for Machine Learning (Adaptive Computation and Machine Learning) (Hardcover)
Carl Edward Rasmussen(Author), Christopher K. I. Williams (Author)

Editorial Reviews
Product Description
Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning. The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics.

The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed both from a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support-vector machines, neural networks, splines, regularization networks, relevance vector machines and others. Theoretical issues including learning curves and the PAC-Bayesian framework are treated, and several approximation methods for learning with large datasets are discussed. The book contains illustrative examples and exercises, and code and datasets are available on the Web. Appendixes provide mathematical background and a discussion of Gaussian Markov processes.
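As a taste of the book's central algorithm: Chapter 2 derives exact GP regression and gives a Cholesky-based implementation (Algorithm 2.1 in the book). The sketch below is an illustrative NumPy version of that recipe, not the authors' code; the squared-exponential kernel hyperparameters, noise level, and test point are arbitrary choices for demonstration.

```python
# Minimal Gaussian process regression sketch (illustrative; hyperparameters arbitrary).
import numpy as np

def rbf_kernel(A, B, length_scale=1.0, signal_var=1.0):
    """Squared-exponential covariance k(x, x') = s^2 exp(-(x - x')^2 / (2 l^2))."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return signal_var * np.exp(-0.5 * d2 / length_scale**2)

def gp_predict(X_train, y_train, X_test, noise_var=1e-2):
    """Exact GP posterior mean and variance via Cholesky factorization."""
    K = rbf_kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    L = np.linalg.cholesky(K)                       # K + sigma^2 I = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    K_s = rbf_kernel(X_train, X_test)               # cross-covariances k(X, x*)
    mean = K_s.T @ alpha                            # posterior mean
    v = np.linalg.solve(L, K_s)
    var = rbf_kernel(X_test, X_test) - v.T @ v      # posterior covariance
    return mean, np.diag(var)

# Toy usage: fit sin(x) on 20 points, predict at x = 2.5.
X = np.linspace(0.0, 5.0, 20)
y = np.sin(X)
mean, var = gp_predict(X, y, np.array([2.5]))
```

The Cholesky route is preferred over a direct matrix inverse because it is faster, numerically stabler, and gives the log marginal likelihood almost for free from the diagonal of L.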

About the Author
Carl Edward Rasmussen is a Lecturer at the Department of Engineering, University of Cambridge, and Adjunct Research Scientist at the Max Planck Institute for Biological Cybernetics, Tübingen.

Christopher K. I. Williams is Professor of Machine Learning and Director of the Institute for Adaptive and Neural Computation in the School of Informatics, University of Edinburgh.

Product Details
  • Hardcover: 266 pages
  • Publisher: The MIT Press (December 1, 2005)
  • Language: English
  • ISBN-10: 026218253X
  • ISBN-13: 978-0262182539

Attachment: Gaussian Processes for Machine Learning~Carl Edward Rasmussen.2006.pdf (3.86 MB)

Requires: 1 forum coin [Purchase]


Striving for happiness!
#2 · kxjs2007 · posted 2010-6-6 07:39:31

Contents

Series Foreword xi

Preface xiii

Symbols and Notation xvii

1 Introduction 1

1.1 A Pictorial Introduction to Bayesian Modelling 3

1.2 Roadmap 5

2 Regression 7

2.1 Weight-space View 7

2.1.1 The Standard Linear Model 8

2.1.2 Projections of Inputs into Feature Space 11

2.2 Function-space View 13

2.3 Varying the Hyperparameters 19

2.4 Decision Theory for Regression 21

2.5 An Example Application 22

2.6 Smoothing, Weight Functions and Equivalent Kernels 24

2.7 Incorporating Explicit Basis Functions 27

2.7.1 Marginal Likelihood 29

2.8 History and Related Work 29

2.9 Exercises 30

3 Classification 33

3.1 Classification Problems 34

3.1.1 Decision Theory for Classification 35

3.2 Linear Models for Classification 37

3.3 Gaussian Process Classification 39

3.4 The Laplace Approximation for the Binary GP Classifier 41

3.4.1 Posterior 42

3.4.2 Predictions 44

3.4.3 Implementation 45

3.4.4 Marginal Likelihood 47

3.5 Multi-class Laplace Approximation 48

3.5.1 Implementation 51

3.6 Expectation Propagation 52

3.6.1 Predictions 56

3.6.2 Marginal Likelihood 57

3.6.3 Implementation 57

3.7 Experiments 60

3.7.1 A Toy Problem 60

3.7.2 One-dimensional Example 62

3.7.3 Binary Handwritten Digit Classification Example 63

3.7.4 10-class Handwritten Digit Classification Example 70

3.8 Discussion 72

3.9 Appendix: Moment Derivations 74

3.10 Exercises 75

4 Covariance Functions 79

4.1 Preliminaries 79

4.1.1 Mean Square Continuity and Differentiability 81

4.2 Examples of Covariance Functions 81

4.2.1 Stationary Covariance Functions 82

4.2.2 Dot Product Covariance Functions 89

4.2.3 Other Non-stationary Covariance Functions 90

4.2.4 Making New Kernels from Old 94

4.3 Eigenfunction Analysis of Kernels 96

4.3.1 An Analytic Example 97

4.3.2 Numerical Approximation of Eigenfunctions 98

4.4 Kernels for Non-vectorial Inputs 99

4.4.1 String Kernels 100

4.4.2 Fisher Kernels 101

4.5 Exercises 102

5 Model Selection and Adaptation of Hyperparameters 105

5.1 The Model Selection Problem 106

5.2 Bayesian Model Selection 108

5.3 Cross-validation 111

5.4 Model Selection for GP Regression 112

5.4.1 Marginal Likelihood 112

5.4.2 Cross-validation 116

5.4.3 Examples and Discussion 118

5.5 Model Selection for GP Classification 124

5.5.1 Derivatives of the Marginal Likelihood for Laplace’s Approximation 125

5.5.2 Derivatives of the Marginal Likelihood for EP 127

5.5.3 Cross-validation 127

5.5.4 Example 128

5.6 Exercises 128

6 Relationships between GPs and Other Models 129

6.1 Reproducing Kernel Hilbert Spaces 129

6.2 Regularization 132

6.2.1 Regularization Defined by Differential Operators 133

6.2.2 Obtaining the Regularized Solution 135

6.2.3 The Relationship of the Regularization View to Gaussian Process Prediction 135

6.3 Spline Models 136

6.3.1 A 1-d Gaussian Process Spline Construction 138

6.4 Support Vector Machines 141

6.4.1 Support Vector Classification 141

6.4.2 Support Vector Regression 145

6.5 Least-squares Classification 146

6.5.1 Probabilistic Least-squares Classification 147

6.6 Relevance Vector Machines 149

6.7 Exercises 150
#3 · kxjs2007 · posted 2010-6-6 07:39:52

7 Theoretical Perspectives 151

7.1 The Equivalent Kernel 151

7.1.1 Some Specific Examples of Equivalent Kernels 153

7.2 Asymptotic Analysis 155

7.2.1 Consistency 155

7.2.2 Equivalence and Orthogonality 157

7.3 Average-case Learning Curves 159

7.4 PAC-Bayesian Analysis 161

7.4.1 The PAC Framework 162

7.4.2 PAC-Bayesian Analysis 163

7.4.3 PAC-Bayesian Analysis of GP Classification 164

7.5 Comparison with Other Supervised Learning Methods 165

7.6 Appendix: Learning Curve for the Ornstein-Uhlenbeck Process 168

7.7 Exercises 169

8 Approximation Methods for Large Datasets 171

8.1 Reduced-rank Approximations of the Gram Matrix 171

8.2 Greedy Approximation 174

8.3 Approximations for GPR with Fixed Hyperparameters 175

8.3.1 Subset of Regressors 175

8.3.2 The Nyström Method 177

8.3.3 Subset of Datapoints 177

8.3.4 Projected Process Approximation 178

8.3.5 Bayesian Committee Machine 180

8.3.6 Iterative Solution of Linear Systems 181

8.3.7 Comparison of Approximate GPR Methods 182

8.4 Approximations for GPC with Fixed Hyperparameters 185

8.5 Approximating the Marginal Likelihood and its Derivatives 185

8.6 Appendix: Equivalence of SR and GPR Using the Nyström Approximate Kernel 187

8.7 Exercises 187

9 Further Issues and Conclusions 189

9.1 Multiple Outputs 190

9.2 Noise Models with Dependencies 190

9.3 Non-Gaussian Likelihoods 191

9.4 Derivative Observations 191

9.5 Prediction with Uncertain Inputs 192

9.6 Mixtures of Gaussian Processes 192

9.7 Global Optimization 193

9.8 Evaluation of Integrals 193

9.9 Student’s t Process 194

9.10 Invariances 194

9.11 Latent Variable Models 196

9.12 Conclusions and Future Directions 196

Appendix A Mathematical Background 199

A.1 Joint, Marginal and Conditional Probability 199

A.2 Gaussian Identities 200

A.3 Matrix Identities 201

A.3.1 Matrix Derivatives 202

A.3.2 Matrix Norms 202

A.4 Cholesky Decomposition 202

A.5 Entropy and Kullback-Leibler Divergence 203

A.6 Limits 204

A.7 Measure and Integration 204

A.7.1 Lp Spaces 205

A.8 Fourier Transforms 205

A.9 Convexity 206

Appendix B Gaussian Markov Processes 207

B.1 Fourier Analysis 208

B.1.1 Sampling and Periodization 209

B.2 Continuous-time Gaussian Markov Processes 211

B.2.1 Continuous-time GMPs on R 211

B.2.2 The Solution of the Corresponding SDE on the Circle 213

B.3 Discrete-time Gaussian Markov Processes 214

B.3.1 Discrete-time GMPs on Z 214

B.3.2 The Solution of the Corresponding Difference Equation on PN 215

B.4 The Relationship Between Discrete-time and Sampled Continuous-time GMPs 217

B.5 Markov Processes in Higher Dimensions 218

Appendix C Datasets and Code 221

Bibliography 223

Author Index 239

Subject Index 245
#4 · gssdzc · posted 2010-6-6 07:46:37
Thanks a lot

#5 · jonck · posted 2010-6-6 11:19:15
Thanks for sharing! Replying to earn points!

#6 · shouwangxn · posted 2010-7-24 21:03:26
Thanks, this is good stuff!

#7 · kongchunyuan · posted 2011-1-2 16:36:41
Thanks for sharing! Replying to earn points!

#8 · eaglewarrior · posted 2011-1-18 17:15:06
A great book, well worth reading.

#9 · m8843620 · posted 2011-5-27 17:44:34
Thanks to the OP for sharing.

#10 · kobe3a · posted 2014-7-24 11:31:58
Thanks for sharing; I was just planning to download it and take a look.
