OP: kxjs2007

[Download] Information Theory, Inference, and Learning Algorithms ~ David J.C. MacKay


#1 (OP)
kxjs2007 posted on 2010-06-06 08:01:53

Information Theory, Inference & Learning Algorithms (Hardcover)
David J. C. MacKay (Author)

Editorial Reviews

Review
"...a valuable reference...enjoyable and highly useful."
American Scientist


"...an impressive book, intended as a class text on the subject of the title but having the character and robustness of a focused encyclopedia. The presentation is finely detailed, well documented, and stocked with artistic flourishes."
Mathematical Reviews


"Essential reading for students of electrical engineering and computer science; also a great heads-up for mathematics students concerning the subtlety of many commonsense questions."
Choice


"An utterly original book that shows the connections between such disparate fields as information theory and coding, inference, and statistical physics."
Dave Forney, Massachusetts Institute of Technology


"This is an extraordinary and important book, generous with insight and rich with detail in statistics, information theory, and probabilistic modeling across a wide swathe of standard, creatively original, and delightfully quirky topics. David MacKay is an uncompromisingly lucid thinker, from whom students, faculty and practitioners all can learn."
Peter Dayan and Zoubin Ghahramani, Gatsby Computational Neuroscience Unit, University College, London


"An instant classic, covering everything from Shannon's fundamental theorems to the postmodern theory of LDPC codes. You'll want two copies of this astonishing book, one for the office and one for the fireside at home."
Bob McEliece, California Institute of Technology


"An excellent textbook in the areas of infomation theory, Bayesian inference and learning alorithms. Undergraduate and post-graduate students will find it extremely useful for gaining insight into these topics."
REDNOVA


"Most of the theories are accompanied by motivations, and explanations with the corresponding examples...the book achieves its goal of being a good textbook on information theory."
ACM SIGACT News

Product Description
Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, are developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
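As a small taste of the book's opening topic (this sketch is the editor's illustration, not part of the original post), the Shannon entropy that underlies the source coding theorem can be computed in a few lines of Python:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits.

    `probs` is a sequence of probabilities that should sum to 1;
    zero-probability outcomes contribute nothing to the sum.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss...
print(shannon_entropy([0.5, 0.5]))   # → 1.0
# ...while a biased coin carries less, which is why its
# outcomes can be compressed below one bit per symbol on average.
print(shannon_entropy([0.9, 0.1]))   # → ≈0.47
```

The biased-coin case is exactly the kind of example the early chapters work through when motivating data compression.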


Product Details
  • Hardcover: 640 pages
  • Publisher: Cambridge University Press; 1st edition (2003)
  • Language: English
  • ISBN-10: 0521642981
  • ISBN-13: 978-0521642989



Strive for happiness!

#2
kxjs2007 (no verified transactions) posted on 2010-06-06 08:02:14

Contents

Preface v

1 Introduction to Information Theory 3

2 Probability, Entropy, and Inference 22

3 More about Inference 48

I Data Compression 65

4 The Source Coding Theorem 67

5 Symbol Codes 91

6 Stream Codes 110

7 Codes for Integers 132

II Noisy-Channel Coding 137

8 Dependent Random Variables 138

9 Communication over a Noisy Channel 146

10 The Noisy-Channel Coding Theorem 162

11 Error-Correcting Codes and Real Channels 177

III Further Topics in Information Theory 191

12 Hash Codes: Codes for Efficient Information Retrieval 193

13 Binary Codes 206

14 Very Good Linear Codes Exist 229

15 Further Exercises on Information Theory 233

16 Message Passing 241

17 Communication over Constrained Noiseless Channels 248

18 Crosswords and Codebreaking 260

19 Why have Sex? Information Acquisition and Evolution 269

IV Probabilities and Inference 281

20 An Example Inference Task: Clustering 284

21 Exact Inference by Complete Enumeration 293

22 Maximum Likelihood and Clustering 300

23 Useful Probability Distributions 311

24 Exact Marginalization 319

25 Exact Marginalization in Trellises 324

26 Exact Marginalization in Graphs 334

27 Laplace's Method 341

28 Model Comparison and Occam's Razor 343

29 Monte Carlo Methods 357

30 Efficient Monte Carlo Methods 387

31 Ising Models 400

32 Exact Monte Carlo Sampling 413

33 Variational Methods 422

34 Independent Component Analysis and Latent Variable Modelling 437

35 Random Inference Topics 445

36 Decision Theory 451

37 Bayesian Inference and Sampling Theory 457

V Neural networks 467

38 Introduction to Neural Networks 468

39 The Single Neuron as a Classifier 471

40 Capacity of a Single Neuron 483

41 Learning as Inference 492

42 Hopfield Networks 505

43 Boltzmann Machines 522

44 Supervised Learning in Multilayer Networks 527

45 Gaussian Processes 535

46 Deconvolution 549

VI Sparse Graph Codes 555

47 Low-Density Parity-Check Codes 557

48 Convolutional Codes and Turbo Codes 574

49 Repeat-Accumulate Codes 582

50 Digital Fountain Codes 589

VII Appendices 597

A Notation 598

B Some Physics 601

C Some Mathematics 605

Bibliography 613

Index 620

#3
gssdzc (no verified transactions, employment verified) posted on 2010-06-06 08:34:47
Thanks a lot

#4
jonck (verified transactions) posted on 2010-06-06 11:15:48
What a rich collection of resources, OP!

#5
liuqian0814 (verified transactions) posted on 2010-06-22 13:02:21
@kxjs2007 (#1)

Haha, a nice post

#6
zzzppp (verified transactions) posted on 2010-06-22 21:33:24
thanks for sharing!

#7
lzhangsas (verified transactions) posted on 2011-06-18 11:47:15
Many thanks!! A very good reference book

#8
tlyy1996 (verified transactions) posted on 2014-07-20 11:36:10
This is a sample copy; it can be downloaded directly from the Cambridge University Press website. Thanks all the same!

#9
jgchen1966 (verified transactions) posted on 2014-07-22 01:30:50
Thanks!!!

#10
狂奔的菠菜 (no verified transactions) posted on 2016-10-15 14:58:43
Good stuff, though isn't the original book 640 pages, published in 2003?
