Original poster: kxjs2007

[Download] Statistical Mechanics: Entropy, Order Parameters and Complexity (2009)


#1 (OP)
kxjs2007 posted on 2010-6-12 08:43:00

Statistical Mechanics: Entropy, Order Parameters and Complexity (Oxford Master Series in Physics) [Paperback]
James P. Sethna (Author)



Editorial Reviews


Review



"It would take quite a long time to list all the interesting subjects treated in this textbook, but overall, this is a very good starting point for an undergraduate student who is interested in pursuing a career in research and wants to have a global idea of the different problems that are currently developed in laboratories."--Mathematical Reviews



"Sethna's book provides am important service to students who want to learn modern statistical mechanics."-- Physics Today



"An extremely intelligent and elegant introduction to fundamental concepts, well suited for the beginning graduate level."--William Gelbart, University of California at Los Angeles



"The author's style, although quite concentrated, is simple to understand, and has many lovely visual examples to accompany formal ideas and concepts, which makes the exposition live and intuitively appealing."--Journal of Statistical Physics




Product Description


In each generation, scientists must redefine their fields: abstracting, simplifying and distilling the previous standard topics to make room for new advances and methods. Sethna's book takes this step for statistical mechanics--a field rooted in physics and chemistry whose ideas and methods are now central to information theory, complexity, and modern biology. Aimed at advanced undergraduates and early graduate students in all of these fields, Sethna limits his main presentation to the topics that future mathematicians and biologists, as well as physicists and chemists, will find fascinating and central to their work. The amazing breadth of the field is reflected in the author's large supply of carefully crafted exercises, each an introduction to a whole field of study: everything from chaos through information theory to life at the end of the universe.






Product Details
  • Paperback: 376 pages
  • Publisher: Oxford University Press, USA; illustrated edition (June 1, 2006)
  • Language: English
  • ISBN-10: 0198566778
  • ISBN-13: 978-0198566779



Keywords: Statistical Mechanics, Entropy, Order Parameters, Complexity

Rated by 1 member — actuary_chen: Academic merit +1, Helpfulness +1, Credit rating +1 ("Excellent Introduction")


#2
kxjs2007 posted on 2010-6-12 08:46:34
Contents
Preface v
Contents ix
List of figures xvii
1 What is statistical mechanics? 1
Exercises 4
1.1 Quantum dice 4
1.2 Probability distributions 5
1.3 Waiting times 6
1.4 Stirling’s approximation 7
1.5 Stirling and asymptotic series 7
1.6 Random matrix theory 8
1.7 Six degrees of separation 9
1.8 Satisfactory map colorings 12
2 Random walks and emergent properties 15
2.1 Random walk examples: universality and scale invariance 15
2.2 The diffusion equation 19
2.3 Currents and external forces 20
2.4 Solving the diffusion equation 22
2.4.1 Fourier 23
2.4.2 Green 23
Exercises 25
2.1 Random walks in grade space 25
2.2 Photon diffusion in the Sun 26
2.3 Molecular motors and random walks 26
2.4 Perfume walk 27
2.5 Generating random walks 28
2.6 Fourier and Green 28
2.7 Periodic diffusion 29
2.8 Thermal diffusion 30
2.9 Frying pan 30
2.10 Polymers and random walks 30
2.11 Stocks, volatility, and diversification 31
2.12 Computational finance: pricing derivatives 32
2.13 Building a percolation network 33

3 Temperature and equilibrium 37

3.1 The microcanonical ensemble 37

3.2 The microcanonical ideal gas 39

3.2.1 Configuration space 39

3.2.2 Momentum space 41

3.3 What is temperature? 44

3.4 Pressure and chemical potential 47

3.4.1 Advanced topic: pressure in mechanics and statistical mechanics 48

3.5 Entropy, the ideal gas, and phase-space refinements 51

Exercises 53

3.1 Temperature and energy 54

3.2 Large and very large numbers 54

3.3 Escape velocity 54

3.4 Pressure computation 54

3.5 Hard sphere gas 55

3.6 Connecting two macroscopic systems 55

3.7 Gas mixture 56

3.8 Microcanonical energy fluctuations 56

3.9 Gauss and Poisson 57

3.10 Triple product relation 58

3.11 Maxwell relations 58

3.12 Solving differential equations: the pendulum 58

4 Phase-space dynamics and ergodicity 63

4.1 Liouville’s theorem 63

4.2 Ergodicity 65

Exercises 69

4.1 Equilibration 69

4.2 Liouville vs. the damped pendulum 70

4.3 Invariant measures 70

4.4 Jupiter! and the KAM theorem 72


#3
kxjs2007 posted on 2010-6-12 08:47:09

5 Entropy 77

5.1 Entropy as irreversibility: engines and the heat death of the Universe 77

5.2 Entropy as disorder 81

5.2.1 Entropy of mixing: Maxwell’s demon and osmotic pressure 82

5.2.2 Residual entropy of glasses: the roads not taken 83

5.3 Entropy as ignorance: information and memory 85

5.3.1 Non-equilibrium entropy 86

5.3.2 Information entropy 87

Exercises 90

5.1 Life and the heat death of the Universe 91

5.2 Burning information and Maxwellian demons 91

5.3 Reversible computation 93

5.4 Black hole thermodynamics 93

5.5 Pressure–volume diagram 94

5.6 Carnot refrigerator 95

5.7 Does entropy increase? 95

5.8 The Arnol’d cat map 95

5.9 Chaos, Lyapunov, and entropy increase 96

5.10 Entropy increases: diffusion 97

5.11 Entropy of glasses 97

5.12 Rubber band 98

5.13 How many shuffles? 99

5.14 Information entropy 100

5.15 Shannon entropy 100

5.16 Fractal dimensions 101

5.17 Deriving entropy 102

6 Free energies 105

6.1 The canonical ensemble 106

6.2 Uncoupled systems and canonical ensembles 109

6.3 Grand canonical ensemble 112

6.4 What is thermodynamics? 113

6.5 Mechanics: friction and fluctuations 117

6.6 Chemical equilibrium and reaction rates 118

6.7 Free energy density for the ideal gas 121

Exercises 123

6.1 Exponential atmosphere 124

6.2 Two-state system 125

6.3 Negative temperature 125

6.4 Molecular motors and free energies 126

6.5 Laplace 127

6.6 Lagrange 128

6.7 Legendre 128

6.8 Euler 128

6.9 Gibbs–Duhem 129

6.10 Clausius–Clapeyron 129

6.11 Barrier crossing 129

6.12 Michaelis–Menten and Hill 131

6.13 Pollen and hard squares 132

6.14 Statistical mechanics and statistics 133


#4
kxjs2007 posted on 2010-6-12 08:47:48

7 Quantum statistical mechanics 135

7.1 Mixed states and density matrices 135

7.1.1 Advanced topic: density matrices. 136

7.2 Quantum harmonic oscillator 139

7.3 Bose and Fermi statistics 140

7.4 Non-interacting bosons and fermions 141

7.5 Maxwell–Boltzmann ‘quantum’ statistics 144

7.6 Black-body radiation and Bose condensation 146

7.6.1 Free particles in a box 146

7.6.2 Black-body radiation 147

7.6.3 Bose condensation 148

7.7 Metals and the Fermi gas 150

Exercises 151

7.1 Ensembles and quantum statistics 151

7.2 Phonons and photons are bosons 152

7.3 Phase-space units and the zero of entropy 153

7.4 Does entropy increase in quantum systems? 153

7.5 Photon density matrices 154

7.6 Spin density matrix 154

7.7 Light emission and absorption 154

7.8 Einstein’s A and B 155

7.9 Bosons are gregarious: superfluids and lasers 156

7.10 Crystal defects 157

7.11 Phonons on a string 157

7.12 Semiconductors 157

7.13 Bose condensation in a band 158

7.14 Bose condensation: the experiment 158

7.15 The photon-dominated Universe 159

7.16 White dwarfs, neutron stars, and black holes 161

8 Calculation and computation 163

8.1 The Ising model 163

8.1.1 Magnetism 164

8.1.2 Binary alloys 165

8.1.3 Liquids, gases, and the critical point 166

8.1.4 How to solve the Ising model 166

8.2 Markov chains 167

8.3 What is a phase? Perturbation theory 171

Exercises 174

8.1 The Ising model 174

8.2 Ising fluctuations and susceptibilities 174

8.3 Coin flips and Markov 175

8.4 Red and green bacteria 175

8.5 Detailed balance 176

8.6 Metropolis 176

8.7 Implementing Ising 176

8.8 Wolff 177

8.9 Implementing Wolff 177

8.10 Stochastic cells 178

8.11 The repressilator 179

8.12 Entropy increases! Markov chains 182

8.13 Hysteresis and avalanches 182

8.14 Hysteresis algorithms 185

8.15 NP-completeness and kSAT 186


#5
kxjs2007 posted on 2010-6-12 08:48:12

9 Order parameters, broken symmetry, and topology 191

9.1 Identify the broken symmetry 192

9.2 Define the order parameter 192
9.3 Examine the elementary excitations 196
9.4 Classify the topological defects 198
Exercises 203
9.1 Topological defects in nematic liquid crystals 203
9.2 Topological defects in the XY model 204
9.3 Defect energetics and total divergence terms 205
9.4 Domain walls in magnets 206
9.5 Landau theory for the Ising model 206
9.6 Symmetries and wave equations 209
9.7 Superfluid order and vortices 210
9.8 Superfluids: density matrices and ODLRO 211
10 Correlations, response, and dissipation 215
10.1 Correlation functions: motivation 215
10.2 Experimental probes of correlations 217
10.3 Equal-time correlations in the ideal gas 218
10.4 Onsager’s regression hypothesis and time correlations 220
10.5 Susceptibility and linear response 222
10.6 Dissipation and the imaginary part 223
10.7 Static susceptibility 224
10.8 The fluctuation-dissipation theorem 227
10.9 Causality and Kramers–Krönig 229
Exercises 231
10.1 Microwave background radiation 231
10.2 Pair distributions and molecular dynamics 233
10.3 Damped oscillator 235
10.4 Spin 236
10.5 Telegraph noise in nanojunctions 236
10.6 Fluctuation-dissipation: Ising 237
10.7 Noise and Langevin equations 238
10.8 Magnetic dynamics 238
10.9 Quasiparticle poles and Goldstone’s theorem 239
11 Abrupt phase transitions 241
11.1 Stable and metastable phases 241
11.2 Maxwell construction 243
11.3 Nucleation: critical droplet theory 244
11.4 Morphology of abrupt transitions 246
11.4.1 Coarsening 246
11.4.2 Martensites 250
11.4.3 Dendritic growth 250
Exercises 251
11.1 Maxwell and van der Waals 251
11.2 The van der Waals critical point 252
11.3 Interfaces and van der Waals 252
11.4 Nucleation in the Ising model 253
11.5 Nucleation of dislocation pairs 254
11.6 Coarsening in the Ising model 255

11.7 Origami microstructure 255

11.8 Minimizing sequences and microstructure 258

11.9 Snowflakes and linear stability 259

12 Continuous phase transitions 263

12.1 Universality 265

12.2 Scale invariance 272

12.3 Examples of critical points 277

12.3.1 Equilibrium criticality: energy versus entropy 278

12.3.2 Quantum criticality: zero-point fluctuations versus energy 278

12.3.3 Dynamical systems and the onset of chaos 279

12.3.4 Glassy systems: random but frozen 280

12.3.5 Perspectives 281

Exercises 282

12.1 Ising self-similarity 282

12.2 Scaling and corrections to scaling 282

12.3 Scaling and coarsening 282

12.4 Bifurcation theory 283

12.5 Mean-field theory 284

12.6 The onset of lasing 284

12.7 Renormalization-group trajectories 285

12.8 Superconductivity and the renormalization group 286

12.9 Period doubling 288

12.10 The renormalization group and the central limit theorem: short 291

12.11 The renormalization group and the central limit theorem: long 291

12.12 Percolation and universality 293

12.13 Hysteresis and avalanches: scaling 296


#6
kxjs2007 posted on 2010-6-12 08:48:30

A Appendix: Fourier methods 299

A.1 Fourier conventions 299

A.2 Derivatives, convolutions, and correlations 302

A.3 Fourier methods and function space 303

A.4 Fourier and translational symmetry 305

Exercises 307

A.1 Sound wave 307

A.2 Fourier cosines 307

A.3 Double sinusoid 307

A.4 Fourier Gaussians 308

A.5 Uncertainty 309

A.6 Fourier relationships 309

A.7 Aliasing and windowing 310

A.8 White noise 311

A.9 Fourier matching 311

A.10 Gibbs phenomenon 311
References 313
Index 323
EndPapers 350

#7
oddstone posted on 2010-6-12 08:56:28
A very forward-looking book; it will probably be somewhat challenging to read.

#8
kxjs2007 posted on 2010-6-12 09:06:02
This review is from: Statistical Mechanics: Entropy, Order Parameters and Complexity (Oxford Master Series in Physics) (Paperback)
I haven't yet had a chance to read this book from cover to cover. However, after several hours with it, some of its strengths and weaknesses became evident. Many of these complement each other.

It covers an exciting range of contemporary applications -- take a look at the table of contents. The problems are long, discursive, and even more intriguing than the main text, covering topics like the cosmic microwave background, origami microstructures, Langevin equations, snowflakes, biochemical reaction rates and NP-completeness. The book is rich in illustrations, and in footnotes that give an informal commentary on the main text.

One downside is that, being so wide, the coverage is also a bit thin in places. Many of the most interesting contemporary topics, such as the statistical mechanics of networks, are covered *only* in exercises. Thermodynamics is dismissed in less than 10 pages in the middle of the book, owing to that subject's being "cluttered" with a "zoo of partial derivatives, transformations and relations."

The exercises look to be more fun and tempting than usual in books on this subject. So it's a definite bummer that the book includes neither answers nor hints, and doesn't state problems in closed form ("Show that this stuff = X"). The book's web site contains only some hints for computational exercises, plus a bunch of additional problems (again, without answers). If you're interested in self-study, this tease is frustrating - an automatic one-star deduction.

There's more good news/bad news with the author's aim to be relevant to fields outside traditional physics -- e.g. in econophysics and social science. This certainly makes the book up-to-date and attractive, and was one of the reasons I bought it. But applying physics to social science is a tricky business. There's a couple hundred years of failed attempts, because people blithely modeled stuff without thinking enough about the limits within which such an analogy might be appropriate. And many who do think about those limits when deriving a model often forget about them when applying it.

An example is the Black-Scholes model of option pricing. The model's results are "simply wrong" (B. Mandelbrot). Its assumptions about volatility and the structure of the option contract aren't empirically justified. Its blind application contributed to the 1987 stock market break. And the investment fund run by one of its Nobel-laureate inventors went bust in flames in 1998. In this book, there's an exercise that walks you through some of the underlying concepts of Black-Scholes (pp. 32-33). But the author only praises the model, without so much as a footnote mentioning its darker side.
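For reference, the closed-form result at the heart of the basic Black-Scholes model is the European call price (stated here only to make the criticized assumptions explicit; the exercise on pp. 32-33 may set things up differently):

$$
C(S,t) = S\,N(d_1) - K e^{-r(T-t)} N(d_2),
\qquad
d_{1,2} = \frac{\ln(S/K) + \left(r \pm \tfrac{1}{2}\sigma^2\right)(T-t)}{\sigma\sqrt{T-t}},
$$

where $N$ is the standard normal cumulative distribution, $K$ the strike, $T$ the expiry, and the volatility $\sigma$ is assumed constant with lognormally distributed returns. Those constant-volatility, lognormal assumptions are exactly what the empirical objections above target.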

Even when doing "traditional" physics, one ignores philosophical issues at one's peril. A lot of the great physicists of the past century weren't being stupid to fret over them. On the other hand, there are lots of folks like my QM professor in the 1970s, who explained that the only reason Bohr, Heisenberg and Einstein discussed philosophy was that they didn't understand QM, "but today we understand it very well, so we don't need to worry about that stuff."

Unfortunately, this book continues that gung-ho, what-me-worry tradition. A disappointing example is the discussion of information and entropy (pp. 85 ff). The author states that interpreting entropy "not as a property of the system, but of our knowledge of the system ... cleanly resolves many otherwise confusing issues" (p. 85). This "cleanly" is a bit disingenuous, since plenty of people wouldn't agree with this interpretation (see, e.g., J. Bricmont's 1995 paper "Science of Chaos, or Chaos in Science?", available on the arXiv). The discussion of the arrow of time (pp. 80-81) does mention a couple of nuggets of relevant history, but the level of treatment is more suitable for a pre-med physics survey class than for a graduate course in stat mech.

A couple of pages later (pp. 87-90), the author slides from a discussion of Shannon entropy to discussing an algorithm for helping your roommate find her keys by asking her questions. Without acknowledging it, he introduces the notion of meaning into "information" -- but meaning wasn't relevant for Shannon. Indeed, the historical background for why Shannon called his quantity "entropy" -- John von Neumann advised him to use the term because "nobody understands entropy" -- suggests one should be very cautious about mashing up the various scientific and colloquial meanings of "information".

It's just this kind of unreflective enthusiasm when applying physics techniques outside their usual domain that leads to so many junk "Physics and Society" papers on the arXiv. At least one-half star deduction, for an upper bound of 3.5 stars.

NOTE ADDED 2007/03/27: I recently received a very gracious email from the author addressing some of the above comments. I wasn't convinced by him about Black-Scholes or entropy (which he claimed to understand "in the broad context" better than Claude Shannon or J. Bricmont), but I do appreciate his engaging me on those points. He's also prepared an answer key to the exercises, though you'll need to write to him and convince him that you aren't taking the course for credit before he'll send them to you. (In my case my review apparently was credible evidence enough; not sure what it might take in yours, but from his note it sounds like it's not an impossible task.) I can't say that this materially changes my rating of the book, but I certainly give five stars to the author for his sincerity.

#9
kxjs2007 posted on 2010-6-12 09:06:19
This review is from: Statistical Mechanics: Entropy, Order Parameters and Complexity (Oxford Master Series in Physics) (Paperback)
I immensely enjoyed studying this statistical mechanics book. I think that the author, James Sethna, has a "Feynman-like" ability to explore his subject matter with much depth, insight, and many playfully creative excursions. The exercises cover such topics as the thermodynamics of Dyson spheres and black holes; how many shuffles it takes to fully randomize a card deck; and whether an advanced, intelligent being or civilization can, from a thermodynamic standpoint, manage to process an infinite number of thoughts before the heat death of the universe, or whether it is limited to a finite number of thoughts. I think there is a lot of wisdom and insight in this book that is missing from other books I've read on statistical mechanics and thermodynamics, where I often feel overwhelmed by a zoo of partial derivatives and thermodynamic equations with little guidance given on how the entire structure fits together. I strongly recommend this book for anyone who has studied some statistical mechanics and/or thermodynamics in a lower-level undergraduate course and is looking for a more advanced upper-level undergraduate or graduate-level text.

#10
kxjs2007 posted on 2010-6-12 09:06:36
This review is from: Statistical Mechanics: Entropy, Order Parameters and Complexity (Oxford Master Series in Physics) (Paperback)
The book Statistical Mechanics: Entropy, Order Parameters and Complexity by James Sethna is excellent. I have used it as the main textbook in my course on Statistical Physics for first year graduate students at the Universidade Estadual de Campinas (UNICAMP) in Brazil. The students and I liked it very much.

I think that the main quality of the book is that it presents Statistical Physics as a very dynamical subject, interconnected with several subjects within physics, as well as outside it.

Since the book is aimed at a one-semester course on the subject, the author had to make important choices. I really liked his choices. For instance, the book does not discuss approximate methods used to treat systems with interacting particles; instead, the author has chosen to have a chapter on Calculation and Computation. Although these methods have played an important role in the past, nowadays the study of the relevant problems in the field requires computer simulations. The chapter on Computer Simulation is excellent. Instead of only discussing how to perform a Monte Carlo simulation, it proves mathematically in detail (except for the Perron-Frobenius theorem) why one ends up with an equilibrium probability distribution after a number of Monte Carlo steps. Another important subject covered in the book is that of Abrupt Phase Transitions. For most Statistical Physics books, only Second Order or Continuous Transitions exist. The exercises are another very important and interesting choice made by the author. They are very different from the usual exercises one can find in a regular textbook on Statistical Physics. The exercises are in general very intelligent and they appear in a broad range of difficulty, from those that can be solved by inspection to those that are small projects. I recall two great examples, exercises 5.7 and 5.10, where it is shown in a very clear and clever way that, when we know the system from a microscopic point of view, its entropy does not increase, whereas if we know only a coarse-grained description of it, then its entropy does increase. Some exercises safely lead the reader through aspects of the theory that are not covered in the text. For instance, Landau's theory for phase transitions is presented in a very nice way in exercise 9.5.
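To make the Markov-chain argument concrete, here is a minimal single-spin-flip Metropolis sketch for the 2D Ising model (my own illustrative Python, not code from the book; the lattice size, temperature, and step count are arbitrary). Accepting a proposed flip with probability min(1, e^{-ΔE/T}) enforces detailed balance with respect to the Boltzmann distribution, which is the key ingredient in the convergence argument the chapter works through:

```python
import numpy as np

def metropolis_ising(L=32, T=2.5, steps=100_000, rng=None):
    """Single-spin-flip Metropolis sampler for the 2D Ising model
    (J = 1, k_B = 1, periodic boundaries). Illustration only."""
    rng = np.random.default_rng() if rng is None else rng
    spins = rng.choice([-1, 1], size=(L, L))
    for _ in range(steps):
        i, j = rng.integers(0, L, size=2)
        # Energy cost of flipping spin (i, j): dE = 2 * s_ij * (sum of neighbours)
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nn
        # Metropolis rule: always accept if dE <= 0, else accept with prob e^{-dE/T}
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1
    return spins

if __name__ == "__main__":
    lattice = metropolis_ising(L=16, T=2.27, steps=50_000)
    print("magnetization per spin:", lattice.mean())
```

Run long enough, the states visited by this chain are distributed according to the Boltzmann weight e^{-E/T}, independently of the initial configuration, which is exactly the equilibrium property the chapter establishes.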

Perhaps the aspect I have enjoyed most in the book is that the author does not shy away from discussing one of the thorniest points in the fundamentals of Statistical Physics: what entropy really is. The book discusses in some detail Phase Space Dynamics and Ergodicity. It presents some physical situations where the ergodic hypothesis breaks down. Usually this problem with the theory is swept under the rug in most textbooks. One very interesting case is that of the entropy of glasses, a subject the author himself has worked on. If a liquid is cooled down very fast, it may become a glass, undergoing what is called a glass transition. When the system is in the liquid phase, its atoms are diffusing and the system goes through all the different possible configurations, which is believed to be the origin of its entropy (ergodicity). When the liquid undergoes a glass transition, the atoms cease diffusing and the system is jammed in one (and only one) structure of the liquid that generated it. If the system is no longer going through all the available configurations, what has happened to its entropy? No heat is released in this transition; therefore, one does not expect a change in its entropy. A hardcore purist would answer that the glass is not a system in equilibrium and, therefore, its entropy is not well defined. The point is, it may take much more than the age of the Universe for the glass to reach final equilibrium and become a crystal (the reported flow of window glass in ancient churches is an urban legend). The question of what has happened to the entropy of the liquid remains, despite the purist's answer. Experimentalists can measure the residual entropy of a glass very well. For the author, for me, and fortunately nowadays for many others, the satisfactory answer is that the entropy of a glass is the missing information about the system. Another example of residual entropy can be found in the ice cubes in your refrigerator.
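On that last example, Pauling's classic counting argument (quoted here as a standard illustration, not from the review) estimates the residual entropy of ice directly. Each molecule can place its two protons in six ways, and each of the 2N hydrogen bonds is consistently occupied only about half the time, so the number of allowed proton configurations is roughly

$$
W \approx 6^N \left(\tfrac{1}{2}\right)^{2N} = \left(\tfrac{3}{2}\right)^N,
\qquad
S_{\text{residual}} \approx k_B \ln W = N k_B \ln\tfrac{3}{2} \approx 3.4\ \mathrm{J\,K^{-1}\,mol^{-1}}
$$

per mole, in good agreement with the measured value.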

Last but not least, I would like to comment on a previous reviewer's misconception about Shannon's Information Theory. The entropy proposed by Shannon is a measure of the uncertainty of a set of possible messages that can be exchanged, regardless of the content of each message. This entropy is therefore a property of the probability distribution associated with the ensemble of possible messages. If there are any doubts, I would suggest reading the first chapter of the book Mathematical Foundations of Information Theory by A. Ya. Khinchin. In section 5.3.2 of the book, the author is just analyzing the properties of the Shannon entropy of a probability distribution using a humorous example. The probability distribution can be associated with anything, even with a key lost by a careless room-mate. This entropy is a property of the probability distribution, independent of any possible meaning attributed to it by a human being.
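For readers who want the formula behind this point, the Shannon entropy of a discrete probability distribution $\{p_i\}$ is

$$
H = -\sum_i p_i \log_2 p_i \quad \text{(in bits)},
$$

which depends only on the probabilities, whatever the outcomes stand for (lost keys included). For instance, a uniform distribution over 16 equally likely hiding places gives $H = \log_2 16 = 4$ bits, i.e. four well-chosen yes/no questions on average.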
