The subject matter of this document is statistical mechanics, the study of how macroscopic behavior emerges from microscopic interactions in systems with many interacting parts.
Our goal is to provide a unified introduction to statistical mechanics built around the concept of entropy. Entropy pervades statistical mechanics and many other scientific subjects, and a precise definition of it is useful in its own right. While many students may have heard the word entropy before, it is rarely explained in full detail or with rigorous mathematics, leaving students confused about many of its implications. Moreover, when students learn about thermodynamic laws, the laws that describe the macroscopic results of statistical mechanics, such as "change in internal energy = heat flow in + work done on the system", the concepts of internal energy, heat, and work are all left at the mercy of a student's vague, intuitive understanding.
In this document, we will see that formulating statistical mechanics with a focus on entropy provides a more unified and symmetric understanding of many of the laws of thermodynamics. Indeed, some laws of thermodynamics that appear confusing and potentially unrelated at first glance can in fact all be seen to follow from the same treatment of entropy in statistical mechanics.
At this point in your life, you may have heard the word entropy, but chances are, it was given a vague, non-committal definition. The goal of this section is to introduce a more explicit concept of entropy from an abstract standpoint before considering its experimental and observational signatures.
1.1 Quantifying the amount of information in the answer to a question
Entropy, while useful in physics, also has applications in computer science and information theory. This section will explore the concept of information entropy as an abstract object.
We will first consider entropy not as a quantity related to heat in objects, but as a purely axiomatic quantification of what we mean by the information we receive when we hear the answer to a question. For a concrete example, imagine that someone flips a coin without revealing which side landed facing up, and we ask, "What was the outcome of the coin flip: heads or tails?" When they tell us the answer, how much "information" do we gain by learning the outcome? In other words, we are faced with determining how much information is received when we hear the answer to a question whose outcomes follow a probability distribution. This question was asked by Claude Shannon in 1948.
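As a preview of where the criteria below lead, here is a minimal numerical sketch (our own illustration, not part of the original notes) using Shannon's measure: the information carried by an outcome of probability p is -log2(p) bits, and the expected information per answer is the entropy H = -sum_i p_i log2(p_i). A fair coin's answer carries exactly one bit, while a heavily biased coin's answer usually tells us very little.

```python
import math

def surprisal(p: float) -> float:
    """Information (in bits) gained by observing an outcome of probability p."""
    return -math.log2(p)

def entropy(probs: list[float]) -> float:
    """Average information per answer: H = -sum_i p_i * log2(p_i)."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# A fair coin: either answer ("heads" or "tails") conveys exactly 1 bit.
print(entropy([0.5, 0.5]))    # 1.0

# A heavily biased coin: the answer is usually unsurprising,
# so on average we learn far less than 1 bit.
print(entropy([0.99, 0.01]))  # ~0.081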
After pondering this question for a long time, you might come up with a few criteria that any reasonable measure must obey, such as: