# 6805: 1094 Activities 11

Write your name and answers on this sheet and hand it in at the end.

## Thermodynamics from Statistical Mechanics

Goal: Derive two basic results from simple arguments and Taylor expansions.

1. Entropy and temperature. Suppose we have two subsystems, labeled 1 and 2, with energies E and E' whose sum is a fixed total energy Etot = E + E'. They can exchange energy back and forth, and eventually reach equilibrium.
1. What does it mean that they are in equilibrium? (E.g., how does time dependence come in? Are they completely static?)

2. We call Ω(E) the number of ways we can get E (i.e., the number of configurations with that energy) and Ω(E') the number of ways we can get E'. Given E and E', what is the total number of configurations?

3. The underlying principle is that any accessible state (one that satisfies E + E' = Etot) is equally likely. So the probability of finding a particular E is simply the total number of configurations that have that energy divided by the grand total number of configurations. Therefore the most likely E maximizes your answer to the last question. Given that E and E' are not independent (because their sum must always be Etot), how would you carry out this maximization? [Hints: How do you maximize a function? How can you eliminate E' in favor of E?]

4. Explain why this maximum corresponds to equilibrium between the two subsystems.

5. Given that the entropy S is defined by Ω(E) = e^{S(E)/kB}, express your answer to the last question in terms of the entropies S and S' of the subsystems. What quantity is maximized?

6. Use the chain rule to show that, at the maximum, the derivatives are equal: dS(E)/dE = dS'(E')/dE'.

7. The derivative dS(E)/dE is what we use to define the temperature; in particular dS(E)/dE = 1/T. We could define this to be a different function of temperature, but we find this definition agrees with conventional definitions in classical thermodynamics, e.g., for an ideal gas (PV = N kB T). So we have an equilibrium temperature because the system finds the set of most likely possibilities. Because the numbers typically get very large very quickly, "likely" becomes essentially a certainty. Does this make sense to you? If not, ask!

8. How is maximizing the entropy related to why shuffling a deck of cards ends up with them mixed up rather than in order?
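The counting argument above can be made concrete with a toy model. Here is a minimal Python sketch (not part of the worksheet) using two Einstein solids, for which Ω(q, N) = C(q+N−1, q) counts the ways to distribute q energy quanta among N oscillators; the particular sizes below are arbitrary illustrative choices.

```python
from math import comb

def omega(q, n):
    """Number of ways to distribute q quanta among n oscillators (Einstein solid)."""
    return comb(q + n - 1, q)

n1, n2 = 100, 100      # oscillators in each subsystem
q_tot = 200            # total (fixed) number of energy quanta

# Total number of configurations for each way of splitting the energy
counts = [omega(q, n1) * omega(q_tot - q, n2) for q in range(q_tot + 1)]
total = sum(counts)

q_star = max(range(q_tot + 1), key=lambda q: counts[q])
print("most likely split: q =", q_star)   # equal sharing, by symmetry
print("P(q within 30 of the peak) =", sum(counts[70:131]) / total)
```

Even for only 100 oscillators per side, almost all of the probability is concentrated near the equal-sharing peak, which is the "likely becomes essentially a certainty" point in question 7.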

2. Probability and Boltzmann factors. We will derive the probability of an energy E for the special case of two subsystems as in the last section, but with one much, much smaller than the other, so that E is much less than Etot.
1. We will find the ratio of probabilities of energies E1 and E2 in the subsystem described by E. Following the last section, this ratio is
P(E1)/P(E2) = [Ω(E1) Ω(Etot−E1)] / [Ω(E2) Ω(Etot−E2)]
Rewrite the two factors containing Etot in terms of entropy.

2. Now expand these in a Taylor expansion to first order about Etot. That is, use f(x + Δx) = f(x) + (Δx) df/dx when Δx is much smaller than x.

3. Use the definition of temperature from the last section to eliminate the derivative. What do you get for the probability ratio?

4. Argue that this result means that P(E) ∝ Ω(E) e^{−E/kB T}, i.e., P(E) = (1/Z) Ω(E) e^{−E/kB T}, where the normalization constant, which is called the partition function Z, is given by Z = Σi Ω(Ei) e^{−Ei/kB T}, with the sum over all energies Ei.
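As a numerical check on the final formula, here is a short Python sketch (not from the worksheet) for a made-up three-level system; the energies and degeneracies are arbitrary illustrative choices.

```python
from math import exp, isclose

kB_T = 1.0                      # work in units where kB*T = 1
energies = [0.0, 1.0, 2.0]      # made-up energy levels E_i
degeneracies = [1, 3, 2]        # made-up Omega(E_i)

# Partition function Z = sum_i Omega(E_i) e^{-E_i / kB T}
Z = sum(g * exp(-E / kB_T) for g, E in zip(degeneracies, energies))
probs = [g * exp(-E / kB_T) / Z for g, E in zip(degeneracies, energies)]

print("Z =", Z)
print("probabilities:", probs, "  sum:", sum(probs))

# Dividing out the Omega factors recovers the pure Boltzmann ratio e^{-(E1-E2)/kB T}
ratio = (probs[1] / degeneracies[1]) / (probs[2] / degeneracies[2])
assert isclose(ratio, exp(-(energies[1] - energies[2]) / kB_T))
```

The probabilities sum to 1 by construction, which is exactly the role of Z.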

## Path integrals, Part 1

Goal: Understand the basics of path integrals using a quantum mechanics example.

1. Open the Mathematica notebook path_integral_qm_part1.nb and read the top. The paper by Lepage is linked in the "extra" section of today's session on the 6805 webpage. Take a quick look. We'll be calculating equations (6) through (8) in discrete form, as given by (16) and (17), as described in the Exercise on page 4. Do you have questions about equations (6) and (7)? (E.g., notation, inserting complete sets of states, etc.).

2. Monte Carlo integration.
1. First run this section and look at the intermediate steps. What questions do you have?

2. As the notebook says just before "Make a plot of the scatter of points compared to exact", create tables t1, t2, and t3 with the indicated number of points and add t1 and t2 to the scatter plot, which shows 20 estimates of the integral. What do you notice about the scatter of the estimates around the exact answer as you increase the number of points used for the Monte Carlo integration?

3. In the "Try fitting the data" section the idea is to see how the spread of the data, as measured by the standard deviation of a set of 40 estimates, decreases with the number of Monte Carlo points used. We expect a power law, so we do a fit like we did when timing matrix diagonalizations. The expectation is that it decreases as 1/Sqrt[N], but this is not seen here. Change the range of n, which is initially 5 to 10, to a larger range (10 to 18 works on my laptop in a reasonable time; yours may be fast enough to use a higher range). What does the fit show? Why did it give the "wrong" answer at first?
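If you want to see the 1/Sqrt[N] scaling outside of Mathematica, here is a rough Python analogue; the integrand x² on [0, 1] and all the parameter values below are illustrative stand-ins, not the notebook's.

```python
import random
import statistics

random.seed(0)

def mc_estimate(n):
    """Monte Carlo estimate of the integral of x^2 on [0, 1] (exact value 1/3)."""
    return sum(random.random() ** 2 for _ in range(n)) / n

def spread(n, trials=200):
    """Standard deviation of `trials` independent estimates, each using n points."""
    return statistics.stdev(mc_estimate(n) for _ in range(trials))

s1, s2 = spread(1000), spread(4000)
print("std with N=1000:", s1)
print("std with N=4000:", s2)
print("ratio:", s1 / s2)   # ~2 if the spread scales as 1/sqrt(N)
```

Quadrupling the number of points roughly halves the spread, which is the 1/Sqrt[N] power law the fit is supposed to find.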

3. Integrating over each component of a vector.
1. This section is just to illustrate how to integrate over vectors in Mathematica (or, one way to do it :). Run the section and try a different test function (at least change nexp). Did it work? Do you have any questions?
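For comparison, here is one way to do the same kind of component-by-component Monte Carlo integral over a vector in Python; the test function and the dimension are arbitrary stand-ins for whatever you try in the notebook.

```python
import random

random.seed(1)

def integrand(v):
    """Test function of a 3-component vector; its integral over [0,1]^3 is exactly 1.5."""
    return sum(v)

dim, n = 3, 100_000
total = 0.0
for _ in range(n):
    v = [random.random() for _ in range(dim)]   # draw each component independently
    total += integrand(v)

estimate = total / n    # the volume of [0,1]^3 is 1, so the sample mean is the integral
print("MC estimate:", estimate, "(exact: 1.5)")
```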

4. Propagator as a path integral.
1. This section implements the exercise on page 4 of Lepage's lecture. Evaluate this section step-by-step. How good is the path integral energy estimate compared to the exact answer? What questions do you have?

2. What are the parameters that determine the accuracy of your result? Change one of them to improve the result for the propagator. [Hint: remember part 1 of the notebook.] What did you change and what was the result?

3. Change the calculation to use the anharmonic oscillator. We don't have an exact result, but we can compare to the result from the Discretization Games notebook. Do your results agree to the expected accuracy?
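The notebook does this in Mathematica, but the same discretized path integral can be sketched in a few lines of Python with a Metropolis update, in the spirit of Lepage's exercise. This is a stripped-down version for the harmonic oscillator in units m = ω = 1; the lattice spacing, path length, step size, and sweep counts below are illustrative guesses, not the notebook's values. It estimates ⟨x²⟩, which for the continuum ground state is 1/(2mω) = 0.5, so by the virial theorem E0 = mω²⟨x²⟩ ≈ 0.5.

```python
import math
import random

random.seed(2)

N, a, eps = 20, 0.5, 1.4    # lattice sites, lattice spacing, Metropolis step size
x = [0.0] * N               # cold start: the classical path x(t) = 0

def delta_S(x, j, new):
    """Change in the discretized Euclidean action if site j moves to `new`."""
    jm, jp = (j - 1) % N, (j + 1) % N          # periodic boundary conditions
    def S_site(xj):
        kinetic = ((x[jp] - xj) ** 2 + (xj - x[jm]) ** 2) / (2 * a)
        potential = a * xj ** 2 / 2            # harmonic oscillator, m = omega = 1
        return kinetic + potential
    return S_site(new) - S_site(x[j])

def sweep(x):
    for j in range(N):
        new = x[j] + random.uniform(-eps, eps)
        d = delta_S(x, j, new)
        if d < 0 or random.random() < math.exp(-d):
            x[j] = new                          # Metropolis accept; otherwise keep x[j]

for _ in range(500):                            # thermalization sweeps
    sweep(x)

n_sweeps = 5000
x2 = 0.0
for _ in range(n_sweeps):                       # measurement sweeps
    sweep(x)
    x2 += sum(xj * xj for xj in x) / N
x2 /= n_sweeps

print("<x^2> =", x2, " (continuum exact: 0.5)")
```

The same accuracy knobs as in the notebook show up here: the lattice spacing a, the number of sites N, and the amount of Monte Carlo statistics. Switching the `potential` line to an anharmonic form is the analogue of part 4.3.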