**The 2nd Law of Thermodynamics and Entropy:**

The 2nd law of thermodynamics is extremely important regarding how the universe works. In fact, the whole idea of an arrow of time is due to the 2nd law and entropy. The work of Sadi Carnot, Rudolf Clausius, and Lord Kelvin allowed us to understand how heat transfers from one body to another, and why we can never make engines that are 100% efficient.

**What is the 2nd Law?**

Although it is preceded in numbering by the 0th law (if systems A and B, and A and C, are each in thermal equilibrium, then B and C must also be) and the 1st law (a thermodynamic adaptation of the conservation of energy), it was formulated before them and does not rely upon them. The 2nd law describes the motion of heat – that heat only transfers from hot to cold. Now, you may be thinking: how do fridges and freezers work, as they are actively cooling warm things down? The key word in that statement was actively – work must be done in order to make warm things cooler (and, as you'll see later, this still ensures that overall entropy increases).

The 2nd law is based around two statements. The first, by Clausius (and Carnot), is that heat will only travel from a warm body to a cold body unless work is done. The second, by Kelvin, is that heat cannot be entirely converted into work – there must always be some waste heat. These two statements may sound different from each other, but it can be shown that violating Clausius' statement violates Kelvin's statement (and vice versa) – therefore they are equivalent.

(The following video from Sixty Symbols shows this very well: https://youtu.be/C0fGk1d4oIc?t=303).

After these statements were established, the idea of entropy was formulated, first by Clausius and later by Boltzmann. It revolves around Kelvin's statement that there will always be waste heat. This leads us to say that the 2nd law tells us that entropy is always going to increase (or stay the same if a process is reversible).

**Entropy:**

Often entropy is described as a measure of the disorder of a system, and the 2nd law states that this always increases. Therefore, we often hear that the universe is becoming more and more disordered. This somewhat makes sense, but then how do, for example, crystals form from a solution – surely that is getting more ordered? The atoms of the crystals themselves are becoming more ordered, but the overall system in which the crystals form is becoming less ordered, if we look at how concentrated the energy is.

There are two ways of looking at entropy, then. The first is due to Boltzmann and his famous equation S = k_{B} ln W, where S is the entropy of a system, k_{B} is Boltzmann's constant, and W can be thought of as the number of ways the molecules/atoms can be arranged. Here Boltzmann is looking at how we can arrange the molecules in a gas, and this gives us the entropy of that gas. This is where the idea of disorder comes from: statistically speaking, there is only one way to arrange a simple system so that all the molecules of one type are together (Fig 1.1). However, there are many more ways for the system to be arranged so that the molecules are mixed up – or less ordered. This leads to the statistical idea that a system will always become more disordered, as there are more ways for it to be disordered than ordered. Fig 1.2 shows one of the many ways for this simple system to be arranged where it is 'disordered'.
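Boltzmann's counting argument can be sketched numerically. The lattice size and particle numbers below are invented for illustration (they are not taken from Fig 1.1 or 1.2): we count the arrangements of particles confined to half a box versus free to roam the whole box, and convert each count to an entropy.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def microstates(n_cells, n_particles):
    """Number of ways to place n_particles identical particles
    into n_cells distinct cells (a binomial coefficient)."""
    return math.comb(n_cells, n_particles)

def boltzmann_entropy(W):
    """S = k_B * ln(W)"""
    return k_B * math.log(W)

# Toy lattice: 10 particles in a box of 20 cells.
# Confined to the left half (10 cells): only one possible arrangement.
W_ordered = microstates(10, 10)   # = 1
# Free to occupy any of the 20 cells: far more arrangements.
W_mixed = microstates(20, 10)     # = 184756

print(W_ordered, W_mixed)
# ln(1) = 0, so the 'ordered' state has zero entropy in this toy model,
# while the 'mixed' state has strictly more.
print(boltzmann_entropy(W_mixed) > boltzmann_entropy(W_ordered))  # True
```

The mixed state outnumbers the ordered one by nearly 200,000 to 1 even for ten particles; with ~10^23 molecules the imbalance becomes so overwhelming that disorder is effectively guaranteed.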

However, another way to think of entropy is as a measure of how spread out energy is. When you have a warm object, for example, the energy is concentrated in that object. But over time the energy spreads out around the room – therefore the entropy of the system increases. And looking at the spread-out energy, you can see that it is less useful (even useless) when it is spread out than when it is concentrated. This is more in line with what Clausius said: ΔS = Q/T for a reversible process. This states that the change in entropy is equal to the heat transferred divided by the temperature at which it is transferred. This gave rise to the Clausius inequality, which states that the change in entropy of an isolated system is ≥ 0; it can only be 0 in a reversible process.
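Clausius' formula makes the hot-to-cold direction of heat flow quantitative. As a quick sketch (the heat and temperatures below are made-up example values, and the reservoirs are assumed large enough that their temperatures stay fixed):

```python
# Move heat Q from a hot reservoir at T_H to a cold one at T_C
# and apply dS = Q/T to each body separately.
Q = 100.0    # joules transferred (illustrative value)
T_H = 400.0  # kelvin (illustrative value)
T_C = 300.0  # kelvin (illustrative value)

dS_hot = -Q / T_H   # the hot body loses heat, so its entropy falls
dS_cold = +Q / T_C  # the cold body gains heat, so its entropy rises

dS_total = dS_hot + dS_cold
print(round(dS_total, 4))  # 0.0833 J/K
```

Because T_C < T_H, the cold body gains more entropy than the hot body loses, so the total is positive. Running the heat the other way (cold to hot) would flip the signs and give a negative total, which is exactly what the 2nd law forbids without work being done.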

In short, it is best to represent entropy as a measure of how concentrated energy is. It therefore makes sense that entropy always increases, and thus, if entropy is always increasing, energy is always being dissipated. This gives us the arrow of time, as entropy increases in the direction time moves. To reverse time would be to clump energy back together – not only impossible, but something we would immediately recognise as unnatural (e.g. a broken glass coming back together). Many other physical processes look the same whichever way time runs – a pendulum is a great example – but entropy allows us to see the direction of time.

Entropy at the beginning of the universe was at its minimum – the energy of the entire universe was concentrated at the Big Bang. From then on it has always spread out. This has led scientists to postulate that the universe will 'die' when energy is so spread out that nothing useful can be done with it – the so-called heat death of the universe.

**Carnot Engines:**

The 2nd law was first theorised by Carnot in the 1820s, when he was trying to work out how to make steam engines as efficient as possible. He came up with the Carnot cycle, which describes the most efficient engine possible – one that is reversible. In reality, Carnot engines are not possible, as he had to ignore processes such as friction to come up with his cycle, but they allow us to measure how efficient our real engines are against his ideal. Before explaining the Carnot cycle, you have to know that we look not only at the efficiency of an engine, but also at its power output. If we produced an engine as close to a Carnot engine as possible, the power output would be essentially zero (as Carnot engines require a very long time to complete a cycle). Therefore, modern engineers need to balance efficiency in terms of energy in versus energy out against energy in versus power out.

But if we look at a Carnot cycle, we can see how to make such an engine. It involves two isothermal processes (at constant temperature) and two adiabatic processes (where there is no heat transfer).
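The difference between the two process types can be sketched with a short calculation for an ideal monatomic gas doubling its volume. The starting pressure, volume, and temperature below are illustrative values, not from the text, and the textbook ideal-gas relations PV = const (isothermal) and PV^γ = const (adiabatic, γ = 5/3 for a monatomic gas) are assumed:

```python
gamma = 5.0 / 3.0  # heat capacity ratio for a monatomic ideal gas

P1, V1, T1 = 100_000.0, 1.0, 300.0  # Pa, m^3, K (illustrative values)
V2 = 2.0 * V1                        # the gas doubles its volume

# Isothermal expansion: PV = const, temperature unchanged
# (heat flows in from the reservoir to keep T fixed).
P2_iso = P1 * V1 / V2
T2_iso = T1

# Adiabatic expansion: P V^gamma = const, no heat exchanged,
# so the gas does work at the expense of its own internal energy and cools.
P2_adi = P1 * (V1 / V2) ** gamma
T2_adi = T1 * (V1 / V2) ** (gamma - 1)

print(P2_iso, T2_iso)                # 50000.0 300.0
print(round(P2_adi), round(T2_adi))  # 31498 189
```

Note the adiabatic pressure falls further than the isothermal one, and the gas ends up roughly 110 K cooler – exactly the cooling step the Carnot cycle exploits.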

Fig 2.1 shows the cycle very well; it involves two heat stores – a hot (T_{H}) and a cold (T_{C}). In step 1 the hot reservoir transfers heat to the gas, which expands, pushing up the piston. This is isothermal, as the gas stays at a constant temperature (as it expands it cools, but it is also heated back up by the energy store). In step 2 the heat store is replaced by insulation, allowing the gas to expand and cool – adiabatic, as there is no heat transfer. In step 3 the insulation is replaced with the cold heat store. The piston pushes the gas back in, and the heat gained from the compression of the gas is transferred to the heat store – therefore isothermal. Finally, in step 4 the heat store is replaced with insulation, which allows the gas to heat back up as it is compressed back to its original state.
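The efficiency of the ideal cycle described above depends only on the two reservoir temperatures, via the standard Carnot result η = 1 − T_{C}/T_{H}. A minimal sketch (the 500 K and 300 K reservoirs are example values, not from the text):

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum possible efficiency of any heat engine running
    between reservoirs at t_hot_k and t_cold_k (both in kelvin)."""
    return 1.0 - t_cold_k / t_hot_k

# A steam engine with a 500 K boiler dumping waste heat at 300 K:
print(carnot_efficiency(500.0, 300.0))  # 0.4, i.e. at best 40% efficient
```

Even this perfect reversible engine must dump 60% of its input as waste heat unless the cold reservoir sits at absolute zero – echoing Kelvin's statement that heat can never be converted entirely into work.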

The Carnot cycle relies upon these processes being reversible – in reality they are not, as things like friction get in the way. But by using this as a model, engineers can build energy-efficient engines (though with little power output).

This brings us back to the 2nd law and Kelvin's statement – there will always be heat lost in any system, and entropy therefore will increase.