Up Learn – A Level Chemistry (AQA)

Thermodynamics

1. Introduction to Entropy
2. Order and Disorder
3. What is Entropy?
4. How Does Temperature Affect Entropy?
5. How Does State Change Affect Entropy?
6. Comparing Entropy Between Substances
7. How Does Dissolving a Substance Affect Entropy?
8. How Does the Number of Particles Affect Entropy?
9. Entropy Changes
10. Predicting the Entropy Change of a Reaction 1
11. Predicting the Entropy Change of a Reaction 2
1. Introduction to a Microscopic, Mathematical Definition of Entropy
2. A Simple System 1
3. A Simple System 2
4. Relating Configurations to Entropy
5. The Exact Mathematical Definition of Entropy
6. Relating Our Simple System to Atomic Systems
7. Why Does Temperature Affect Entropy?
8. Why Does Number of Particles Affect Entropy?
9. Why Does State Affect Entropy?
10. So Is Entropy Really a Measure of Disorder?
1. Introduction to Calculating Entropy Changes
2. Measuring Entropy for Larger Systems
3. Entropy at Absolute Zero
4. Explaining Entropy at 0 K Mathematically
5. Entropy at Non-Zero Temperatures
6. Graphing Entropy
7. Standard Molar Entropies
8. Investigating the Trends in the Table of Absolute Entropies
9. Calculating the Entropy Change of a Reaction
10. Why Did We Bother Predicting Entropy Changes in the First Place?
11. Why Are the Units of Entropy Change ‘Per Mole’?
1. The Surroundings
2. The Entropy Change of the Surroundings
3. Calculating the Entropy Change of the Surroundings
4. The Entropy Change of the Universe
5. What Reactions Can’t Happen?
6. Feasibility
7. Why Do Some Feasible Reactions Not Happen?
8. The 2nd Law of Thermodynamics
9. Gibbs Free Energy Change
10. The Units of Gibbs Free Energy Change
11. Calculating Gibbs Free Energy Change
12. Assessing Feasibility
13. Assessing Feasibility – Making Ice
14. Assessing Feasibility – Thermal Decomposition of Calcium Carbonate
15. Exam Technique: Explaining Feasibility
16. Graphing Gibbs Free Energy Change
17. Using Graphs to Find Enthalpy and Entropy Changes
18. Assessing Feasibility from Graphs
19. Finding the Temperature Where Reactions Become Feasible
20. The Limitations of Our Temperature-Finding Equation
21. Doesn’t Entropy Change… Change With Temperature?
22. Calculating Gibbs Free Energy Change for Reverse Reactions
23. What About Reversible Reactions?
24. How Are Reversible Reactions Compatible With the Second Law of Thermodynamics?

We’ve said it’s not just enthalpy that determines whether reactions can happen…

That something called entropy plays a role too.

So, okay…what is entropy?

Well, just like how all matter has a mass…and a temperature…and a volume…all matter has an entropy. It’s a property that matter has.

That means that chemists can say things like ‘gold has an entropy of 47.4 joules per kelvin per mole’ [S(gold) = 47.4 J K⁻¹ mol⁻¹].

So, which of these sentences might make sense?

These sentences might make sense since they all treat entropy as a property of matter. 

Okay, so entropy is a property of matter. What else?

Well, entropy is hard to describe using words.

That’s because, unlike mass, temperature and volume, it isn’t a property that we can directly feel, see, sense or measure.

Really, the best, most precise way to understand the abstract world of entropy is with maths – and we’ll actually explore that maths a little later.
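(For reference, the maths that later sections build up to centres on Boltzmann’s formula – this is a preview, assuming the course follows the standard microscopic definition, where W counts the number of ways the particles of a system can be arranged:)

```latex
S = k \ln W
% S : entropy, in J K^{-1}
% k : Boltzmann's constant, 1.38 \times 10^{-23}\ \mathrm{J\,K^{-1}}
% W : the number of possible configurations of the system
```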

But for now, we can consider entropy to be a measurement of disorder.

And we can say that the higher the disorder of something, the higher its entropy.

But while we’re comfortable comparing the disorder of everyday objects, how on earth do we compare the disorder of chemicals? 

For example, how do we compare the disorder of a glass of water at 25 degrees and a glass of water at 26 degrees?
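(Although the course answers this question at the atomic scale, classical thermodynamics can already put a number on the difference. The sketch below is illustrative only, assuming a 250 g glass of water and water’s standard molar heat capacity of roughly 75.3 J K⁻¹ mol⁻¹ – values not given in the lesson itself:)

```python
import math

# Entropy rise when a glass of water (~250 g, assumed) warms from 25 °C to 26 °C.
# Assumed data: molar heat capacity of liquid water ~75.3 J K^-1 mol^-1,
# molar mass 18.0 g mol^-1. Uses delta_S = C * ln(T2 / T1) at constant pressure.
moles = 250.0 / 18.0
C = moles * 75.3             # total heat capacity of the glass, in J K^-1
T1, T2 = 298.15, 299.15      # 25 °C and 26 °C, converted to kelvin
delta_S = C * math.log(T2 / T1)
print(round(delta_S, 2))     # a small positive number: the warmer water has more entropy
```

So the 26 °C water really does have a slightly higher entropy – a few joules per kelvin more – even though the two glasses look identical.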

Well, to do that, we need to dive down into the atomic scale, which we’ll do shortly.

But before we do, let’s pad out this woefully empty factfile with a bit of historical context.

Entropy was first introduced by Rudolf Clausius in 1865.

Clausius is commonly remembered as one of the founding fathers of thermodynamics, but he’s also remembered for being the scientist with the most Christmassy name. 

And Clausius gave entropy the symbol S – after the scientist Nicolas Sadi Carnot, whose work laid the foundations for this area of research. 

So, to sum up, entropy is…

Entropy is a measurement of disorder. 

The more disordered something is, the higher its entropy.

And lastly, to represent entropy, we use the capital letter S.