The Let's Play Archive

SpaceChem (2013 Tournament)

by Wild M

Part 36: Closed Tournament - Round 3 - The Chem

The Chem in SpaceChem, Closed Tournament Round 3

Challenge accepted.


Red finals: Тетрис
Tetris is a video game. What does this have to do with chemistry, you ask? Nothing at all!

The game was made by Alexey Pajitnov and released in 1984 in the Soviet Union. It quickly gained popularity there, and at some point it ended up in Budapest, where it was discovered by some British folks. They brought it to the West around 1987 while trying to make a licensing deal with Pajitnov. They failed to do so; I guess Cold War politics put up a lot of barriers. But the game was out in the West and wouldn’t be stopped. In 1988, the Soviet government started claiming the rights to the game, after Pajitnov had granted them to it. I’m not sure if that was voluntary. The game kept spreading like wildfire, and the best-known version was probably the one that came bundled with every Nintendo Game Boy.

Pajitnov didn’t get any royalties until 1996, when the license ownership reverted from the government back to himself. In that year, he and Henk Rogers, a Dutch man who had negotiated with the Russian government about getting Nintendo a license for Tetris, founded The Tetris Company, which owns the Tetris trademark to this day.

Tetris is available in some shape or form for nearly every game console and operating system, and for many other devices besides. The different versions featured quite a variety of music, but “Music A” from the Game Boy version is probably the best known. It is actually part of the melody of a 19th-century Russian folk song called Коробейники (Korobeiniki). It ‘tells of a meeting between a peddler and a girl, in which they haggle over the price of goods in a veiled metaphor for courtship.’[1]

If you’d like to know more about Tetris’ home country while listening to Tetris Music A, please check this video. Seriously, watch it, it’s great!


Remember when I said Tetris has nothing to do with chemistry? Well, I kinda lied.

It is very difficult to simulate large amounts of complex molecules: there are so many variables and things we just don’t know, and the math quickly becomes intractable even for computers. So we have to simplify. In 2009, the researchers Barnes, Siderius and Gelb did just that. They wanted to simulate adsorption (‘sticking to a surface’) of complex molecules. Their simplification was to use Tetris pieces instead of actual molecules. They took out most variables, so the distribution of the tetrominoes was driven by entropy, a subject I will talk about for the next puzzle. The blocks naturally formed some quite striking patterns. I think it is incredible how much of the actual chemistry you can explain or predict using these simple Tetris blocks.
The article is open-access so check it out if you’re interested. There’s nice pictures.[2][3]

Well, in case you get tired from solving the puzzles or reading my text, feel free to play a relaxing game of First Person Tetris.

---

Blue finals: Entropy Machine

Let’s talk about entropy. Entropy is a concept from physics that’s used a lot in chemistry. It might even be more important in chemistry than in physics, but you should ask a physicist about that. Entropy is one of those things that are easy to understand in a general way, but hard to ‘master’. I’ll try to explain entropy from a chemist’s perspective while not making things too difficult. I will probably fail at the latter part, so don’t feel bad about skipping this section (you can jump to the ‘Chemistry’ subsection if you like, it gets simpler from there).

It’s useful to take a quick look at energy first. Energy comes in many forms such as heat, radiation, mechanical energy and chemical energy. You have seen examples of chemical energy being released or taken up when I talked about reaction energy. Reactions that release chemical energy to the environment are called exothermic, the opposite ones are endothermic. Most spontaneous reactions are exothermic, but not all of them. It’s not just energy that determines spontaneity. This is where entropy will come in later.

Disorder
There are multiple equivalent definitions of entropy. One common definition has to do with ‘disorder’. Imagine a bucket with hot water and a bucket with cold water. Such a “two bucket system” has a certain amount of order. When we mix the water and we get only lukewarm water as a result, some of the order is lost. Entropy is a measure of the amount of disorder in a system. In other words, mixing the water increased the entropy. “Unmixing” the water into hot and cold parts again would decrease the entropy. You can do that, but it takes outside work; it won’t happen by itself.

We call this the Second Law of Thermodynamics: No process is possible in which the total entropy decreases, when all systems taking part in the process are included.

Using this definition of entropy, we can come to a formula:
S = kB * ln Ω
S is the entropy, kB is the Boltzmann constant, ln is the natural logarithm function and Ω is the number of possible microstates for a given macrostate.

That’ll take some explanation. Imagine we cool our bucket of water to absolute zero, making sure that, as the water freezes, we get perfectly ordered crystals. "A bucket of water at absolute zero" is a macrostate. At absolute zero, the molecules would be completely stuck in the crystal lattice. They can only be in one possible position and won’t move around. Ignore that quantum physicist shouting in the corner, please. The number of microstates is 1, and the formula shows that the entropy equals zero.

Now, we warm up the water again until we get to the macrostate "a bucket of water at room temperature". At room temperature, the water molecules will move around, turn and tumble. Quite a mess. Each molecule can be in a lot of ‘states’ (place, position, speed etc. can change) and there’s an incredible number of molecules in a bucket. At this macrostate, Ω is an enormous number. Luckily, we take its logarithm and the constant kB is a tiny number, so the orders of magnitude cancel out and our entropy will be a number we can understand, like 70.
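As a quick sanity check on those orders of magnitude, here is a back-of-the-envelope calculation (the entropy value of 70 J/K is just the illustrative number from above, not a measured one):

```python
import math

# Boltzmann's constant, in joules per kelvin
k_B = 1.380649e-23

# Suppose the entropy of our bucket is the illustrative 70 J/K from above.
S = 70.0

# Invert S = k_B * ln(Omega) to see how large ln(Omega) must be.
ln_omega = S / k_B
print(f"ln(Omega) = {ln_omega:.3e}")  # about 5.07e24

# Omega itself would be e raised to that power, which is far too large to
# compute directly. That is exactly why the logarithm in the formula helps.
```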

The Second Law of Thermodynamics, then, is a simple matter of statistics. Theoretically, there is a chance that all ‘hot’ water molecules will go to one side of the bucket, while all ‘cold’ ones will go to the other. Each possible distribution of molecules has a chance to occur. But the number of distributions (microstates) where the temperature is the same throughout the bucket is so much larger than the number of microstates where the water divides that it would probably take far longer than the age of the universe to ever see this division happen spontaneously.
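To get a feel for those statistics, here is a toy model of my own (not from any article): treat each molecule as independently sitting in either the left or the right half of the bucket, and ask how likely it is that all of them end up on one side at once.

```python
# Each of n molecules is in the left or right half with probability 1/2.
# The chance that ALL of them happen to be in the left half is (1/2)**n.
def prob_all_left(n):
    return 0.5 ** n

print(prob_all_left(10))   # about 1e-3: plausible for a handful of molecules
print(prob_all_left(100))  # about 8e-31: already essentially never
# A real bucket holds on the order of 1e25 molecules; at that point the
# probability is so small it underflows a float to exactly 0.0.
```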

Heat
The original definition of entropy was different, as it wasn’t based on disorder but on heat. As you can see from the example above, disorder increases when there’s more heat, so this original definition turns out to be exactly equivalent to the previous one.

For a process that is reversible even when isolated, the entropy change equals the heat change divided by the temperature (at each infinitesimal step; if the temperature changes, you need to integrate). In formula:
dS = dq / T
d indicates a change, q is the heat energy and T the temperature.
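As a worked example (my own numbers, and assuming water’s heat capacity stays constant over the range): gently and reversibly heating a kilogram of water means integrating dq/T with dq = m·c·dT, which works out to ΔS = m·c·ln(T2/T1).

```python
import math

m = 1000.0   # grams of water
c = 4.186    # specific heat of water, J/(g*K), assumed constant here
T1 = 273.15  # starting temperature in kelvin (0 degrees C)
T2 = 298.15  # final temperature in kelvin (25 degrees C)

# Integrating dS = dq / T with dq = m * c * dT gives m * c * ln(T2/T1).
delta_S = m * c * math.log(T2 / T1)
print(f"entropy change: {delta_S:.1f} J/K")  # roughly 367 J/K
```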

For a process that is irreversible when isolated, the entropy change will always be greater than the heat change divided by the temperature:
dS > dq / T

As it is practically impossible to have a reversible isolated process, the entropy for any isolated process will increase. As far as we know, the entire universe is isolated, and the universe can be seen as one enormous process, so the entropy of the universe increases, making matter and energy more randomly distributed as time goes on. This might actually turn out to be the end of our universe in the very long run.

Chemistry
Finally, back to the real stuff. Molecules have entropy, the amount depends on the type of molecule and the temperature. For an isolated system, entropy will try to increase. So, if an isolated molecule can increase its entropy by changing into another molecule, it will. However, molecules are never fully isolated. When they react, they give or take energy from their environment. This also means that their own entropy can decrease, because the Second Law has nothing to say about a non-isolated system.

As it happens, spontaneity of real-life reactions is determined by a combination of both entropy and energy. Several formulas can be used for this, but chemists most commonly use the so-called Gibbs free energy, named after Josiah Willard Gibbs, an American physicist. The formula is simple enough:
G = H – T * S

G is the Gibbs free energy. H is the ‘enthalpy’ which is basically a specific definition for (chemical) energy that works at constant pressure. That’s useful, because most reactions we care about take place at atmospheric pressure, which is nearly constant. T is temperature again and S is entropy. The main alternative formula is for the Helmholtz energy, which is defined at constant volume but variable pressure.

A chemical reaction is spontaneous if the total Gibbs free energy decreases. This means that either the change in H has to be negative (energy should be released, exothermic) or the change in S should be positive (entropy should increase), or any kind of combination, as long as the total thing ends up negative.
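As a concrete check, consider melting ice. I’m quoting textbook-style numbers from memory, so treat them as approximate: ΔH ≈ +6010 J/mol and ΔS ≈ +22.0 J/(mol·K).

```python
# Melting ice is endothermic (delta_H > 0) but increases entropy
# (delta_S > 0), so whether it happens depends on the temperature.
delta_H = 6010.0  # J/mol, approximate enthalpy of fusion of water
delta_S = 22.0    # J/(mol*K), approximate entropy of fusion

def delta_G(T):
    """Change in Gibbs free energy at temperature T (kelvin)."""
    return delta_H - T * delta_S

print(delta_G(298.15))  # negative: ice melts spontaneously at 25 C
print(delta_G(263.15))  # positive: ice does not melt at -10 C
```

Right around 273.15 K the two terms nearly cancel, which is just what you would expect at the melting point.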

Entropy effects are also the cause for many other physical/chemical processes. In the Tetris section, I noted how it plays a role in adsorption. It's also important in processes such as mixing of gases and mixing of liquids, dissolving, melting, boiling, you name it.

By the way, this formula has nothing to say about the rate (‘speed’) of a reaction. For instance, the reaction of hydrogen gas and oxygen gas to water involves a strong decrease in Gibbs energy, so it is spontaneous. But, without a spark to get things going, a mixture of those gases can be stable for millions of years, although they will form water eventually. Reaction rate is a completely different matter altogether, which I won’t talk about today.

---

All Elements Challenge
This is a good time to talk about the periodic table of elements. You might have seen it, it looks something like this:


Since Lavoisier in the 18th century, there have been many attempts to somehow order the elements.

Lavoisier ordered them by type: gases, metals, nonmetals and ‘earths’. Next was Döbereiner, who ordered elements in ‘triads’, groups of three. A few examples of triads were lithium, sodium, potassium, and chlorine, bromine, iodine. Another one was carbon, nitrogen and oxygen. As you can see, these are all groups of elements that are either above each other or next to each other in the modern periodic table. Many more chemists followed, all trying to find some ordering of the elements. De Chancourtois, a geologist, arranged the elements in a spiral or helix, showing that elements with similar properties occurred at regular intervals. In other words, the list of elements showed some kind of periodicity.

These attempts went on for a while, until in 1869 Russian chemist Dmitri Mendeleev published the first version of the modern periodic table. Click here to view it. The table shows elements and their masses. The periods are vertically arranged, instead of horizontally as in modern tables. The table has some interesting features, such as the gaps. There are two question marks after Zn in the third column. Those correspond to gallium and germanium, two elements that hadn’t been discovered yet. The table’s ability to predict properties of undiscovered elements proved that Mendeleev was on to something. Something else he did was order elements by chemical similarities within a group, even if that meant they weren’t ordered by atomic mass any more.

Later on, when protons and electrons were discovered, it was found that Mendeleev’s table was actually sorted by atomic number, that is, the number of protons (or electrons, in a neutral atom). Atomic mass (roughly protons + neutrons) is usually less important than atomic number for chemical properties.
They also discovered the nonreactive noble gases, a group of elements that Mendeleev knew nothing about. I guess at the time there was nothing suggesting they should exist.

In the modern table, groups of elements with similar properties form columns, and the rows are called periods. The first period only contains two elements, periods 2 and 3 have eight each, and the periods keep getting larger. This is because each period corresponds to electron shells being filled, and there’s more space for electrons in the larger shells that get filled further down the table.
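Those period lengths (2, 8, 8, 18, 18, 32, 32) follow a simple pattern. Here is a small sketch of the counting; the formula is my own restatement of the shell-filling order, not something official:

```python
import math

def period_length(n):
    """Number of elements in period n of the periodic table."""
    return 2 * math.ceil((n + 1) / 2) ** 2

print([period_length(n) for n in range(1, 8)])  # [2, 8, 8, 18, 18, 32, 32]
```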

Extending the table
The periodic table is still subject to change. As new, very heavy elements are being synthesized in labs, they get a temporary name of the type ‘ununoctium’ (Uuo), which is simply a combination of Latin and Greek roots for one, one, eight, so element 118. If others manage to reproduce the results, after some years the IUPAC (International Union of Pure and Applied Chemistry) will give the element an official name. This happened in 2012 with elements 114 and 116, which were named Flerovium and Livermorium. There have been claims of elements up to 118 (the last element to fit in period 7) being synthesized, so those might get proper names a few years from now, too. It’s theoretically possible to go beyond that, although the nuclei just get less and less stable. These elements are so unstable that any chemistry is impossible.
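The temporary names are fully systematic, so they can be generated mechanically. Here is a sketch of the 1979 IUPAC scheme as I understand it; the digit roots and the two elision rules are the scheme’s, while the function names are mine:

```python
# Digit roots for systematic element names (a mix of Latin and Greek).
ROOTS = ["nil", "un", "bi", "tri", "quad", "pent", "hex", "sept", "oct", "enn"]

def systematic_name(z):
    """Temporary systematic name for the element with atomic number z."""
    name = "".join(ROOTS[int(d)] for d in str(z)) + "ium"
    name = name.replace("nnn", "nn")  # elide one n when 'enn' meets 'nil'
    name = name.replace("ii", "i")    # elide one i when 'bi'/'tri' meets 'ium'
    return name

def symbol(z):
    """Temporary symbol: first letter of each digit root, capitalized."""
    return "".join(ROOTS[int(d)][0] for d in str(z)).capitalize()

print(systematic_name(118), symbol(118))  # ununoctium Uuo
print(systematic_name(120), symbol(120))  # unbinilium Ubn
```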

It has been theorized that somewhere around the current end of the table, maybe a little beyond, there’s an “island of stability” where nuclei are more stable, because of, once again, quantum mechanics. These nuclei need a lot of extra neutrons, and they haven’t been created yet. Physicists have studied lighter nuclei with similar ‘stability’ properties, but the research seems inconclusive so far.

I don’t understand too much of this stuff, but I think it’s most likely that if those “island of stability” atoms exist, they might be stable for whole seconds or minutes instead of for microseconds. That might be interesting in particle accelerators and such, but I don’t expect it will be possible to build anything macroscopic from ‘element omega’.

[1] http://en.wikipedia.org/wiki/Korobeiniki
[2] http://arstechnica.com/science/2009...mics-of-tetris/
[3] http://pubs.acs.org/doi/abs/10.1021/la900196b (click on PDF)