Humanity did something really cool in November – as a species, we redefined all our main measuring units to be based on universal physical constants. This represents the confluence of two favorite subjects of many people – large international bureaucracies and science, more specifically metrology, the science of measurement – so I’m sure everyone already knows about this. Just kidding. But it actually is interesting. The General Conference on Weights and Measures, the decision-making body of the International Bureau of Weights and Measures (BIPM), held its 26th meeting and voted to redefine the kilogram, ampere, kelvin, and mole (that famous unit you learned about in chemistry class). The vote was unanimous, which is kind of a big deal when you consider that BIPM members include countries with historically tense relations, like Israel and Saudi Arabia, or the US with Iran and Venezuela. So what does it mean to change these definitions, and why did we do this?
First, it’s worth going over some history. You can view this as the ultimate culmination of the hopes of the original proponents of the metric system. Fitting the spirit of the times, scientists in revolutionary France proposed new units of measurement that could be used to describe all the universe, not just a single kingdom or fiefdom. Their initial system proposed only two of the units we now use: the meter and the gram. The meter was defined as one ten-millionth of the distance from the North Pole to the Equator, measured along the line of longitude passing through Paris (because, France), which they estimated with pretty solid accuracy even back then. The gram was defined as the mass of a cubic centimeter of water at its freezing point. To point out a recurring problem: the French defined the gram to lay out the system, but decided that making a standard reference object that small was too hard, so they commissioned a kilogram standard as the reference instead. Thomas Jefferson heard about the new French units when he was Secretary of State and asked for a copy of the kilogram standard, but it never made it to America because the ship (and the scientist) carrying it was hijacked by pirates of the Caribbean, so America ended up without a reference kilogram for decades.
(As an aside, temperature wasn’t defined in the revolutionary system at all. The Fahrenheit and Celsius scales were both developed earlier in the 18th century. For some weird reason, Celsius was originally reversed, so the numbers actually decreased with rising temperature – 100 degrees C was the freezing point of water and 0 was the boiling point. Also, my hot take is that Fahrenheit actually has a better zero point for reference than Celsius. As you know, Celsius puts zero at the freezing/melting point of water, but it’s really easy to overshoot that depending on your cooling/heating rate, as any video of supercooled water can show you. Fahrenheit used a mixture of water, ice, and ammonium chloride (the flavoring in salty licorice), which stabilizes at a reproducible temperature.)
In 1875, 17 countries (including the US) signed the Metre Convention, which established new definitions of the kilogram and meter based on physical artifacts – a 1 kg cylinder of platinum-iridium alloy (the International Prototype Kilogram, affectionately called “le Grand K”) and a platinum-iridium bar with markings spaced 1 meter apart – and established the BIPM and associated organizations. The BIPM made copies of these prototypes for national metrology organizations, so the US finally got a kilogram, and a few years later the US actually adopted the metric system. You might think that’s a joke, but Americans’ continued usage of customary units is really just a surface-level thing. Since 1893, the US has defined the pound and yard in terms of their metric counterparts rather than by separate standards. We just can’t convince people to stick with the metric units in everyday life.
The problem with artifact-based definitions is that they make it hard to pinpoint where uncertainty develops. Every time a physical object is handled, you risk scratching off some microscopic sliver, leaving some tiny residue that changes the mass of the kilogram standard, or creating strain that changes the length of the meter standard. For instance, nearly all the national kilogram standards have gained mass relative to le Grand K, but nobody is sure whether this means the copies are gaining mass, le Grand K is losing mass, or le Grand K is gaining mass more slowly than the others. Defining units in terms of physical constants offers more stability, because we have good evidence that they are, well, constant. But this only works if you can measure the constant to high precision.
The meter has already been redefined this way a few times. In 1960, it was defined as a certain multiple (1,650,763.73 wavelengths) of the light emitted by a particular atomic transition of krypton-86. As optics advanced, it was discovered that this emission line had irregularities that could yield slightly different values depending on the experiment, so in 1983 it was replaced by a definition relating the meter explicitly to the speed of light. This means the meter is now based on the second, which is very precisely defined by an atomic transition of cesium.
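To see how a constant-based definition works in practice, here is a trivial sketch: since the speed of light is now an exact defined number, converting any measured time interval into a distance is just multiplication (the function name is my own, for illustration).

```python
# The meter since 1983: the distance light travels in vacuum in
# 1/299_792_458 of a second. The speed of light is exact by definition.
c = 299_792_458  # m/s, exact defined value

def light_distance(seconds):
    """Distance in meters that light travels in vacuum in the given time."""
    return c * seconds

print(light_distance(1 / 299_792_458))  # one meter, by definition
print(light_distance(1e-9))             # light travels ~0.3 m per nanosecond
```

This is why the meter "inherits" its precision from the second: any lab with a good cesium clock and this exact constant can realize the meter without a reference bar.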
Until recently, measurements of the relevant constants for mass weren’t precise enough to replace le Grand K. The artifact approach has also been awkward because the process of “tracing” mass from le Grand K down to smaller masses becomes more imprecise as you go. Balances only compare masses, so the only way to standardize something smaller than a kilogram is to check that it, along with a bunch of other masses, adds up to a kilogram, and then keep working down to smaller and smaller values. Error accumulates quickly this way, which is why the smallest NIST mass standard you can obtain is half a milligram. Unfortunately, industrial processes like developing active pharmaceutical ingredients or manufacturing nanoscale parts for electronics routinely deal with masses far smaller than that.
The new definition defines the kilogram in terms of the Planck constant, the constant that relates the frequency of light to the energy it carries. Technically, it fixes the Planck constant at exactly 6.62607015×10⁻³⁴ joule-seconds (a joule-second is a square meter-kilogram per second), and since the second and meter are already defined, fixing the Planck constant pins down the kilogram. The kilogram is realized with a watt balance (now called a Kibble balance, after its inventor Bryan Kibble), which uses the force of an electric current in a magnetic field to balance out the weight of a mass, varying the current until the forces are equal. A benefit of this definition (and the one for the meter) is that it is easier to realize larger or smaller standards. Instead of going through the full sequence of subdivision steps, once a watt balance has found the current needed for a kilogram, you can easily scale that current down by 10 or 100 to make a new 100 g or 10 g standard.
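The arithmetic behind the balance is simple enough to sketch. In weighing mode, the current I in the coil balances the weight: m·g = B·L·I. In moving mode, the same coil moved at velocity v through the field induces a voltage: U = B·L·v. Dividing the two cancels the hard-to-characterize field-geometry factor B·L, leaving m = U·I/(g·v). The numbers below are made up for illustration, not real experimental values:

```python
# Kibble (watt) balance principle, with hypothetical numbers.
# Weighing mode: m * g = B * L * I   (current balances the weight)
# Moving mode:   U = B * L * v       (motion induces a voltage)
# Dividing cancels B * L:  m = U * I / (g * v)

g = 9.80665    # m/s^2, local gravitational acceleration (assumed)
U = 1.0        # V, induced voltage in moving mode (hypothetical)
v = 2.0e-3     # m/s, coil velocity in moving mode (hypothetical)
I = 0.0196133  # A, balancing current in weighing mode (hypothetical)

m = U * I / (g * v)       # kg
print(f"{m:.4f} kg")      # 1.0000 kg for these made-up values

# Scaling down is linear in current: a tenth of the current
# balances a tenth of the mass, giving a 100 g standard directly.
I_100g = I / 10
print(f"{U * I_100g / (g * v) * 1000:.1f} g")  # 100.0 g
```

In a real balance, U and I are measured against quantum electrical standards (Josephson voltage and quantum Hall resistance), which are themselves tied to the Planck constant – that is how fixing h lets the balance pin down a mass.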
AND Materials Advent 2018 part 6 – Platinum-Iridium Alloys
So why make standard artifacts out of a mixture of platinum and iridium? It turns out this alloy is really durable in a lot of ways. Pure platinum is already prized for its resistance to corrosion. Iridium is an even rarer element – so rare that probably its biggest claim to fame is that a worldwide layer of it in the geologic record was one of the earliest pieces of evidence for the asteroid impact we think killed the dinosaurs. Adding a bit of iridium to platinum not only improves the corrosion resistance but also makes the metal stronger and harder, so handling the standards shouldn’t degrade them as much. The alloy also changes size very little when heated or cooled, which helps minimize another potential headache in measuring length.