In November 1966, the Arno River flooded Florence, Italy, inundating the Piazza del Duomo in the heart of the Renaissance city’s historic center and submerging millions of works of art, rare books, and manuscripts. Among these documents were the earliest instrumental weather records from the Medici Network, the first international network of meteorological observations, which recorded temperatures from 1654 to 1670.
The weather logs survived the flood, but it took decades to restore and reorganize them. When the documents finally became available again a few years ago, scientists embarked on a project to analyze their contents for the first time, but soon discovered that they faced a new set of challenges. The sheer number of data points was daunting in itself—the documents included information from 13 different locations, each of which recorded the weather every three to four hours for 16 years—but each of these readings also had to be translated into modern measurements. The dates and times had to be recalculated, since the old Florentine calendar year began on March 25 instead of January 1, and days started at twilight rather than midnight. Temperatures also had to be converted from the Galileo scale used at the time to the Celsius scale. And there were other changes to consider: deforestation around weather stations, the construction of asphalt roads and buildings.
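The calendar conversion, at least, follows a fixed rule: in the Florentine reckoning the year number changed on March 25, so any date written between January 1 and March 24 belongs to the following modern year. A minimal sketch of that step (the function name and the sample date are illustrative, not from the restoration project):

```python
from datetime import date

def florentine_to_modern(year: int, month: int, day: int) -> date:
    """Convert a Florentine-style date to the modern calendar.

    The Florentine year began on March 25, so a date written between
    January 1 and March 24 carries a year number one lower than the
    modern calendar's.
    """
    if (month, day) < (3, 25):
        year += 1
    return date(year, month, day)

# A log entry dated "10 February 1654" Florentine style falls in modern 1655.
print(florentine_to_modern(1654, 2, 10))  # → 1655-02-10
```

The temperature conversion is harder to sketch honestly, since the Galileo-scale calibration had to be reconstructed empirically rather than from a published formula.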
Temperature data from hundreds of years ago is no easier to process if it’s been obtained at sea. In 2004, for example, a team of researchers analyzing data from 18th- and 19th-century ships’ logs had to correct for the fact that in those days, sea-surface temperatures were measured with a thermometer in a bucket of seawater. Today, water temperatures are measured at the intake pipes that draw seawater aboard a ship, a practice that’s been in place since the 1940s. (A 2008 study showed that this switch in methodology exaggerated the global temperature dip seen in the 1950s.)
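In practice, this kind of correction amounts to adding a bias term to the bucket-era readings. A toy sketch of the idea—the 0.3 °C offset here is a placeholder, not the published correction, which in reality varies with bucket type, season, and how long the bucket sat on deck:

```python
def correct_bucket_sst(measured_c: float, bucket_bias_c: float = 0.3) -> float:
    """Add back the heat an uninsulated bucket loses to evaporation
    and wind before the thermometer is read.

    `bucket_bias_c` is an illustrative constant; real adjustments are
    not a single fixed number.
    """
    return measured_c + bucket_bias_c

# Bucket-era measurements in °C, adjusted toward intake-pipe equivalents.
readings = [18.1, 18.4, 17.9]
adjusted = [correct_bucket_sst(t) for t in readings]
print(adjusted)
```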
Data inconsistencies aren’t just a pain for the people measuring—they also make it harder to analyze the ways in which the planet is changing over time. Temperatures have been officially recorded in almost all regions of the world since the early 20th century. By the 1930s, records from individual temperature stations around the globe numbered in the millions. But using these records to unearth any long-term global trends involves pooling several different data sets, collected with very different methodologies across wide expanses of time and space. Unavoidably, there are gaps. In some cases, technological breakthroughs have made it impossible to directly compare readings separated by a few decades. And temperature records from many places are scattered and fragmented: Historical events often disrupt data collection (during World War II, for example, recordings from Pacific Island thermometers dropped sharply), and some areas have sparser coverage than others.
And then there are the thermometers themselves. Thermometer enclosures, which shield the temperature sensor from direct sunlight and other sources of radiation, can be wooden or plastic; the variation in materials can, in turn, introduce discrepancies in the results (which some stations in the U.S. discovered firsthand in the 1980s when they switched from traditional enclosures to electronic instruments). The instruments are also sensitive to their surroundings: On a sunny day, direct sunlight striking the thermometer will produce a higher reading than on an equally warm cloudy day.
As a result, it can be hard for climate researchers to get the historical data they need. The regular fluctuations in global temperatures mean that just a few years’ worth of information isn’t enough; things like volcanic eruptions, solar activity, and El Niños can all throw short-term measurements out of whack, while pollution haze has been known in the past to cause a temporary cooling effect. To make sure the patterns they observe aren’t just side effects of these other phenomena, weather statisticians make inferences about global trends by averaging temperatures over 10-year periods.
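The averaging itself is simple: pool annual values into 10-year windows so that a one-off cold year barely moves the decadal figure. A small sketch with made-up numbers (the toy series below mimics a steady warming trend with one volcanic-style cold spike):

```python
from statistics import mean

def decadal_means(annual_temps: dict[int, float]) -> dict[str, float]:
    """Average annual temperatures into non-overlapping 10-year windows,
    so short-lived spikes (eruptions, El Niño years) largely wash out."""
    decades: dict[int, list[float]] = {}
    for year, temp in annual_temps.items():
        decades.setdefault(year // 10 * 10, []).append(temp)
    return {f"{start}s": mean(vals) for start, vals in sorted(decades.items())}

# Toy series: 0.02 °C/year trend, plus a 0.5 °C volcanic dip in 1992.
temps = {y: 14.0 + 0.02 * (y - 1990) for y in range(1990, 2010)}
temps[1992] -= 0.5
print(decadal_means(temps))
```

Even with the 1992 dip, the decadal averages still show the 1990s as cooler than the 2000s by roughly the underlying trend, which is the point of the exercise.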
Nowadays, climate scientists have a few different tools for correcting these sorts of artificial discrepancies as the information is collected. Some have developed algorithms that identify and separate climate-change-related weather fluctuations from those attributable to some other cause. The National Oceanic and Atmospheric Administration’s Pairwise Homogenization Algorithm, for example, compares monthly temperatures across a network of stations, checking each station’s data against that of its neighbors. To identify abnormalities in temperature data, the algorithm looks for abrupt shifts at one station that are absent from those surrounding it. The NOAA also runs its daily meteorological observations through quality-control measures to eliminate duplicate data, outliers, and other inconsistencies. When all of these numbers are corrected, the globe can be divided into a grid of boxes, and researchers can fill in the blanks based on satellite readings and temperature measurements from surrounding areas.
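The pairwise idea can be sketched in a few lines. This is a deliberately crude illustration of the principle, not NOAA’s actual algorithm: subtract the neighbor average from the station’s series, then flag any month where that difference series jumps. A real break at the station (a move, a new instrument) shows up in the difference series; a regional weather swing affects the neighbors too and cancels out.

```python
def flag_station_breaks(station: list[float],
                        neighbors: list[list[float]],
                        threshold: float = 1.0) -> list[int]:
    """Flag months where a station shifts abruptly relative to its neighbors.

    Forms the station-minus-neighbor-average difference series and
    flags any month-to-month step larger than `threshold` °C.
    """
    neighbor_avg = [sum(vals) / len(vals) for vals in zip(*neighbors)]
    diff = [s - n for s, n in zip(station, neighbor_avg)]
    return [i for i in range(1, len(diff))
            if abs(diff[i] - diff[i - 1]) > threshold]

# A 2 °C step at month 6 (say, a station move) stands out against
# neighbors that share the same regional weather.
regional = [10, 11, 12, 13, 12, 11, 10, 11, 12, 13]
station = [t + (2.0 if i >= 6 else 0.0) for i, t in enumerate(regional)]
neighbors = [[t + 0.1 for t in regional], [t - 0.1 for t in regional]]
print(flag_station_breaks(station, neighbors))  # → [6]
```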
In recent years, scientists have also identified ways to study weather patterns from thousands of years ago. One 2013 study, for example, extended temperatures as far back as the end of the last Ice Age—more than 11,000 years ago—by examining oxygen isotopes in fossilized ocean shells. Bubbles of ancient atmosphere trapped in ice can be used to gauge carbon dioxide levels from hundreds of thousands of years ago; fossilized shells preserve information about ocean conditions; and plant and animal microfossils from oceans can tell scientists a great deal about prehistoric temperatures, ocean currents, and wind patterns.
Relating that raw data to what’s happening today, though, is its own long and difficult process. On its website, the American Institute of Physics explains the history of climate studies this way: “The few pages of text and numbers were the visible tip of a prodigious unseen volume of work. One simple sentence (like ‘last year was the warmest year on record’) might be the distillation of the labors of a multi-generational global community. And it still had to be interpreted.”