The machines of modern meal-making are tools of considerable precision. This is the age of bluetooth-enabled meat thermometers and smartphone-powered toaster ovens, devices that reflect the idea that food-making is more science than art.
This isn’t new. The latest kitchen machinery merely builds on a longstanding obsession with culinary exactness—a fixation that’s long been shaped by emerging technologies. Microwaves count down by the second. Ovens automatically preheat to 350 degrees with the press of a button.
Or, they seem to, anyway.
In fact, different ovens set to the same temperature can vary by as much as 90 degrees, according to an investigation by Cook’s Illustrated magazine. And even when an oven says it’s at 350 degrees, the temperature can shift up and down quite a bit—dipping to 300, rising to 400—as something’s cooking. (Plus, some ovens simply wear out over time.)
So how did 350 degrees become the sweet spot—in so many recipes, and as an oven preset—in the first place?
The magic of cooking at 350 degrees isn’t magic at all, but chemistry. It is, for example, the temperature associated with the Maillard Reaction, the chemical process that gives so many foods a complex flavor profile—and an appealing golden-brown hue—when sugar and protein are heated together just so.
“Without Maillard chemistry we would not have a dark bread crust or golden brown turkey,” wrote the authors of a Royal Society of Chemistry book about the reaction, “our cakes and pastries would be pale and anemic, and we would lose the distinctive color of French onion soup.” The Maillard Reaction—which actually entails a series of reactions—isn’t all toasty goodness, however. It’s also responsible for making apples turn brown, which many people find unappetizing “despite negligible effect on flavor,” the authors write.
Louis Camille Maillard, the chemist for whom the reaction is named, didn’t set out to do culinary research when he first described the browning effect in 1912. But his name is still invoked frequently among chefs, nutritionists, scientists, and others interested in how proteins and sugars together unlock tasty new molecules in a variety of foods right around 350 degrees. (There’s some debate about the exact temperature; some put it closer to 335 degrees.)
Maillard aside, 350 is simply a moderate temperature—another reason it works well for many recipes. It’s hot enough to cook things fairly quickly but not so hot that your dish burns.
But many chefs aren’t fixated on any one temperature, and instead think of their craft in terms of ranges: “Really low, under 275 degrees; moderate, between 275 and 350; high, over 350 but under, say 425; and maximum,” the cookbook author Mark Bittman once told Slate. It wasn’t until the 20th century that recipes routinely included precise temperatures—even in the 1950s, it was common to see terms like “slow oven” and “moderate oven” in place of any number. The very concept of cooking at a constant and precise temperature is technologically driven, an extension of a device that seemed miraculous at the time it was introduced: The regulator.
“The regulator makes scientific cooking possible to the most unscientific woman, and few realize how many perfect recipes are spoiled by wrong handling of the oven heat,” The New York Tribune wrote in a 1919 piece about the Clark Jewel Gas Range. “[E]ven if the housewife does not know that a slow oven is about 250 to 300 degrees, a moderate oven 350, a hot oven 400 to 450, and very hot 450 to 475, the little wheel of the regulator tells her these facts in words as well as in figures and she can translate any recipe that calls for a moderate or slow or hot oven accordingly.”
The device was mounted on the oven and usually involved a wheel or pointer you could turn to the temperature of your choice. This was connected to a thermometer-and-valve contraption that expanded as the oven got hotter, keeping the temperature from climbing once the upper limit was reached. Today, the ability to set a constant temperature seems so inherent to the concept of how an oven works—it’s just what ovens do. But when regulators were new, they were a marvel of automation.
“This is a simple device which means freedom from oven watching,” the Lansing State Journal wrote in 1932. “They permit hours of leisure away from the kitchen that could not otherwise be managed.”
Well, maybe. The complicated history of domestic technologies shows us that all that leisure time may not have actually materialized—the relationship between time and technology isn’t so straightforward. (See also: the Crock-pot and the vacuum cleaner.)
In the decades that followed, temperature regulators became standardized—and their displays eventually digitized. These days, most ovens preheat with the touch of a button rather than the turn of a wheel or dial. But even the first rudimentary devices were an improvement upon old-school methods, when cooks had to develop their own inventive ways of testing an oven’s heat. “For instance, when baking bread they sometimes throw a piece of white paper in the oven, and if it turns brown the oven is at the proper temperature,” the Indiana Weekly Messenger reported in 1903. “Or, when baking other things, they will throw a little cornmeal or flour into the oven in order to test the heat.”
The best temperature gauge in those days was far more primitive—yet still allowed for “marvelous accuracy,” the newspaper wrote. “You take a man who is an expert in the business, and he can tell what the temperature of the oven is by simply touching the handle of the oven door. In nine cases out of ten he will not miss it the fraction of a degree.”
That’s an awfully precise statistic to put forth for an unverified skill, yet the larger point stands: The human hand was a different kind of instrument for the bakers of generations past. Their modern counterparts may have lost this ability somewhere along the line. But even in an era that prizes precision, amid the computerized ovens and internet-connected kitchens, a fraction of a degree hardly matters.