Flavor, the conjunction of taste and smell, is not a sensation that yields easily to analysis. Unlike sights and sounds, which can be captured by cameras and microphones, flavor has no widely available means of measurement. What people experience when they eat has heretofore been largely ineffable and uncomputable.
“If I go to a farmers’ market, I can take a picture of a really lovely mushroom, but I cannot take an exact ‘flavor image’ and show it to someone and have them understand,” says Tarini Naravane, a doctoral student at the University of California at Davis who studies flavors. This goes right to an age-old philosophical question. “How do I know what I call red is what you call red? This happens far more in flavor than it does in the visual world,” Naravane says. Flavor “is far more complicated.”
But an artificial-intelligence app called Gastrograph aims to introduce a way to reliably measure flavor. If it succeeds, it will give the company that makes it a digital handle on food. And as with everything else, once flavor is digitized, it will be that much easier to understand—and control.
Launched in 2016, Gastrograph works by getting people to sample foods—usually tasters hired by food and beverage companies, but anyone with a smartphone can download the app—and analyzing their input with AI to glean further insight. When people try a new food or drink, they enter their impressions into Gastrograph’s spiderweblike interface, where each spoke represents a flavor category, from floral to woody to retro-nasal. After users pick a value for a category, they delve a step deeper, choosing specific descriptors; if a food is fruity, is it more like green apple, tangerine, or elderberry? Using data entered previously by many users, the system has developed its own representation of how each food and drink shows up in the app’s detailed flavor space, which has more than 600 dimensions.
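That category-then-descriptor flow can be pictured as flattening a taster's ratings into one long vector, with a coordinate for every dimension of the flavor space. The sketch below is purely illustrative: the category names, the descriptor lists, and the 0–5 scoring scale are assumptions for demonstration, not Gastrograph's actual schema.

```python
# Illustrative sketch only -- the categories, descriptors, and rating scale
# here are hypothetical stand-ins, not Gastrograph's real flavor ontology.

# Top-level flavor categories (the spokes of the "spiderweb" interface).
CATEGORIES = ["floral", "fruity", "woody", "retro-nasal"]

# Each category fans out into finer descriptors, e.g. fruity -> green apple.
DESCRIPTORS = {
    "fruity": ["green apple", "tangerine", "elderberry"],
    "floral": ["rose", "jasmine"],
}

def profile_to_vector(ratings):
    """Flatten one taster's ratings into a fixed-order vector.

    `ratings` maps "category" or "category/descriptor" keys to intensity
    scores (assumed 0-5); anything the taster did not rate defaults to 0.
    The real system's space reportedly has more than 600 such dimensions.
    """
    dims = list(CATEGORIES)
    for cat, descs in sorted(DESCRIPTORS.items()):
        dims += [f"{cat}/{d}" for d in descs]
    return [float(ratings.get(dim, 0)) for dim in dims]

# One taster's impression of a drink: strongly fruity, leaning green apple,
# with a hint of floral character.
vec = profile_to_vector({"fruity": 4, "fruity/green apple": 3, "floral": 1})
```

Encoding every tasting in the same fixed-order space is what lets profiles from many tasters be compared and averaged into the system's learned representation of each food and drink.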