
Flavor, the conjunction of taste and smell, is not a sensation that yields easily to analysis. Unlike sights and sounds, which can be captured by cameras and microphones, there is no widespread way to measure flavor. What people experience when they eat has heretofore been largely ineffable and uncomputable.

“If I go to a farmers’ market, I can take a picture of a really lovely mushroom, but I cannot take an exact ‘flavor image’ and show it to someone and have them understand,” says Tarini Naravane, a doctoral student at the University of California at Davis who studies flavors. This goes right to an age-old philosophical question. “How do I know what I call red is what you call red? This happens far more in flavor than it does in the visual world,” Naravane says. Flavor “is far more complicated.”

But an artificial-intelligence app called Gastrograph aims to introduce a way to reliably measure flavor. If it succeeds, it will give the company that makes it a digital handle on food. And as with everything else, once flavor is digitized, it will be that much easier to understand—and control.

Launched in 2016, Gastrograph works by getting people to sample foods—usually tasters hired by food and beverage companies, but anyone with a smartphone can download the app—and analyzing their input with AI to glean further insight. When people try a new food or drink, they enter their impressions into Gastrograph’s spiderweblike interface, where each spoke represents a flavor category, from floral to woody to retro-nasal. After users pick a value for a category, they delve a step deeper, choosing specific descriptors; if a food is fruity, is it more like green apple, tangerine, or elderberry? Using data entered previously by many users, the system has developed its own representation of how each food and drink shows up in the app’s detailed flavor space, which has more than 600 dimensions.
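The idea of a many-dimensional flavor space can be illustrated with a small sketch. The dimension names and values below are hypothetical stand-ins (the real app uses more than 600 dimensions), and `cosine_similarity` is an illustrative helper, not part of Gastrograph:

```python
import math

# A flavor profile represented as a vector over named flavor dimensions.
# These four categories and their values are hypothetical examples;
# the actual app's flavor space has 600+ dimensions.
reference = {"floral": 0.2, "woody": 0.7, "fruity": 0.4, "retronasal": 0.5}
review    = {"floral": 0.1, "woody": 0.8, "fruity": 0.4, "retronasal": 0.6}

def cosine_similarity(a, b):
    """Compare two profiles over the same dimensions (1.0 = identical direction)."""
    dot = sum(a[k] * b[k] for k in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

# A new review close to the reference profile scores near 1.0.
print(round(cosine_similarity(reference, review), 3))
```

Treating each review as a point in this space is what lets the system compare a new tasting against its accumulated representation of a product.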

When someone enters a new review, Gastrograph compares the report with the app’s body of data. Its AI analysis can then determine the flavors in the food even better than the person who submitted the review, according to Jason Cohen, the founder and CEO of Analytical Flavor Systems (AFS), the company behind the app. Humans are constantly experiencing flavors that we can’t identify, Cohen says: “We’ve all had that feeling: Oh, I know this flavor, what is that?”

Moreover, we don’t consciously notice many flavors we perceive, even though they can be important components in a gustatory experience. As an example, Cohen says that if you add trace amounts of vanilla to milk, people generally report that it tastes sweet, creamy, and delicious, without putting their finger on the vanilla. If someone tries lychee for the first time and reports only different flavors they’re familiar with, the app may be able to recognize that they’re really tasting lychee, Cohen says.

Gastrograph’s iPhone interface (Analytical Flavor Systems)

For each food and drink sampled, Gastrograph tries to build a comprehensive model by pinning down all of its flavors, including the hidden ones. The app is “literally reading someone’s mind,” Cohen says, but then quickly corrects himself. “No, if we were reading their mind, they would’ve known they were tasting it. We’re reading their subconscious.”

AFS is selling Gastrograph as a way for food manufacturers to get a better understanding of what they’re producing and how it relates to customers. Some of the company’s first clients were brewers who wanted to make sure their beers maintained the same flavor over time. Brewing depends on agricultural products that naturally vary from year to year, which chafes against the brewer’s need for consistency.

Yards Brewing in Philadelphia, AFS’s longest-running customer, uses the tool routinely. “We just register a user and they sit down with a bartender, get out their phone, and have a tasting. By doing that repeatedly, we can calibrate them as a taster,” says Frank Winslow, Yards Brewing’s director of quality control. Since the app knows what the beer should taste like at each stage, tasters can be used to check that the product is on track. “Having those kinds of warnings early in the process is a huge step,” Winslow says.

Once Gastrograph models a food or drink, it can then try to simulate what would happen when you change that food or drink’s flavor or introduce it to new demographics. The app does this using a technique from the field of computational linguistics. Language researchers use machine learning to analyze huge piles of text and create many-dimensional models of the meanings of words. The models can also find relationships between words using operations such as “king − man + woman = queen.” Gastrograph uses similar operations in its flavor space to try to predict how well new demographics will like foods they haven’t tried.
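The word-vector arithmetic described above can be sketched in a few lines. The vectors here are toy three-dimensional values chosen by hand to make the analogy work; real language models learn embeddings with hundreds of dimensions from large text corpora:

```python
import numpy as np

# Toy word embeddings. The values are illustrative assumptions,
# hand-picked so the king/queen analogy holds; real models learn them.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "man":   np.array([0.5, 0.1, 0.1]),
    "woman": np.array([0.5, 0.1, 0.9]),
    "queen": np.array([0.9, 0.8, 0.9]),
}

def nearest(target, vocab):
    """Return the word whose vector points closest to `target` (cosine similarity)."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(vocab, key=lambda w: cos(vocab[w], target))

# king - man + woman lands nearest to queen in this toy space
result = nearest(vectors["king"] - vectors["man"] + vectors["woman"], vectors)
print(result)  # queen
```

The same kind of arithmetic, applied to points in a flavor space rather than a word space, is what would let a system extrapolate one demographic’s known preferences to a product it hasn’t tasted.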

One of AFS’s new clients is Acelerada, a kind of high-tech Skunk Works for Grupo Bimbo, the Mexico-based baking giant. Bimbo is in the midst of bringing Sanissimo, its most popular cracker in Mexico, to the United States. Before the launch, Acelerada ran a pilot program with Gastrograph to test how American consumers felt about the product. Bimbo plans on creating a different version of the cracker with more sea salt sprinkled on top, testing it with Gastrograph, and potentially modifying the recipe for the crackers that go on sale. Acelerada hopes to use Gastrograph in the future to help run small taste panels and make more specialized products to appeal to specific demographics. “It might be the way people do it going forward,” says Alicia Rosas, the manager of Acelerada.

While AFS’s early customers hope to use AI to maintain the flavor of their products or to make incremental changes, it’s fair to wonder how far Gastrograph’s effects will go if the tech does prove useful for predicting people’s subconscious preferences. How much do we want an app poking around in thoughts that are secret even from ourselves? The world is just starting to reckon with how social-media platforms and omnipresent screens can hijack our mental processes in destructive ways, getting us hooked as we wait for the next “like.” If AI can really tap into flavors in the subconscious, will it be used to help make foods that are addictive? Will apps stick to ingredients that are healthful and sustainable, or gravitate to whatever will help beat the competitors?

Cohen, an enthusiastic foodie, says our AI-enabled future will be more tasty than creepy. “Everything is going into targeted, niche. There’ll always be a beer you like more, and it won’t be the same as for me,” he says. “Far from being a dystopian nightmare, there will be better products for everyone.”

But AI is one of the more powerful tools humans have ever devised. Like all tools, its effects depend on its use. Cohen and AFS’s current customers seem to care about making better food and drink, not just moving more product. Here’s hoping that users of artificial intelligence keep that same thought in mind when they employ the technology to redesign what we eat.
