How Computers Parse the Ambiguity of Everyday Language

Words with multiple meanings pose a special challenge to algorithms.

(Image: a marshmallow roasting over a fire. Mike Lawrie / Getty)

If you’re one of the 2.4 million Twitter followers of the Hamilton impresario Lin-Manuel Miranda, you’ve come to expect a delightful stream of observations, including tweets capturing conversations with his son Sebastian, now 3 years old. Earlier this month, Miranda offered one such exchange under the title, “S’MORES. A Real-Life One-Act Play.”

Me: So that’s the marshmallow but you’re going to eat it with this graham cracker and chocolate.

[My son looks at me like I am the dumbest person alive.]

Sebastian: No, I’m going to eat it with my MOUTH.

[End of play.]

A charming slice of life, to be sure. But in that brief interaction, young Sebastian Miranda also inadvertently hit upon a kind of ambiguity that reveals a great deal about how people learn and process language—and how we might teach computers to do the same.

The misinterpretation on which the s’mores story hinges is hiding in the humble preposition with. Imagine the many ways one could finish this sentence:

I’m going to eat this marshmallow with ...

If you’re in the mood for s’mores, then “graham cracker and chocolate” is an appropriate object of the preposition with. But if you want to split the marshmallow with a friend, you could say you’re going to eat it “with my buddy Charlie.” If you’re only grudgingly consuming that marshmallow, you could say you’re going to eat it “with great reluctance.” Or you could say “with my hands” (or “with my mouth” like young Sebastian) if you’re focused on the method of eating.

Somehow speakers of English master these many possible uses of the word with without anyone specifically spelling it out for them. At least that’s the case for native speakers—in a class for English as a foreign language, the teacher likely would tease apart these nuances. But what if you wanted to provide the same linguistic education to a machine?

As it happens, just days after Miranda sent his tweet, computational linguists presented a conference paper exploring exactly why such ambiguous language is challenging for a computer-based system to figure out. The researchers did so using an online game that serves as a handy introduction to some intriguing work currently being done in the field of natural language processing (NLP).

The game, called Madly Ambiguous, was developed by the linguist Michael White and his colleagues at Ohio State University. In it, you are given a challenge: to stump a bot named Mr. Computer Head by filling in the blank in the sentence Jane ate spaghetti with ____________. Then the computer tries to determine which kind of with you intended. Playful images drive the point home. In the sentence Jane ate spaghetti with a fork, Mr. Computer Head should be able to figure out that the fork is a utensil, not something that is eaten in addition to the spaghetti.

(Image: Ajda Gokcen)

Likewise, if the sentence is Jane ate spaghetti with meatballs, it should be obvious that meatballs are part of the dish, not an instrument for eating spaghetti.

(Image: Ajda Gokcen)

In addition to these two possibilities, the noun (or noun phrase) following with could also indicate manner (Jane ate spaghetti with gusto) or company (Jane ate spaghetti with Mary).

Mr. Computer Head tries to differentiate among these potential semantic roles in two ways. In basic mode, the program takes a rule-based approach that has traditionally been used in NLP, first zeroing in on the main noun and then looking it up in a semantic database called WordNet. If the noun is classified as an artifact, then the computer guesses the role is instrumental (like fork). If it is a kind of food, then the role is assumed to be part of the dish (like meatballs). If the noun appears to be a kind of feeling, then that fits the manner role (like gusto). If it’s anything else, Mr. Computer Head surmises that the noun refers to the company that Jane is keeping while eating spaghetti.
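The basic-mode logic can be sketched in a few lines of Python. This is a minimal illustration, not the actual Madly Ambiguous code: a real system would climb WordNet's hypernym hierarchy to classify the head noun, whereas here a tiny hand-made dictionary stands in for that semantic database.

```python
# Stand-in for a WordNet lookup: maps a head noun to a broad semantic class.
# In a real system this classification would come from WordNet's hypernyms.
SEMANTIC_CLASS = {
    "fork": "artifact",
    "spoon": "artifact",
    "meatballs": "food",
    "chocolate": "food",
    "gusto": "feeling",
    "reluctance": "feeling",
}

# The rules described above: artifact -> instrument, food -> part of the
# dish, feeling -> manner, anything else -> company.
ROLE_BY_CLASS = {
    "artifact": "instrument",
    "food": "part of dish",
    "feeling": "manner",
}

def guess_role(phrase: str) -> str:
    """Guess the semantic role of the noun phrase following 'with'."""
    head = phrase.lower().split()[-1]  # crude head-noun extraction
    sem_class = SEMANTIC_CLASS.get(head, "other")
    return ROLE_BY_CLASS.get(sem_class, "company")

print(guess_role("a fork"))       # instrument
print(guess_role("meatballs"))    # part of dish
print(guess_role("great gusto"))  # manner
print(guess_role("Mary"))         # company
```

The fragility of this approach is easy to see: any noun missing from the database, or classified under an unexpected branch, falls through to the "company" default.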

In advanced mode, Mr. Computer Head uses a more cutting-edge NLP technique known as word embedding. In this approach, words and phrases are mapped onto a geometrical space known as a vector space, which captures degrees of similarity among different words in a corpus (a large collection of texts). Similar words appear closer to each other in the vector space. Mr. Computer Head matches up the words in the input phrase with clusters of words corresponding to the different possible interpretations—say, the instrument cluster, the manner cluster, and so on.

White presented the research findings of the Madly Ambiguous team at the annual conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Both the basic and advanced approaches work fairly well, they found, despite users trying to stump the system. Since the game was launched about a year ago, Mr. Computer Head has matched up with user judgments 64 percent of the time in basic mode and 70 percent of the time in advanced mode. If you enter “this graham cracker and chocolate,” for instance, both modes correctly recognize that this phrase must be part of the dish, not some method of eating, as Sebastian seemed to think in the s’mores story.

The computer system may never be perfect, since some ambiguities can’t be resolved by any kind of formal analysis. In the sentence Jane ate spaghetti with relish, does relish refer to a condiment or to enthusiastic enjoyment? The enjoyment interpretation might make more sense if you have the real-world knowledge that relish is not a typical condiment for spaghetti. But if the sentence were Jane ate the hot dog with relish, the ambiguity would be even more pronounced.

Truth be told, humans run aground on ambiguous language all the time. Consider the Associated Press headline, “Supreme Court Rules Narrowly for Colorado Baker Who Wouldn’t Make Same-Sex Wedding Cake.” When Donald Trump Jr. encountered that sentence on Twitter, he misconstrued the word narrowly. “I am reading about a 7–2 vote. Pretty sure that’s not narrowly,” he tweeted. If he had bothered to read past the headline, he would have learned that the ruling was made on narrow grounds, even though the vote itself was not a narrow one.

Context is indeed the key for resolving most linguistic ambiguity, which is a lesson that Sebastian Miranda will gradually learn. He’ll also learn that “eating something with your mouth” isn’t something people usually specify. (If Mr. Computer Head were looking for that kind of usage in the corpus, it would likely only be found in certain expressions like the rhetorical question after someone has said something obscene: “Do you eat with that mouth?”)

For computational systems to better overcome the challenges of ambiguous language, NLP researchers will need to arm them with the same kinds of contextual thinking that young Sebastian is developing.