Despite hiring a couple of linguists to get its new search engine to move beyond "Robospeak" and understand how people talk, Facebook hasn't taught Graph Search how to do that very well just yet. And that's a problem, no matter which way the social network spins it. Unlike Google's pattern-matching search engine, Facebook's new recommendation-based social search platform tries to understand full sentences. And that takes context, something that's very hard to teach even the smartest computers, as one of the linguists who worked on the project, Amy Campbell, told The New York Times's Somini Sengupta.
In order to think more like a person, the Graph Search team taught the engine 25 synonyms for "student," so that when someone types in "Stanford academics that work at Facebook" the engine knows to look for "students" — in 275,000 different phrasings, in fact. But it turns out that an English class isn't the future of machine learning: a grammar and vocabulary lesson proves a lot easier than teaching complex, sentient thought, and that's where Facebook's new product breaks down in practice.
For example, Graph Search can't resolve vague pronouns. My query today for "photos Elle Reeve likes that she commented on" confuses Facebook's beyond-robo engine. Instead of treating "she" as a reference to Elle, Graph Search returns photos that my Atlantic Wire colleague "likes" but that I commented on: