While text analytics has become common on Wall Street, it has not yet been widely used to assess the words written by employees at work. Many firms are sensitive about intruding too much on privacy, though courts have held that employees have virtually no expectation of privacy at work, particularly if they’ve been given notice that their correspondence may be monitored. Yet as language analytics improves, companies may have a hard time resisting the urge to mine employee information.
One obvious application of language analysis is as a tool for human-resources departments. HR teams have their own, old-fashioned ways of keeping tabs on employee morale, but people aren’t necessarily honest when asked about their work, even in anonymous surveys. Our grammar, syntax, and word choices might betray more about how we really feel.
Take Vibe, a program that searches through keywords and emoji in messages sent on Slack, the workplace-communication app. The algorithm reports in real time on whether a team is feeling disappointed, disapproving, happy, irritated, or stressed. Frederic Peyrot, one of Vibe’s creators, told me Vibe was more an experiment than a product, but some 500 companies have tried it.
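Vibe's actual model is proprietary, but the general technique the description suggests, tallying mood cues from keywords and emoji and reporting the dominant team mood, can be sketched in a few lines of Python. Everything here is illustrative: the lexicons, the mood labels, and the function names are assumptions, not Vibe's real internals.

```python
from collections import Counter

# Illustrative mood lexicons; not Vibe's actual keyword/emoji lists.
MOOD_LEXICON = {
    "happy":        {"great", "thanks", "awesome", "🎉", "😄"},
    "stressed":     {"deadline", "overloaded", "asap", "😫"},
    "irritated":    {"frustrating", "annoying", "😤"},
    "disappointed": {"unfortunately", "letdown", "😞"},
}

def classify_message(text: str) -> Counter:
    """Count mood-lexicon hits in a single message."""
    tokens = text.lower().split()
    hits = Counter()
    for mood, cues in MOOD_LEXICON.items():
        hits[mood] += sum(1 for t in tokens if t in cues)
    return hits

def team_mood(messages: list[str]) -> str:
    """Aggregate hits across a channel; report the dominant mood."""
    totals = Counter()
    for msg in messages:
        totals += classify_message(msg)  # Counter addition drops zero counts
    if not totals:
        return "neutral"
    return totals.most_common(1)[0][0]

print(team_mood([
    "great work everyone 🎉",
    "thanks for the demo",
    "deadline is tight but doable",
]))  # prints "happy"
```

A real system would need tokenization that handles punctuation and emoji attached to words, plus far richer lexicons or a trained classifier; this only shows why simple cue-counting is easy to build and easy to fool.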
Keeping tabs on employee happiness is crucial to running a successful business. But counting emoji is unlikely to prevent the next Enron. Does KeenCorp really have the ability to uncover malfeasance through text analysis?
That question brings us back to June 28, 1999. The two men from KeenCorp didn’t realize it, but their algorithm had, in fact, spotted one of the most important inflection points in Enron’s history. Fastow told me that on that date, the company’s board had spent hours discussing a novel proposal called “LJM,” which involved a series of complex and dubious transactions that would hide some of Enron’s poorly performing assets and bolster its financial statements. Ultimately, when discovered, LJM contributed to the firm’s undoing.
According to Fastow, Enron’s employees didn’t formally challenge LJM. No one went to the board and said, “This is wrong; we shouldn’t do it.” But KeenCorp says its algorithm detected tension at the company starting with the first LJM deals.
Today, KeenCorp has 15 employees, half a dozen major clients, and several consultants and advisers—including Andy Fastow, who told me he had been so impressed with the algorithm’s ability to spot employees’ concerns about LJM that he’d decided to become an investor. Fastow knows he’s stuck with a legacy of unethical and illegal behavior from his time at Enron. He says he hopes that, in making companies aware of KeenCorp’s software, he can help “prevent similar situations from occurring in the future.”
I was skeptical about KeenCorp at first. Text analysis after the fact was one thing, but could analysis of employee emails really yield enough information to help executives spot serious trouble in real time? As evidence that it can, KeenCorp points to the “heat maps” of employee engagement that its software creates. KeenCorp says the maps have helped companies identify potential problems in the workplace, including audit-related concerns that accountants failed to flag. The software merely provides a warning, of course—it isn’t trained in the Sarbanes-Oxley Act. But a warning could be enough to help uncover serious problems.
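How KeenCorp turns message scores into heat maps is not public, but the basic aggregation step is straightforward: average a per-message tension score by department and time period, then render the grid. The sketch below assumes the upstream scores already exist (here they are made-up numbers) and uses ASCII shading in place of a real chart; all names and values are hypothetical.

```python
from collections import defaultdict

# Hypothetical input: (department, week, tension_score) tuples. The
# per-message tension scores would come from an upstream language model;
# KeenCorp's internals are not public, so these numbers are invented.
scored = [
    ("finance", 0, 0.2), ("finance", 1, 0.7), ("finance", 2, 0.9),
    ("legal",   0, 0.1), ("legal",   1, 0.2), ("legal",   2, 0.3),
]

def heat_map(rows, weeks=3):
    """Average tension per department per week; render as ASCII shades."""
    cells = defaultdict(list)
    for dept, week, score in rows:
        cells[(dept, week)].append(score)
    shades = " .:*#"  # low tension -> high tension
    lines = []
    for dept in sorted({d for d, _, _ in rows}):
        row = ""
        for w in range(weeks):
            vals = cells.get((dept, w), [])
            avg = sum(vals) / len(vals) if vals else 0.0
            row += shades[min(int(avg * len(shades)), len(shades) - 1)]
        lines.append(f"{dept:>8} {row}")
    return "\n".join(lines)

print(heat_map(scored))
```

Even this toy version shows the appeal of the format: a manager scanning the grid would see finance darkening week over week while legal stays quiet, which is exactly the kind of early warning KeenCorp claims its maps provide.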