Elina Lange-Ionatamishvili, an official at NATO's Strategic Communications Center of Excellence in Latvia, agreed that education is essential, but argued this long-term approach should be matched with efforts to educate current voters, such as "social-advertising campaigns helping citizens to recognize fake news, disinformation, and also propaganda." The EU's East StratCom Task Force's Disinformation Review is one example of such a campaign. "Governments have a great responsibility in setting the right policy priorities and allocating resources to enable the citizens to defend [themselves] from foreign disinformation campaigns," Lange-Ionatamishvili said. "But at the end of the day each citizen is on their own when faced with the 21st-century information 'deluge.'"
Given the emphasis on cyber defenses and social-media algorithms in many American conversations about disinformation, I was surprised by how rarely northeast-European officials emphasized technical solutions to the problem. While officials in many countries noted the relevance of technology, most were far more focused on their populations’ “psychological resilience” and viewed technological developments with a dose of fear. NATO’s Lange-Ionatamishvili worried about extremely realistic audio-video editing, as did Geir Hågen Karlsen, director of Strategic Communication and Psychological Operations at the Norwegian Defense University College. “In the future we will have to deal with Troll Factory 2.0: human trolls replaced by advanced bots and a few operatives,” and also “artificial intelligence, algorithms like natural language generation, manipulation of speech, imagery, and soon also video, higher speed, and most importantly, more sophisticated manipulation,” he said.
In Estonia, even some technology experts advise against focusing too closely on technology. In the spring of 2017, Liisa Past, the chief research officer at the Cyber Security Branch of the Estonian Information System Authority, attended a meeting on i-Voting, Estonia’s remote voting system. The gathering focused on technology to protect the electoral process from cyber attacks. But Past felt her colleagues ignored non-technological, human vulnerabilities. “First I got scared,” she said, “then I got reasonably loud.” An attack on the vote-counting system itself was possible, she explained to her colleagues, but expensive. “Our adversary’s m.o. is to go for low-hanging fruit” such as political campaigns, she said. More realistic attacks would target the candidates and the parties, who lacked the technical expertise of the government, and the “news or information space layer,” sowing doubt and confusion among the electorate without needing to hack secure systems.
Finland made a concerted effort to educate its military officers, civilian officials, and journalists on the social science behind disinformation, so they could better understand influence campaigns and determine how to respond effectively. The government didn't pursue purely technical solutions or quick fixes, but rather delved into the empirical evidence behind human vulnerability to disinformation. This allowed officials to confront the complexity of the problem and devise more thoughtful responses. This, too, is possible in America.