Almost a decade ago, McDonald’s made a seemingly innocuous decision. On the side of Happy Meals distributed in Morocco in 2008, it printed a small map of the region. The map showed a border between the disputed territory of Western Sahara and Morocco—a vision of reality that differed from, among other accounts, Morocco’s official stance.
The controversy led to protest and, eventually, an official apology from the company.
Anyone who has ever spun an old globe in a flea market understands the ever-changing nature of geopolitics. Although schoolchildren may be taught a geography of fixed and everlasting borders, adults often come to understand that all maps are political constructions, constantly changing and rife with political biases. The “wrong” map could literally start a war.
Just like the cartographers of yore, multinational corporations—particularly Internet companies—play a role in defining and shaping political boundaries for the public’s consumption. This rise of huge, international corporations online has torn away at the Emerald Curtain that once obscured the variety of geopolitical boundaries that exist in the world, making clearer to the average person just how unsettled the planet’s borders really are.
Given the global nature of the Internet, corporate giants like Google and Microsoft are forced to define borders, often contending with demands from governments. The result? One’s view of certain countries’ borders is often dependent on the physical location from which one accesses Google Maps or Bing Maps. In other cases—such as that of the Western Sahara—jurisdiction is a determining factor. Microsoft, which has offices in Morocco, takes its cue from Rabat in determining the territory’s borders, while Google—which does not—draws a dotted line between Morocco and the Western Sahara, demarcating the disputed border.
Although maps are perhaps the best-known and most obvious example of how social-media companies define boundaries, they are not the only one. Conduct a Google search for something with a definitive answer—such as “how long do cats live?” or “what is four plus four?”—and the company will present you with a boxed answer; that is, an “official” response pulled from third-party data and framed in a box above other search results. Google has thus far been opaque about where the data that fills these boxes comes from.
This lack of transparency becomes an issue when the question is not “what is four plus four?” but “What time is it in Ramallah?” Two years ago, the answer to this query was returned in a Google box that stated it was 2:00 p.m. in “Ramallah, Israel.” While Apple and other companies have faced controversy for their labeling of disputed Jerusalem, the location of Ramallah is hardly disputed: The city is the home of the Palestinian Authority, and situated in the West Bank, in what is referred to by most of the world (including the United States) as the occupied territories. When prompted, Google fixed the situation by removing any reference to country—responses to queries about time in Ramallah now place the city outside of national territory, as do queries about time in other disputed locales such as Dakhla (in Western Sahara, under Moroccan control) and Sevastopol (in Crimea, under Russian control).
Google and Microsoft aren’t the only companies to get caught up in geopolitical conflict. In May 2014, the porn actress Belle Knox was surprised to hear that Twitter had agreed to block her photos in Pakistan, citing a legal request it had received from Pakistan’s telecommunications authority.
The decision to block the content was in accordance with a policy that the company introduced in 2012 in an effort to comply with government regulations on speech. Rather than remove content entirely as other companies do, Twitter created a system whereby content would be “withheld” from users in a given country. Users are notified that the content in question has been withheld due to a legal request from a government. In addition to Pakistan, the tool has been used in numerous countries, including France, Brazil, and Russia.
The tool’s usage means that one “view” of the platform from a given country is different from the view from another. In other words, a Pakistani Twitter user is provided a sanitized version of Twitter, while an American one has access to—as far as we know—whatever content they desire. Corporate decisions around controversial speech, such as this one, all too often result in the creation of an “iron curtain” of sorts, dividing the seemingly borderless Internet.
The impact of corporate content regulation on local politics can be severe. The majority of popular social-media platforms belong to American companies, which means that their policies are at least inspired by United States law. In some cases, such as those involving copyright infringement or violent threats, platforms must comply with U.S. law. In others—such as when dealing with content from terrorist groups—the law is murky.
Facebook and other platforms appear to underpin their definition of “terrorism” with American law; specifically, by blocking the ability of U.S.-designated terrorist organizations to have a presence on their platforms. Though no company has been explicit about this, clues from media coverage suggest that Facebook is doing so out of a potential misinterpretation of so-called material support statutes that prevent American citizens (and companies) from providing “material support” to terrorist organizations.
In a diverse global environment, the old adage “one man’s terrorist is another man’s freedom fighter” carries some weight.
In several countries, groups that the U.S. designates as terrorists are legitimate political actors, active in local or national legislatures. In Lebanon, Hezbollah—which the United States designated as a terrorist organization after its attack on Marines in the 1980s—functions as a political party, with members elected to parliament and serving in the cabinet.
Like al-Qaeda and ISIS, Hezbollah cannot easily have a presence on Facebook—it is banned by a set of corporate regulations that restrict use of the platform by “dangerous” organizations. Other Lebanese political parties can utilize the platform as they wish, including to campaign for elections. Though inadvertently, Facebook is, in a sense, privileging these parties by banning their opponents. The implications of this for a country where several parties run candidates who are also accused of war crimes are myriad.
As people have come to rely increasingly on corporate social platforms for their daily dose of politics, humor, and social interaction, many fail to notice the lack of neutrality with which these companies actually operate. Like the arbiters of speech that preceded them—governments, churches, and the like—companies are led by individuals who bring to the table their own worldviews; in the case of Silicon Valley companies, that worldview is often American and male. A policy that allows for violent content but bans nudity, for example, follows in the tradition of American film and broadcast-television regulations. In fact, while many treat online social spaces like the proverbial town square, they are actually more like shopping malls, privately owned and authorized to restrict content however their owners deem appropriate.
One needn’t imagine how, in the wrong hands, such freedom could have a real-world effect on geopolitics: In 2011, Google saw this firsthand when its Map Maker project—intended to allow users to add content to Google Maps in countries where robust cartography might not exist—enabled Syrian opposition activists to change the names of streets and landmarks to ones that reflected their goals. Though the changes were eventually reversed, the incident demonstrated how casually companies sometimes handle these issues.
For better or worse, many have come to rely on these platforms to provide up-to-the-minute information on everything from the weather to the current state of the world. It is therefore prudent to remember that, behind the veneer of impartiality, companies like Google, Twitter, and Facebook are staffed by real people, who come with their own biases and their own worldviews. The decisions that they make can have real-world impact.