Much has been made of the existence of “filter bubbles,” the information feedback loop in which our preferences and viewpoints are continually amplified. This can happen in the analog world—how many of us would go out of our way to actually spend time with people whose worldviews are radically different from ours?—but is perhaps most often referenced as an artifact of our digital lives. There, through sophisticated recommendation algorithms, we are generally, if not always, shown the materials we are most likely to like and, at worst, least likely to hate, so as to initiate, or at least sustain the possibility of, future click-throughs and extended visits to a website.

It has been suggested that filter bubbles were at least partially responsible for the election of Donald Trump, engendering an environment of optimism and overconfidence in the Democratic faithful when in fact the sky was falling around them. Moreover, life in the Democratic filter bubble was presumably not only feeding happy election news to the constituents, it was also possibly keeping out news of the large cohort of disaffected Democrats who were either not energized enough to get to the polls or angry enough to switch parties, sit out the election, or come out for the first time in a long time for a populist and hatemongering candidate. In short, after the election, the words you would hear echoing around the filter bubble were: “Who knew?”

Technology contributed to the building of ideological walls, but it can also help knock them down. Let’s start with Google, most people’s portal to the world’s information. The days of simple “PageRank,” the anodyne algorithm based almost entirely on the link structure of the web, are long gone. Search now depends on a host of other variables, related to the phenomenon (and niche information industry) of “search engine optimization,” the very name of which tells you that searching is something that can be gamed. That said, what if Google—or any web process that returns or pushes information to you—gave you access to a simple dashboard that would allow you to experience the information environment as your different-minded digital twin would?
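For readers curious what that original link-based ranking looked like, here is a minimal sketch of the classic PageRank idea: a page’s importance comes almost entirely from which pages link to it. The four-page “web” below is invented for illustration, not real data.

```python
# A toy web: each page maps to the pages it links to (hypothetical data).
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Classic power-iteration PageRank over a dict-of-outlinks graph."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page gets a small baseline, plus shares from its in-links.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

ranks = pagerank(links)
# Page "c" ends up ranked highest: three of the four pages link to it.
```

Notice that nothing about the page’s content, or about you, enters the computation—which is exactly what modern, personalized search has moved away from.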

Think of your digital representation as a point in space—which is in fact how many of these companies represent you, but in a space that has possibly hundreds if not thousands of dimensions! This idea would show you what digital life looks like from the other side of that universe. The Wall Street Journal conducted a small-scale experiment of this nature with a program that generated side-by-side views of liberal and conservative Facebook feeds.
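The point-in-space idea can be made concrete with a small sketch. The three-dimensional “profiles” below are entirely invented (real systems use far more dimensions and proprietary features); the point is that once users are vectors, “the other side of the universe” is just the profile least similar to yours.

```python
import math

def cosine_similarity(u, v):
    """Standard cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Hypothetical 3-dimensional user profiles (real systems use hundreds
# or thousands of dimensions, and features you never see).
profiles = {
    "you":      [0.9, 0.1, 0.4],
    "neighbor": [0.8, 0.2, 0.5],
    "antipode": [-0.9, -0.1, -0.4],
}

def most_different(me, others):
    """Return the profile least similar to yours: your digital twin
    from the other side of the universe."""
    return min(others, key=lambda name: cosine_similarity(profiles[me], profiles[name]))

# most_different("you", ["neighbor", "antipode"]) → "antipode"
```

A “different-minded avatar” feature would then amount to feeding the recommender the antipode’s vector instead of your own.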

Companies could do this without giving you a look under the hood—in other words, you wouldn’t have to know precisely how the algorithm works—by giving you a few knobs to turn to potentially change your trip through the digital universe.

What if, alongside Google’s list of “top sites,” you were given a list of randomly chosen sites from the tail? Doing so might even give Google or another vendor a way to broaden your tastes. When it came to delivering news, you might find yourself exposed to sites and sources that you would never come into contact with during your daily information strolls. You might find yourself, if but for a moment, walking in another person’s digital shoes.
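The head-plus-tail idea is simple enough to sketch. The ranked list below is hypothetical; the mechanism is just sampling a few results, without replacement, from everything below the usual cutoff.

```python
import random

def results_with_tail(ranked_sites, top_n=3, tail_picks=2, seed=None):
    """Return the usual top results plus a few random picks from the tail."""
    rng = random.Random(seed)  # seed only to make the sketch reproducible
    head = ranked_sites[:top_n]
    tail = ranked_sites[top_n:]
    serendipity = rng.sample(tail, min(tail_picks, len(tail)))
    return head + serendipity

# A hypothetical ranking: site1 is the top result, site100 the last.
ranked = [f"site{i}" for i in range(1, 101)]
results = results_with_tail(ranked, seed=0)
# results keeps the familiar top three and appends two tail surprises.
```

The design choice worth noting: the head stays deterministic, so the feature broadens exposure without degrading the results people rely on.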


This article is part of a collaboration with the Markkula Center for Applied Ethics at Santa Clara University.