Is the Problem With Tech Companies That They're Companies?

A Stanford professor argues that a profit imperative is in tension with the needs of a democratic society.

The headquarters of Facebook in Menlo Park, California (Noah Berger / Reuters)

What news do people see? What do they believe to be true about the world around them? What do they do with that information as citizens—as voters?

Facebook, Google, and other giant technology companies have significant control over the answers to those questions. It’s no exaggeration to say that their decisions shape how billions see the world and, in the long run, will contribute to, or detract from, the health of governing institutions around the world.

That’s a hefty responsibility, but one that many tech companies say they want to uphold. For example, in an open letter in February, Facebook’s founder and CEO Mark Zuckerberg wrote that the company’s next focus would be “developing the social infrastructure for community—for supporting us, for keeping us safe, for informing us, for civic engagement, and for inclusion of all.”

The trouble is not a lack of good intentions on Zuckerberg’s part, but the system he is working within, the Stanford professor Rob Reich argued on Monday at the Aspen Ideas Festival, which is co-hosted by the Aspen Institute and The Atlantic.

Reich said that Zuckerberg’s effort to position Facebook as committed to a civic purpose is “in deep and obvious tension with the for-profit business model of a technology company.” The company’s shareholders are bound to be focused on increasing revenue, which in Facebook’s case comes from user engagement. And, as Reich put it, “it’s not the case that responsible civic engagement will always coincide with maximizing engagement on the platform.”

For example, Facebook’s news feed may elicit more user engagement when the content provokes some sort of emotional response, as is the case with cute babies and conspiracy theories. Cute babies are all well and good for democracy, but conspiracy theories aren’t. Tamping down on them may lead to less user engagement, and Facebook may find that its commitment to civic engagement is at odds with its need to increase profits.

The idea that a company’s sole obligation is to its shareholders comes from a 1970 article in The New York Times Magazine by the economist Milton Friedman called “The Social Responsibility of Business Is to Increase Its Profits.” In it, Friedman argued that if a corporate executive tried to pursue any sort of “social responsibility” (and Friedman always put that in quotes), he was in a sense betraying the shareholders who had hired him. Instead, he must solely pursue profits, and leave social commitments out of it. Reich says that these ideas have contributed to a libertarian “background ethos” in Silicon Valley, where people believe that “you can have your social responsibility as a philanthropist, and in the meantime make sure you are responding to your shareholders by maximizing profit.”

Reich believes that some sort of oversight is necessary to ensure that big tech companies make decisions that are in the public’s interest, even when it’s at odds with increasing revenue. Relying on CEOs and boards of directors to choose to do good doesn’t cut it, he said: “I think we need to think structurally about how to create a system of checks and balances or an incentive arrangement so that whether you get a good person or a bad person or a good board or a bad board, it’s just much more difficult for any particular company or any particular sector to do a whole bunch of things that threaten nothing less than the integrity of our democratic institutions.”

Reich said that one model for corporations might be something like the ethics committees that hospitals have. When hospitals run into complicated medical questions, they can refer the question to an ethics committee whose members—doctors, patients, community members, executives, and so on—represent a variety of interests. That group dives deeply into the question and comes up with a course of action that takes into account the various values they prize. It’s a complicated, thoughtful process—“not an algorithm where you spit out the correct moral answer at the end of the day,” Reich said.