Communication technologies affect the rate at which people lie, and remote television studios may actually encourage deception
People lie a lot. But the rate at which we lie is not constant. The technology we use to talk with each other influences how often we try to deceive one another.
That is to say, the medium -- phone, IM, email, Facebook -- changes the amount of deception that occurs relative to face-to-face interactions. Cornell researcher Jeffrey Hancock has found that people lie substantially more over the phone than they do in person and less over email and IM. A few key factors affect the rate of lying:
- People lie more during real-time (synchronous) interactions.
- People lie more when they are not in the same room with the people they're lying to.
- People lie less when an interaction is easily recorded.
This is why the phone is the favored technology of the liar: it's synchronous, long-distance, and not easily recorded. Interestingly, it doesn't seem to matter whether people are particularly good or bad at detecting lies in a given medium. That could be because everyone is pretty bad at detecting deception in every medium, Hancock said.
This emerging body of research got me thinking about a ubiquitous feature of our media landscape: the pundit remote television hit. This strange technological system, in which experts are piped onto a TV news program from boxy little studios around the world, has a lot of the features of the phone call, which may make it easier for talking heads to lie regardless of their political persuasion.
This isn't an attack on Fox or NBC, but the system as a whole. In order to make a certain kind of television cheaper, the news shows may have built a system that encourages (or at least doesn't discourage) deception.
Here's what it's like to do a television hit from an "insert studio." You walk into a room where a cameraman or handler guides you into a chair, which is strategically positioned in front of a screen of some type. For viewers, the screen will render as a photo or video feed of an iconic scene from your city. A tiny earbud is inserted into your right ear and a mic is clipped onto the left side of your tie. You look directly into a camera under bright lights. You can't see the show that you're going to be on. Your only connection to it is the sound coming into your earbud. You stare into the camera and talk. Sometimes there isn't even anyone else in the room with you.
As a technological system, it's not the very worst that you could imagine for encouraging deception. (That's the phone, remember.) The video is recordable. But it's also not generally publicly available in a searchable format, as many text communications are. Imagine if blog commenters were out there critiquing pundit talking points in real time, the way they critique our posts.
Hancock also noted that it may be difficult for people to viscerally believe that they're being recorded because of the isolated nature of the talking-head studio. "There isn't a lot of feedback and there is very little sense of recordability," he said. "So, you get people potentially more willing to be deceptive or at least uninhibited."
It seems as if you're just talking into a camera -- not to all the people watching the video feed -- which can have some other strange effects on your behavior. "We see this with stuff like Girls Gone Wild, where women do these things that they would never be caught doing normally," Hancock said. "You'd think the camera would make people wary, but it actually primed the performative sense."
If it is true that some technological systems encourage truth-telling and others discourage it, we should redesign this system. Why make it easier for pundits of all stripes to lie?
The first step would be public, fully searchable transcripts of everything said on political news shows. You could even show the real-time transcription to the person doing the television hit in the insert studio. Then they would know that what they were saying was going into the permanent record.
It's important to note that this lying problem isn't about technology generally. Hancock's latest research finds that people's LinkedIn resumes are more likely to be truthful than the ones they create in Word. The network, he hypothesizes, acts as a constraint on people. For the same reason, he thinks Facebook is a "fairly honest space," so much so that he believes "Facebook could have, in the short term, an honesty effect."
Perhaps in the future, pundits will have to duke it out *on Facebook* and the truth-telling pushes and pulls will balance each other out, forcing them to be just as honest as they would be while standing in front of you.
Image: The Beyondpix studio in San Francisco, where I once sat staring into space.