While we rarely think of it in this way, the payment system we use every day is among the most widespread and functional examples of an Internet of Things. It is an array of objects embedded with chips, magnetic stripes, scanners, and touchpads. These things are coordinated through networking protocols used to move information and, ultimately, monetary value.
In payment systems, as flights of imagination get grounded in real infrastructures, interoperability has gone hand in hand with technological inertia. Payment systems have to work, and they have to work everywhere. When you swipe your credit card, it works. No matter where you are in the U.S., if you have money or credit in physical or electronic form, you can pay for stuff.
The payments industry that develops and markets these systems is big business. But it is difficult to measure, since it is made up of so many different kinds of companies: companies that manufacture the plastic cards that become credit cards, companies that build all kinds of card readers, database companies that sit behind the major card networks, specialized banks that sit behind payment service providers, and so on. Outside of the industry, people don’t tend to notice payment systems. Like all infrastructure, they become visible only when they don’t work well, when a card is declined or someone starts writing a check at the front of the line at the grocery store. But every time we try to pay for something, small bits of value get transferred between actors in this network, invisibly.
Although it works pretty well most of the time, the payment system is far from uniform. It undermines any notion of linear technological progress. Payment technologies that were invented in the 1960s, like the credit card magnetic stripe, or the 1860s, like uniformly valued, nationally based paper currency, exist alongside technologies that haven’t been fully invented yet, like Coin, an as-yet unreleased product that permits you to store payment information from multiple cards on one device. During the same day, you might have your card carbon-copied with a zip zap machine, hear modem noises coming from an ATM when you withdraw cash, use a smartphone app to pay for coffee, and be offered the opportunity to pay for a manicure in Bitcoin.
The payment industry does a surprisingly good job at creating interoperability between these varied and uneven systems, but sometimes things go awry. Here’s an example. Earlier this year, Bill (who directs two research centers that deal with money and technology) flew from California (birthplace of the credit card!) to Amsterdam (birthplace of the modern financial bubble!) to attend a conference (on the future of money!) and immediately encountered a consequence of our uneven present money infrastructure. Despite all his bona fides, Bill’s actual practice of money and technology in everyday life makes him highly vulnerable to fraud. He used his debit card to withdraw euros from the airport lobby ATM, and the card was skimmed and cloned. Over the next couple of days, someone, not Bill, had a great time in Amsterdam on his dime.
In Europe, payment cards have included a microchip that prevents this kind of fraud for as long as two decades. Each time a card is inserted into a reader and the correct PIN is entered, the chip uses electricity from the terminal to generate a dynamic authorization code. The United States payment industry, however, has been slow to adopt this technology, continuing to rely on the magnetic stripe that was introduced in the late 1960s. The typical American credit card is notorious for being one of the most hackable payment technologies around. You don’t even need software to do it: just write down all the information printed on the card. Because he is an American and carries the standard American payment card, Bill was the perfect target for traps laid in the ATM of an international airport.
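The dynamic authorization code described above works, in essence, like a message authentication code computed over the details of each transaction: because the code changes every time, copying a card's static data (as a skimmer does) is not enough to replay it. The sketch below illustrates the principle only. Real EMV chips derive a session key and compute a 3DES-based cryptogram over standardized fields; here HMAC-SHA256 stands in for that MAC, and the key name and field layout are illustrative assumptions, not the actual EMV format.

```python
import hmac
import hashlib

def dynamic_auth_code(card_secret_key: bytes, amount_cents: int,
                      terminal_nonce: bytes, transaction_counter: int) -> str:
    """Illustrative stand-in for an EMV-style transaction cryptogram.

    A real chip card computes a 3DES MAC over standardized transaction
    fields using a derived session key; HMAC-SHA256 is used here purely
    to show the idea: the code depends on per-transaction data, so a
    cloned card cannot produce a valid code for a new transaction.
    """
    message = (amount_cents.to_bytes(6, "big")
               + terminal_nonce
               + transaction_counter.to_bytes(2, "big"))
    return hmac.new(card_secret_key, message, hashlib.sha256).hexdigest()[:16]

# Two transactions with the same card yield different codes, because the
# terminal's nonce and the card's transaction counter differ each time.
key = b"issustrative issuer-provisioned card key"  # hypothetical key material
code1 = dynamic_auth_code(key, 4200, b"\x01\x02\x03\x04", 1)
code2 = dynamic_auth_code(key, 4200, b"\x05\x06\x07\x08", 2)
assert code1 != code2
```

A magnetic stripe, by contrast, carries only static data, which is why the information a skimmer reads once can be written onto a blank card and reused indefinitely.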
Americans don’t have chip and PIN cards because of infrastructural inertia. When banks and interbank networks first introduced magnetic stripe payment cards, they also absorbed the cost of outfitting merchants with card readers. Later, independent companies offered merchants point-of-sale devices that also provided additional services like invoicing and account reconciliation. Once magstripe point-of-sale terminals were installed everywhere, who would pay the cost of adding chip and PIN terminals? The existing infrastructure hums along nicely—even though merchants are paying for it through card fees, and consumers are paying for it through the higher prices that reflect those passed-on fees. Infrastructural change happens slowly.
But lately, payment systems have become host to tremendous imagination. In our research, we talk to a lot of people trying to develop new payment technology. We hear a lot about the past and the future. One person said American payment systems today feel “a bit like watching Mad Men.” We’ve been told that chip and PIN “smartcards” are “sooo 90s,” basically already a relic that will never get traction in the U.S. and will no doubt be “leapfrogged” by something new, even though they work better than magstripe. Industry conferences have names like Money2020, implying a near-ish future that will deliver a radical change to the way we make change. “Physical currency will disappear!”
This is hardly a new promise. In the 1950s, the “cashless society” was as much a part of an idealized modern future as the jetpack and the flying car. Most of today’s dreams of a future without money harken back to a time when money was a commodity in the form of gold or silver—not a government promise. Consider all the new computer-generated “coins”: There are Bitcoins, Dogecoins, Auroracoins and other “alt coins” that make World of Warcraft’s virtual “gold” sound quaint. We hear that the “new gold” will be one new store of value or another: frequent flyer miles, reputation, personal data.