Apple CEO Tim Cook introduces the iPhone 6S at an event in September 2015. (Stephen Lam / Getty Images)

Nearly a decade ago, Steve Jobs strode out onto a dark stage in San Francisco to the final brass hits of James Brown’s “I Feel Good.” He greeted the audience with a sly smile. “Thank you for coming,” he said. “We’re going to make some history today.”

In the grandiose style that had come to define his keynote speeches, Jobs told the crowd he’d been waiting for this moment for more than two years. “Every once in a while, a revolutionary product comes along that changes everything,” he said to the packed audience, before revealing the first-ever iPhone to sustained applause. Over the next hour, he’d go on to describe the device and its features as “revolutionary” 14 times. “It works like magic,” he said of the phone’s screen, reaching for another of his favorite ways to talk about new Apple products.

For decades, those sorts of declarations were exactly what excited technology consumers (and investors). People were looking for the next big thing, for “disruptions” that would upend their daily routines. Personal technology was pitched as sorcery; the more it broke with the past, the more impressive it would seem. The packaging for Windows 3.1, an operating system Microsoft released in 1992, included a VHS tape. The tape begins with a young Bill Gates—tie askew, wearing glasses the size of coasters—gazing into the camera. “In this video, you’re going to see the future,” he says.

Today, those sweeping declarations of revolution might seem out of place. When large companies announce a product or service, they often hedge its newness with assurances of familiarity and ease of use, so as not to alienate consumers put off by the prospect of learning something completely different. Companies sell evolutions instead of revolutions, promising a product that can make your life easier without making you learn a new language.

This new normal is apparent in Twitter’s latest announcements. The service is bleeding users—it lost two million in the last three months of 2015—and it has bent over backward to bring on new ones. In October, Twitter introduced Moments, a feature that organizes important tweets by topic; a few months later, it began testing a change that would show popular tweets higher in users’ timelines, out of chronological order. Those features aren’t aimed at avid Twitter users; they’re meant to make the product friendlier to newcomers who may have previously been too intimidated to jump into the fast-moving network.

Among longtime Twitter users, the backlash to the timeline change was swift and furious, even though they could opt out. Users wave the same pitchforks every time Facebook tweaks its design, briefly flooding the site with threats to delete their accounts unless it reverts to its previous state. (It never does, and people quickly forget.)

Why this rabid aversion to change? For one, the target audience for mainstream products and services has broadened considerably. The cost of computers—desktops, laptops, smartphones, you name it—has dropped to the point that the overwhelming majority of people have access to one or more devices. The average tech user is far less savvy than before.

The technology market’s shift toward personal tech started many years ago, but even once laptops and MP3 players began to spread, they were too expensive and finicky to be ubiquitous. Even the iPod, which was eventually crowned king of the portable-audio empire, sold only about 400 million units in the 15 years since the first one was introduced. By contrast, Apple will likely sell its billionth iPhone this summer. With that sort of reach, even a change that drives away a sliver of the user base can have a huge impact: if just one half of one percent of a billion users is unsatisfied, that’s still five million angry people.

The proliferation of hyper-personal technology—social networks that house thousands of family photos; handheld devices that keep all of a person’s most valuable secrets—has also changed the relationship users have with the machines that surround them. The furious rebellion of a Facebook user whose homepage just got rearranged is a far cry from the professional frustration that a photographer might feel when her favorite photo-editing software loses a powerful feature. It’s a much more personal hurt.

A Facebook user who offers up his life to the site, sharing his happiest and most distressing moments, may feel entitled to a say in the way the service looks and works. Over the years, the user has developed an idea of what Facebook “should” be, and feels betrayed when it pulls the rug out from under him. In truth, of course, he represents a minuscule fraction of the enormous user base that Facebook depends on for revenue.

Because tech users can seem so hostile to change, some companies have gotten good at tricking people into learning and liking a new feature. Take, for example, 3D Touch, the marquee feature of Apple’s latest iPhone that allows users to press harder on their screens to activate different features.

Apple CEO Tim Cook introduced it on stage as a “tremendous breakthrough,” but then he focused the rest of his speech on demonstrating how the feature would save users a few taps and swipes here and there. “No matter what you like to do with your phone, 3D Touch makes it better than ever,” Cook said.

My colleague Adrienne LaFrance wrote about this feature shortly after it was announced. Much of the internet immediately complained about its “uselessness,” but Adrienne took the long view. “We’re only just beginning to see what pressure-sensitive screens will mean for how people use phones,” she wrote. “And a lot of that is because developers are still figuring out what to do with the technology.”

Fast-forward about six months (you can press down harder on your screen to speed up) and the prediction begins to flower. After the feature was introduced, developers updated their apps and games to use 3D Touch in creative ways, but the biggest change finally came from Apple, which maintains complete control over the iPhone’s operating system. When Cook announced iOS 10 earlier this month, he showed off a new lock screen populated by interactive notifications. How are they invoked? With 3D Touch.

With Cook’s announcement, the feature was suddenly made relevant. Users who upgraded last fall have spent half a year getting used to the idea of 3D Touch, even if many didn’t adopt it for daily use. (Many of those who didn’t upgrade probably played with it on their friends’ newer phones.) Armed with hours of practice, they now just have to learn one new thing—how the new lock screen works—to take full advantage of it. That’s a lot simpler than trying to digest a new lock screen interface at the same time as 3D Touch itself.

And perhaps the souped-up lock screen is itself a sleeper feature as well. Rumors have long swirled that Apple will do away with the round home button—the only physical button on the phone’s face. The new lock screen wakes when a user raises his or her phone from the table or a pocket—no need to press a button to bring it to life anymore. That small change is the latest in a string of developments that are pushing the home button to the brink of obsolescence. If Apple does eventually retire it, users will already be accustomed to interacting with their phones largely without it.

It’s easy enough to make incremental changes that feed users’ hunger for novelty while keeping them from fuming over an abrupt transition. But some technologies are so fundamentally different that there’s no way to ease someone into them. Take virtual reality, for example: The closest equivalent most people have experienced is a 3D movie, and those are pretty terrible. VR is one of those things you have to try to understand, and there’s no comparison.

But unlike handheld devices or social networks, VR in 2016 is still completely new to the mainstream user. Few will put on a headset for the first time with a well-formed idea of what to expect, so as long as the device doesn’t make them sick, they’ll probably be impressed by the immersive experience. The draw of VR is still that it’s new and different—revolutionary, even.

But VR technology is still in its infancy, and is improving dramatically nearly every month. In 10 or 15 years, when a VR developer tweaks a popular app’s interface to move a crucial menu from the top-left to the bottom-right of every user’s field of view, the world will groan in unison and post angry comments on whatever social-media platform is ruling the world at that moment. They’ll vow to throw their headsets in the trash. But they won’t.
