Recently I recommended that you check out Google Photos if you have not done so already. Like Gmail, it’s a way to store huge quantities of digital material and leave its management to someone else. (I promise, later we’ll get into the privacy tradeoffs involved.) And much more than Gmail, it offers big-data tools that can arrange and transform your photos in ways that would be difficult or impossible to do by yourself.
For instance: I mentioned that Google Photos had, on its own, merged three smartphone snapshots of a scene at Oxford into one panorama view. Several people wrote in to say: Let’s see the originals! So here goes.
First, in its full-frame entirety, a smartphone snapshot of one side of the entry quad at The Queen’s College, Oxford.
Then two almost-identical shots of the other side, both in full frame. First this:
The point is that without my doing anything more than saving all three shots to Google Drive, the system recognized them as overlapping parts of a whole and stitched them together into a high-rez, level-horizon panorama, looking like this (and at larger scale here):
Even when zooming in on the composite shot as far as possible, I still can’t find a pixelated boundary where the shots were joined.
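Google hasn’t published the details of its stitching pipeline, but the core idea it relies on, finding where two shots overlap and blending the shared region, can be sketched in a few lines. Here is a toy illustration in NumPy (my own sketch, not Google’s method), assuming two grayscale image strips that differ only by a horizontal offset:

```python
import numpy as np

def find_overlap(left, right, min_overlap=4):
    """Find the overlap width where `right` best continues `left`.

    Compares the trailing columns of `left` against the leading columns
    of `right` and returns the width with the smallest mean-squared
    difference.
    """
    best_width, best_err = min_overlap, float("inf")
    max_width = min(left.shape[1], right.shape[1])
    for width in range(min_overlap, max_width + 1):
        err = np.mean((left[:, -width:] - right[:, :width]) ** 2)
        if err < best_err:
            best_err, best_width = err, width
    return best_width

def stitch(left, right):
    """Composite two overlapping strips, averaging the shared columns."""
    width = find_overlap(left, right)
    blended = (left[:, -width:] + right[:, :width]) / 2.0
    return np.hstack([left[:, :-width], blended, right[:, width:]])

# Cut a "scene" into two overlapping strips and reassemble it.
scene = np.arange(60, dtype=float).reshape(5, 12)
panorama = stitch(scene[:, :8], scene[:, 4:])
```

Real stitchers (OpenCV’s `cv2.Stitcher`, for example) also correct for rotation, lens distortion, and exposure differences, which is why the seams in the Google composite stay invisible even at full zoom.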
We all say, in our blasé way: Yeah, yeah, big-data systems keep getting more powerful. At least for me, seeing one work on my own information dramatized these effects. To be clear, this was with three quick, casual phone shots taken over a few-second span. The result isn’t anything fancy, but it’s different from what I could have done myself.
And, as I say, we’ll get to the surveillance-state ramifications soon.
Actually, why not now. Here’s one reader response:
In the vein of the glass being neither half-full nor half-empty, but having a leak, that [big-data] magic means just as much that a computer can figure-out where you have been and when based on the photos you take, making it that much easier for a human being with access to that computer to know where you have gone and when.
The photo taker providing in essence CCTV of their movements. At least though unlike CCTV (and of course all that magical facial recognition) if the photo taker stops taking/posting the photos the intelligence stream stops.
Two quick updates on themes I’ve mentioned over the years.
Hit: Google Photos. You may well already have started using this service — some 100 million people have done so since its debut early this year. If you haven’t, by all means check it out. It is the closest thing I’ve had to the feeling of magic in online life in a very long while.
This review a few months ago, by Casey Newton in The Verge, gives you the main idea. The title, “How Google solved our photo backup nightmare,” covers one main feature. Just as Gmail long ago became the place where it was easiest and most efficient to store, arrange, retrieve, and otherwise handle electronic messages, Google Photos finally seems as if it can be the answer for the ever-mounting volume of digital images. Yes, I’m aware that Google is making use of the vast raw data users entrust to it. Newton’s piece, and another in The Verge by Ryan Gantz, explain why they think (as I do) that the tradeoff is worthwhile.
Beyond the storage-dump aspect is the application of big data in ways that are sometimes creepy but more often useful and even astonishing. This past summer I took a few camera-phone snapshots at The Queen’s College in Oxford, where my wife and I were married long ago. The next time I logged into Google Photos, it had, unbidden, aligned and assembled the patches into the composite panorama you see above, or here. Pictures taken in the modern geo-tagging age can of course be matched to locations. But based on the images alone it has also gone through and grouped old photos by location — giving me, for instance, a collection of pictures taken in Duluth, Minnesota in 2002, or another from Shanghai a few years later.
As Gantz says:
The service delights by offering me presents. As photos upload, Google Photos is processing old pictures I’ve forgotten about, including images that I’ve assumed were unremarkable or superfluous, and assembling them into collages, animations, and experiences that I wasn’t aware I wanted. “Assistant” offers me its creations and politely asks if I want to dismiss them or add them to my library. Like an opening of Timehop, these little creations can be surprising and lovely.
It’s hard to appreciate this feature until you experience it. I keep eagerly checking Google Photos notifications on my phone, excited about what Assistant has crafted from my digital trail. I find animations of my children playing on the grass, a collage of my wife giggling, a trip to Austin rendered as a slide show.
Let me emphasize the “hard to appreciate until you’ve seen it” point. For instance, here is a GIF animation of a visit to the Southern Tier brewery near Chautauqua in August, which Google Photos auto-created from a set of phone snapshots.
Judge for yourself, but certainly give it a try.
Miss: Livescribe Pen. I’ll try to make this concise, because I’m writing to amend the record rather than to beat up on anyone.
Starting six years ago, I frequently sang the praises of the Livescribe pen in this space. When it appeared, Livescribe was another seemingly magical step forward: it matched the notes you made in a special notebook with the audio recording it was making at the same time. Later on, you could simply click on the notes you’d made — during an interview, at a lecture, in a language lesson — and hear that exact part of the recording played back.
The system indeed worked like magic — when it worked. But over the years, I have come to mistrust successive Livescribe models because in the real world, for me, they simply failed too often.
A pen would suddenly and unobtrusively turn itself off during an interview, so that when it was over I saw that I had captured the first 10 minutes of discussion but not the next hour. The first time this happened, I thought it was bad luck. By the fourth time, I’d lost faith. Other sessions recorded all the way through — but then proved to be corrupted and unreadable. With the plain old cheapo Olympus and Sony digital recorders I’d used before, I lacked the fancy features but had never lost information. After another data loss about a year ago, I (regretfully) stopped using Livescribe and switched back to the humble pocket recorders.
My friends at Livescribe tell me that my problems represent an unfortunate outlier experience. Maybe so. But many of my journalist friends say that they’ve had problems like mine.
The Livescribe company was recently taken over by a Swedish firm. I wish everyone there the best, and I wish for a reliable version of this pen. If you’re using one, especially if you tried it on my suggestion, I hope that it’s holding up well for you. But having repeatedly gone on record saying that I used it, I wanted to close the loop by explaining why I don’t any more.
To get better sleep, stop treating it like a chore.
“How to Build a Life” is a weekly column by Arthur Brooks, tackling questions of meaning and happiness.
For many people, the cruelest part of daily life is the transition between wakefulness and sleep. When you should be sleeping, you want to be awake; when you should be awake, you want to stay asleep. It is easy to regard sleep as a torment: hard to attain and then hard to give up, day after day after day.
According to the CDC, about 70 million Americans have chronic sleep problems. Insomnia affects between a third and a half of U.S. adults at one point or another. And we Americans are not unusually afflicted—one 2016 study reported that worldwide, 10 to 30 percent of the population experiences insomnia; some studies find rates as high as 50 to 60 percent.
The G7 summit was stuck in time, between the era of Trump and the future.
Somewhere in China, a company recently received an order for boxes and boxes of reusable face masks with G7 UK 2021 embroidered on them. Over the weekend in Cornwall, in southwest England, these little bits of protective cloth were handed to journalists covering the 2021 summit of some of the world’s most powerful industrial economies—so they could write in safety about these leaders’ efforts to contain China.
The irony of the situation neatly summed up the trouble with this year’s G7 summit. The gathering was supposed to mark a turning point, a physical meeting symbolizing not only the beginning of the end of the coronavirus pandemic but also a return to something approaching normalcy after the years of Donald Trump and Brexit. And in certain senses it was. With Joe Biden—the walking embodiment of the traditional American paterfamilias that Trump was not—no one feared a sudden explosion or American walkout as before. Biden is not the sort of person to hurl Starbursts at another leader in a fit of pique. And yet, the reality was that the leaders in attendance were playing their diplomatic games within tram lines graffitied on the floor largely by the former U.S. president, not the incumbent one.
High-income workers at highly profitable companies will benefit greatly. Downtown landlords won’t.
This year, two international teams of economists published papers that offer very different impressions of the future of remote work.
The first team looked at an unnamed Asian tech company that went remote during the pandemic. Just about everything that could go wrong did go wrong. Working hours went up while productivity plummeted. Uninterrupted work time cratered and mentorship evaporated. Naturally, workers with children at home were the worst off.
The second team surveyed more than 30,000 Americans over the past few months and found that workers were overwhelmingly satisfied with their work-from-home experience. Most people said it exceeded their expectations. “Employees will enjoy large benefits from greater remote work” after the pandemic, the paper’s authors predicted. They said that productivity would surge in the post-pandemic economy, “due to re-optimized working arrangements” at some of the economy’s most successful white-collar companies.
People in the United States no longer agree on the nation’s purpose, values, history, or meaning. Is reconciliation possible?
Nations, like individuals, tell stories in order to understand what they are, where they come from, and what they want to be. National narratives, like personal ones, are prone to sentimentality, grievance, pride, shame, self-blindness. There is never just one—they compete and constantly change. The most durable narratives are not the ones that stand up best to fact-checking. They’re the ones that address our deepest needs and desires. Americans know by now that democracy depends on a baseline of shared reality—when facts become fungible, we’re lost. But just as no one can live a happy and productive life in nonstop self-criticism, nations require more than facts—they need stories that convey a moral identity. The long gaze in the mirror has to end in self-respect or it will swallow us up.
Our son needs structure, but he also needs to unwind. What should we prioritize?
Editor’s Note: Every Tuesday, Abby Freireich and Brian Platzer take questions from readers about their kids’ education. Have one? Email them at email@example.com.
Dear Abby and Brian,
Everything feels untenable. I am so frustrated for my son, whom I’ll refer to as “Caleb,” who is in first grade. I’m frustrated for his teachers too, and for me and my wife. Caleb is on the verge of tears by the time online school ends at 2:30, and, to be honest, so am I. His schedule is different every day, and he can’t read well enough to follow all the directions, so even though I am working and ignoring him most of the time, he interrupts me just often enough to make me seem unprofessional. After his day is done, we let him watch TV until my wife or I can stop working, which is around 5 o’clock most days. This means that one of us has about an hour with Caleb before bath, dinner, and bedtime.
“Scientists are meant to know what’s going on, but in this particular case, we are deeply confused.”
Carl Schoonover and Andrew Fink are confused. As neuroscientists, they know that the brain must be flexible but not too flexible. It must rewire itself in the face of new experiences, but must also consistently represent the features of the external world. How? The relatively simple explanation found in neuroscience textbooks is that specific groups of neurons reliably fire when their owner smells a rose, sees a sunset, or hears a bell. These representations—these patterns of neural firing—presumably stay the same from one moment to the next. But as Schoonover, Fink, and others have found, they sometimes don’t. They change—and to a confusing and unexpected extent.
Schoonover, Fink, and their colleagues from Columbia University allowed mice to sniff the same odors over several days and weeks, and recorded the activity of neurons in the rodents’ piriform cortex—a brain region involved in identifying smells. At a given moment, each odor caused a distinctive group of neurons in this region to fire. But as time went on, the makeup of these groups slowly changed. Some neurons stopped responding to the smells; others started. After a month, each group was almost completely different. Put it this way: The neurons that represented the smell of an apple in May and those that represented the same smell in June were as different from each other as those that represent the smells of apples and grass at any one time.
Polls suggest the left will lose out in the city arguably leading the socialist revival in the United States.
Representative Alexandria Ocasio-Cortez, one of the most prominent progressive politicians in the country, warned last week that her hometown is at high risk of having a decidedly moderate mayor. Standing in New York’s City Hall Park to deliver a last-minute endorsement of Maya Wiley, a civil-rights lawyer who’d previously struggled to crack the top tier, Ocasio-Cortez urged the left to come together. “We have the candidates in the field, and it’s time for us to make a choice,” she said. “We cannot afford to sit on the sidelines. We can’t afford to not engage because of what could have been. We engage in the world that we have.”
The forces driving a likely moderate outcome in the June 22 Democratic primary are varied; many are specific to New York and to this election. But the race also contains major warning signs for progressives across the country. If the left loses out in the city arguably leading the socialist revival in the United States, it will be, at least in part, because of dramatic infighting fueled by rigid positions on sexual and social-justice politics, as well as the generalized failure to unify behind one candidate alluded to by Ocasio-Cortez.
The narrative that nonwhite people will soon outnumber white people is not only divisive, but also false.
In recent years, demographers and pundits have latched on to the idea that, within a generation, the United States will inevitably become a majority-minority nation, with nonwhite people outnumbering white people. In the minds of many Americans, this ethno-racial transition betokens political, cultural, and social upheaval, because a white majority has dominated the nation since its founding. But our research on immigration, public opinion, and racial demography reveals something quite different: By softening and blurring racial and ethnic lines, diversity is bringing Americans together more than it is tearing the country apart.
The majority-minority narrative contributes to our national polarization. Its depiction of a society fractured in two, with one side rising while the other subsides, is inherently divisive because it implies winners and losers. It has bolstered white anxiety and resentment of supposedly ascendant minority groups, and has turned people against democratic institutions that many conservative white Americans and politicians consider complicit in illegitimate minority empowerment. At the extreme, it nurtures conspiratorial beliefs in a racist “replacement” theory, which holds that elites are working to replace white people with minority immigrants in a “stolen America.”
This article was published online on June 14, 2021.
On the morning of May 25, 2019, a food-safety inspector at a Cargill meatpacking plant in Dodge City, Kansas, came across a disturbing sight. In an area of the plant called the stack, a Hereford steer had, after being shot in the forehead with a bolt gun, regained consciousness. Or maybe he had never lost it. Either way, this wasn’t supposed to happen. The steer was hanging upside down by a steel chain shackled to one of his rear legs. He was showing what is known in the euphemistic language of the American beef industry as “signs of sensibility.” His breathing was “rhythmic.” His eyes were open and moving. And he was trying to right himself, which the animals commonly do by arching their back. The only sign he wasn’t exhibiting was “vocalization.”
Simone Biles is the greatest athlete in the world today.
For me, this isn’t a debate. It’s a statement of fact. On Sunday, she won a record seventh United States gymnastics championship, continuing her jaw-dropping winning streak in every all-around competition she’s entered since 2013. The 24-year-old hasn’t lost in eight years. Typical gymnasts her age aren’t beating all their rivals by the big margins that, for Biles, have become routine.
Although Tom Brady won his seventh Super Bowl at age 43, he is no longer in his prime, and other Super Bowl–winning quarterbacks, including Patrick Mahomes and Aaron Rodgers, are arguably more physically talented. Unlike the current greats in other sports, Biles has no peer. Serena Williams is the greatest female tennis player of all time and among the greatest athletes of all time, but her career is winding down, and Naomi Osaka is in position to unseat her as the face of women’s tennis. LeBron James won’t get a chance to defend the NBA title he won with the Los Angeles Lakers last season, because the Phoenix Suns eliminated his team in the first round of this year’s playoffs.