As we look back on the first 10 years of the new millennium, the ubiquity of the Internet and the growth of all its social networking possibilities, from email to Blackberries, iPhones and Facebook, is surely one of the most significant changes to emerge from the decade. Granted, the changes began a decade before that. But the past 10 years have seen a phenomenal boom in Internet access and usage ... up 1600% in the Middle East, 1300% in Africa, and more than 300% worldwide, so that Internet users now number more than half the total population of the world, according to Internet World Stats.
We can now easily connect with friends on different continents without waiting two weeks for a letter, talk via computer without the expense of international long distance, and share new-baby and other photos with hundreds of friends and relatives in a single posting, via personal Web sites and Facebook pages. My next-door neighbors, who are from Turin, Italy and just had a baby, even hooked up webcams around the house so the distant grandparents could watch live video of their new grandson playing, feeding, and sleeping from half a world away. With chat rooms, email, Skype, Facebook, and the worldwide Web community, the possibility of being isolated or without someone to "talk" to is far more remote.
But are there hidden costs to all this connectedness? Is it possible that for some, there is loneliness, not safety, in numbers? Two essays by William Deresiewicz in The Chronicle of Higher Education--one from last January, and one penned only a few days ago--argue that there is. In his most recent essay, Deresiewicz cites two studies, one from 1985 and one from 2004, that show a marked decline in the number of people who have a "close confidant." In 1985, only one out of 10 people said they lacked such a person in their life. By 2004, that number had climbed to four out of 10. And that was before so many blogs and social networking sites expanded the number of options (and distractions) for how we spend whatever social connection time we have.
So as we spend more time connecting to the world, it appears that at least some of us may be trading off depth for breadth. We are at once more connected and less connected, depending on how you look at it. But that's not the only impact that concerns Deresiewicz. In his essay from last January, entitled "The End of Solitude," he talks about the impact of constant connectivity on our comfort with being quiet and alone. Just as boredom comes from a discomfort with idle time, he argues that loneliness comes not from being alone per se, but from discomfort over being alone. Just as small children have to learn to put themselves to sleep, we have to learn how to be comfortable with being alone. And that takes practice ... practice that is far easier to avoid with all the distractions of constant connectivity.
The essays are an interesting read on the history of friendship, social values, and how evolving technology has affected our social connections, from the evolution of the suburbs to the advent of the Internet. And whether you agree with his assessment of Facebook and its impact on social connection, he raises some interesting and valid points.
Without question, there are certain elements that exist in inverse proportion to each other. An Olympic gold-medal athlete has deep expertise in one area, but generally trades off experience and knowledge in other subjects for that one field of excellence. You can go deep, or broad, but generally not both. Quality begins to degrade if increasing quantity is demanded in the same time frame. If you have 10 priorities, you really have none. The same goes for intimacy. Just ask anyone who's tried to balance multiple intimate relationships at the same time.
Friendship is less demanding than a more intimate and vulnerable romantic connection, but the same principle applies. The more times I've moved, and the more people I've met, the harder I've found it to keep up with all those friendships on any significant level. Acquaintances are easy to maintain with casual, group emails and holiday notes. But real friends? They take time and energy--both to develop, and to nurture or maintain.
Facebook, Twitter, group emails, texting and other mass communication and connection vehicles don't preclude anyone also taking the time and focus to develop a few deep friendships, any more than they preclude taking time to read, think, or get comfortable with yourself, alone. But they do throw more potential and tempting distractions into the mix, as well as a slightly guilty feeling that we should be keeping up with all those people. In our increasingly immediate, non-stop society, all of us struggle to find enough time for family and friends. And the more of that already-squeezed time anyone spends maintaining a broad network of Internet, text, Facebook and Twitter friends and updates, the less time and energy they have to devote to any one friend or person. It's just simple math.
Once upon a time, books and conversations were the only distractions we had. We also tended to stay in small, local communities, so we had years to develop ties with one small group of people. Is there a link between our moving away from those communities and the development of more media to assuage the loneliness and distance that ensued? I wouldn't be surprised if someone told me there was. But in any event, the media and distractions came. First radio and movies. Then TV. Then videos. Then video games, the Internet and the cell phone. For the past 50 years, there's been some passive way to avoid facing silence, alone with yourself, if you really wanted to.
At the very least, the increase in connection and distraction possibilities increases the need to make choices among all the options. There is no technology that can speed up the time it takes to have an intimate, personal, and unique conversation with a single friend. But it can increase the number of friends, past and present, with whom I could have those conversations, either via email or just through the reconnection magic of Internet searches. So the temptation is there to become scattered--and in trying to keep up with all, to end up keeping up well with none.
Does that mean that our friendships are in danger of becoming less deep, or that the increased distractions mean we've gotten worse at learning to be alone, in silence? Maybe. But only if we've allowed it. Avoiding scatteredness--in social connections, anyway--is simply a matter of prioritizing and letting go of things that are less important. And getting immersed in distractions is a choice. For those who are afraid of being alone, there have always been distractions. For those of us who recognize the value of silence and deeper connections, I doubt the advent of new technologies will suddenly change our craving for those things.
Indeed, as I've sat in a snow-bound Connecticut house, curled up with a bad cold the past few days, I've remembered again the beauty of a slower pace of living. One that allows for a long chat with an old friend, a well-developed thought, or the joy of spending time over a piece of writing not due two hours later. But I also love being able to keep in frequent touch with lifelong friends who live in Paris, in ways we never would if it took mailing international letters, instead of email, to connect.
As always, it's a matter of balance; of being master of the sorcery at our disposal, instead of letting it master us. Of course, balance itself is a skill that, like being comfortable with solitude or a deep friendship, requires patience, dedicated effort ... and evolves, in most cases, with age, experience, and time.
People labeled “smart” at a young age don’t deal well with being wrong. Life grows stagnant.
ASPEN, Colo.—At whatever age smart people develop the idea that they are smart, they also tend to develop vulnerability around relinquishing that label. So the difference between telling a kid "You did a great job" and "You are smart" isn't subtle. That is, at least, according to one growing movement in education and parenting that advocates for retirement of "the S word."
The idea is that when we praise kids for being smart, those kids think: Oh good, I'm smart. And then later, when those kids mess up, which they will, they think: Oh no, I'm not smart after all. People will think I'm not smart after all. And that's the worst. That's a risk to avoid, they learn. "Smart" kids stand to become especially averse to making mistakes, which are critical to learning and succeeding.
As he prepares for a presidential run, the governor’s labor legacy deserves inspection. Are his state’s “hardworking taxpayers” any better off?
This past February, at the Conservative Political Action Conference (CPAC) outside Washington, D.C., Wisconsin Governor Scott Walker rolled up his sleeves, clipped on a lavalier microphone, and without the aid of a teleprompter gave the speech of his life. He emerged from that early GOP cattle call as a front-runner for his party’s nomination for president. Numerous polls this spring placed him several points ahead of former Florida Governor Jeb Bush, the preferred candidate of the Republican establishment, in Iowa and New Hampshire. Those same polls showed him with an even more substantial lead over movement conservative favorites such as Ted Cruz, Rand Paul, and Mike Huckabee. In late April, the Koch brothers hinted that Walker would be the likely recipient of the nearly $900 million they plan to spend on the 2016 election cycle.
The untold story of the improbable campaign that finally tipped the U.S. Supreme Court.
On May 18, 1970, Jack Baker and Michael McConnell walked into a courthouse in Minneapolis, paid $10, and applied for a marriage license. The county clerk, Gerald Nelson, refused to give it to them. Obviously, he told them, marriage was for people of the opposite sex; it was silly to think otherwise.
Baker, a law student, didn’t agree. He and McConnell, a librarian, had met at a Halloween party in Oklahoma in 1966, shortly after Baker was pushed out of the Air Force for his sexuality. From the beginning, the men were committed to one another. In 1967, Baker proposed that they move in together. McConnell replied that he wanted to get married—really, legally married. The idea struck even Baker as odd at first, but he promised to find a way and decided to go to law school to figure it out.
Many authors have been tempted into writing revisionist histories of the 37th U.S. president, but these counterintuitive takes often do not hold up under closer scrutiny.
Every once in a while someone writes a book arguing that Richard Nixon has been misunderstood. These authors tend to focus on some particular aspect of his presidency that, the argument goes, is more important than that Watergate business. They’ve focused on his domestic policy or his foreign policy as achievements that override his flaws and his presidency’s denouement. Nixon’s highly complex persona also has led to books that probe his psyche—a hazardous and widely debunked practice, though that hasn’t discouraged further attempts.
And, as with other major figures, but all the more so given the drama of his time on the national stage, Nixon's complexity and essentially low repute tempt some authors to offer revisionist approaches to his place in history. Such approaches have to be assessed on their own merits, not accepted merely because they're counterintuitive or receive a lot of attention, as new assessments of the controversial and fascinating Nixon tend to do. Two major revisionist books about Nixon argued that his domestic policy was so expansive, humane, and innovative that it overrides his unfortunate behavior; their accounts relegate Watergate to a far less important role. The problem with these books is that they don't stand up to close scrutiny.
Mike Huckabee and Ted Cruz are suggesting there might be ways for states and cities to nullify the justices’ ruling. They’re wrong.
The Supreme Court’s decision last week did make gay marriage legal around the nation. Unfortunately for social conservatives, it did not make nullification legal around the nation.
Nullification is the historical idea that states can ignore federal laws, or pass laws that supersede them. This concept has a long but not especially honorable pedigree in U.S. history. Its origins date back to antebellum America, when Southern states tried to nullify tariffs and Northern states tried to nullify fugitive-slave laws. In the 1950s, after Brown v. Board of Education, some Southern states tried to pass laws to avoid integrating schools. It didn’t work, because nullification is not constitutional.
The social network learns more about its users than they might realize.
Facebook, you may have noticed, turned into a rainbow-drenched spectacle following the Supreme Court’s decision Friday that same-sex marriage is a constitutional right.
By overlaying their profile photos with a rainbow filter, Facebook users began celebrating in a way we haven't seen since March 2013, when 3 million people changed their profile images to a red equals sign—the logo of the Human Rights Campaign—as a way to support marriage equality. This time, Facebook provided a simple way to turn profile photos rainbow-colored. More than 1 million people changed their profile in the first few hours, according to the Facebook spokesperson William Nevius, and the number continues to grow.
“This is probably a Facebook experiment!” joked the MIT network scientist Cesar Hidalgo on Facebook yesterday. “This is one Facebook study I want to be included in!” wrote Stacy Blasiola, a communications Ph.D. candidate at the University of Illinois, when she changed her profile.
For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving. Could that be a good thing?
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.
Engineers at IBM and Google claim they're closer than ever to making computers that could process data in days that would take millions of years to flow through today's machines.
One of the first electronic, programmable computers in the world is remembered today mostly by its nickname: Colossus. The fact that this moniker evokes one of the seven wonders of the ancient world is fitting both physically and conceptually. Colossus, which filled an entire room and included dinner-plate-sized pulleys that had to be loaded with tape, was built in World War II to help crack Nazi codes. Ten versions of the mammoth computer would decrypt tens of millions of characters of German messages before the war ended.
Colossus was a marvel at a time when “computers” still referred to people—women, usually—rather than machines. And it is practically unrecognizable by today's computing standards, made up of thousands of vacuum tubes that contained glowing hot filaments. The machine was programmable, but not based on stored memory. Operators used switches and plugs to modify wires when they wanted to run different programs. Colossus was a beast and a capricious one at that.
The U.S. Supreme Court’s decision in Obergefell v. Hodges will shift the gay-marriage debate from a legal fight to a cultural and religious conflict.
God appears exactly twice in the U.S. Supreme Court’s decision in Obergefell v. Hodges, which held that gay marriage is a constitutional right in America. Both mentions come in Clarence Thomas’s dissent to the majority opinion: Once, he cites the 1754 writings of the English churchman Thomas Rutherforth, who argued that the only restraints on liberty are “the law of nature, and the law of God.” Toward the end, he also quotes the Declaration of Independence, noting that the Framers established their vision of rights based on the dignity bestowed upon them by their Creator.
God probably deserved a little more credit than he received in the footnotes of the case’s four dissents, also authored by Samuel Alito, John Roberts, and Antonin Scalia. In their own ways, each of the justices defended the “traditional” definition of marriage—or, in less veiled language, a certain Judeo-Christian understanding of marriage as an institution created and commanded by God.
The commonwealth is facing a serious debt crisis that could result in default, but that’s only part of the problem.
Updated on June 30, 2015
Puerto Rico is a small island with some big financial problems. Governor Alejandro Garcia Padilla recently told The New York Times that there was no way the island, which has been struggling with about $72 billion of debt, would be able to pay, and that it would instead try to work out new deals and deferred payments with some of its creditors. This, of course, has led to fears that the commonwealth will default on its loans.
The admission that Puerto Rico’s finances are much worse than originally thought was spurred by a report commissioned by the Government Development Bank, an agency tasked with developing economic and financial strategies for the commonwealth, and conducted by current and former IMF staffers. The report, nicknamed The Krueger Plan for its lead author Anne Krueger, doesn’t mince words when it comes to the outlook for the debt-laden island: “Structural problems, economic shocks and weak public finances have yielded a decade of stagnation, outmigration and debt. Financial markets once looked past these realities but have since cut off the commonwealth from normal market access. A crisis looms.”