Now, be thankful I work for a company that embodies the value of a spirit of generosity, because when I read that, I wasn't thinking very generous things. In fact, it is precisely the mission of the NCTC to connect dots. Right? I mean, who DIDN'T know that? Really? Who didn't know that?
Let's assume, for the moment, that the law does not give the director of NCTC, Michael Leiter, the "primary responsibility within the United States Government for conducting net assessments of terrorist threats," which it does.
The NCTC was set up precisely to solve the dot-connecting problem that the 9/11 Commission identified. The intelligence committee knows this. Congress knows this. The American public knows this. And the NCTC ... well, the NCTC is parsing language.
An intelligence official said that the 14 missed clues could easily be read as the 14 chances the intelligence community had to connect the dots and prevent the bombing attempt -- and failed. Fourteen chances!
It is infuriating to hear such a thing. It seems so obvious to those outside the circle that practicing responsibility and accountability would go a long way toward solving the communications issues that prevent a piece of data from getting from point A to point C, which might be the terrorist watch list. There will always be human judgments intervening, and the SSCI report points out that plenty of them were well-intentioned but ultimately mistaken.
But what the report really revealed, without making the conclusion explicit, is that every entity in the IC seemed to be going out of its way to avoid responsibility for making the call. For picking up the phone, stepping on someone's toes, and saying, "You know what ... something doesn't feel right about this guy." For sending e-mail after e-mail to people in other agencies urging them to check and recheck databases. For making TACTICAL decisions about immediate intelligence priorities.
(Within the past few months, DNI Dennis Blair has set up an analytical cell within the NCTC to evaluate tactical intelligence. Finally!)
The lack of a sense of urgency -- or what John Brennan, the president's chief counterterrorism adviser, calls "pulse" -- is astonishing and disheartening.
Not long ago, I asked a senior intelligence official to estimate the number of separate databases regularly used by entities that conduct counterterrorism missions. He thought for a moment, and said, "About 50." Do these databases talk to one another? Most of them don't. They don't interface. They don't update in real time. Many of them are sealed off from most analysts because of security classifications and turf wars. Yes, there are meetings and task forces designed to facilitate "interoperability."
But to those of us who watched someone nearly bring down a plane, it appears that no one takes responsibility for making sure, even at risk to his or her own career, that the damn bits in one server talk to the damn bits in another. Michael Leiter himself is well regarded by the intelligence community. He is trying. But the SSCI report finds explicitly that the NCTC "Failed to Fulfill Its Mission."
That is a damning indictment of a lot of people. It's an indictment of the entire structure of the Office of the Director of National Intelligence, which oversees the NCTC. It's an indictment of the CIA, which apparently still refuses to share key counterterrorism information with the NCTC. It's an indictment of Congress, which has never properly empowered the Director of National Intelligence. It's an indictment of Barack Obama's national security staff, which did not appreciate the magnitude of the problem until this incident. It's an indictment of a culture that still exists among the senior executives at many agencies. These executives are intelligence professionals, so they are able to mouth phrases like "need to share" and "work together," but when they get back to their desks, they're back in their silos.
It should worry Congress and those concerned about intelligence that the IC culture is broken.
The SSCI gave its report to the White House and the intelligence agencies two months ago, and an official told me last night that the IC had made progress implementing many of its recommendations. The new budget contains more authority for the DNI to make technical decisions more quickly, which should help with the database issues. A DNI official said that Blair "accepted" blame and is making necessary changes.
The report doesn't provide much detail on intelligence collection, which is par for the course. That stuff is sensitive. But reading between the lines, it appears as if the SSCI wanted to send another message: that the IC relies too heavily on electronic intelligence (ELINT) and signals intelligence (SIGINT) ... and not enough on finding, verifying, vetting, and running down nuggets of information from human sources. Human sources are very sensitive, and it is very hard to share information from them without disclosing their identities. And many younger IC analysts are trained to read through data, rather than to evaluate HUMINT, much less evaluate it in the context of everything else.
A few days after the Christmas Day incident, President Obama brought his intelligence cabinet together in the Situation Room and said, "I could fire each and every one of you." He did not do that. Instead he said that he would assess each agency's performance over the next several months. Everyone, essentially, was put on notice. Obama's reluctance to fire someone, particularly his director of national intelligence, grated on some in Congress, but they understood the difficulty: firing Adm. Blair would be a gesture, would make a dedicated patriot a scapegoat, and would compound the problem, not help solve it.
No one in government wants to be the DNI because everyone believes the job lacks one of the two fundamental ingredients for power in D.C.: access. But the DNI has plenty of access; the open question is his authority over budgets and programs. Some Blair critics believe that he hasn't used the power he has and has focused on the wrong priorities. Instead of fighting with the CIA over covert ops, he should have fought with the CIOs of the community over information sharing. Instead of expanding the DNI's 4,000-person staff, he should have pared it down to its essentials, reducing the number of decision makers and streamlining the analytic process. Blair's staff would disagree; they say that budget authority is but one ingredient. The other is the full backing of the president. And there is a perception in Blair's inner circle that the White House hasn't always been there for Denny Blair.
When the National Security Agency began its "Stellar Wind" domestic eavesdropping programs, perhaps the most tragic legacy of that decision was the shame that many analysts at NSA felt upon the program's disclosure. These analysts had spent their entire lives working off the assumption that the NSA does not spy on Americans. That spying on Americans is wrong. When the NSA began to spy on Americans, however carefully they did it, it would not be irresponsible to say that a large number of the people who do their jobs at NSA very well began to question whether their job was worth doing. This is not to say that the policymakers who felt compelled to create the program were wrong. It is to say simply that policies have endogenous consequences as well.
Footnotes. Numbers. Detailed proposals. The Donald’s economic address at an aluminum factory in Pennsylvania had it all.
Donald Trump must have hired some researchers.
The famously off-the-cuff orator delivered a surprisingly specific speech on trade, making seven detailed policy pledges while predicting that Hillary Clinton, if elected, would tweak and then sign the enormous Pacific trade pact she now opposes as a candidate for president.
Trump’s address to workers at a Pennsylvania aluminum factory continued his recent effort to lift both the tone and substance of his speeches. But it marked an even bigger departure in its sheer wonkiness.

First, his campaign sent out the prepared remarks with 128 footnotes. And in delivering the speech from a teleprompter, Trump delved into such granular policy detail that he referenced specific sections of decades-old trade laws and vowed to invoke “Article 2205” of the North American Free Trade Agreement. Doing so, he said, would withdraw the U.S. from NAFTA if its trading partners don’t agree to renegotiate the Clinton-era accord.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
Fears of civilization-wide idleness are based too much on the downsides of being unemployed in a society premised on the concept of employment.
People have speculated for centuries about a future without work, and today is no different, with academics, writers, and activists once again warning that technology is replacing human workers. Some imagine that the coming work-free world will be defined by inequality: A few wealthy people will own all the capital, and the masses will struggle in an impoverished wasteland.
A different, less paranoid, and not mutually exclusive prediction holds that the future will be a wasteland of a different sort, one characterized by purposelessness: Without jobs to give their lives meaning, people will simply become lazy and depressed. Indeed, today’s unemployed don’t seem to be having a great time. One Gallup poll found that 20 percent of Americans who have been unemployed for at least a year report having depression, double the rate for working Americans. Also, some research suggests that a shortage of well-paid jobs explains rising rates of mortality, mental-health problems, and addiction among poorly educated, middle-aged people. Another study shows that people are often happier at work than in their free time. Perhaps this is why many worry about the agonizing dullness of a jobless future.
It’s the cloudless map’s first major makeover since 2013.
More than 1 billion people use Google Maps every month, making it possibly the most popular atlas ever created. On Monday, it gets a makeover, and its many users will see something different when they examine the planet’s forests, fields, seas, and cities.
Google has added 700 trillion pixels of new data to its service. The new map, which activates this week for all users of Google Maps and Google Earth, consists of orbital imagery that is newer, more detailed, and of higher contrast than the previous version.
Most importantly, this new map contains fewer clouds than before—only the second time Google has unveiled a “cloudless” map. Google had not updated its low- and medium-resolution satellite map in three years.
Their degrees may help them secure entry-level jobs, but to advance in their careers, they’ll need much more than technical skills.
American undergraduates are flocking to business programs, and finding plenty of entry-level opportunities. But when businesses go hunting for CEOs or managers, “they will say, a couple of decades out, that I’m looking for a liberal arts grad,” said Judy Samuelson, executive director of the Aspen Institute’s Business and Society Program.
That presents a growing challenge to colleges and universities. Students are clamoring for degrees that will help them secure jobs in a shifting economy, but to succeed in the long term, they’ll require an education that allows them to grow, adapt, and contribute as citizens—and to build successful careers. That’s why many schools are shaking up their curricula to ensure that undergraduate business majors receive something they may not even know they need—a rigorous liberal-arts education.
At least 36 people were killed in an attack Tuesday at Ataturk airport, one of the busiest in Europe.
Here’s what we know:
—Explosions and gunfire were reported Tuesday night at Istanbul’s Ataturk International Airport, one of the busiest in Europe. Turkey’s prime minister, Binali Yildirim, said three attackers opened fire at the airport’s international terminal and detonated explosives, blowing themselves up. Officials suspect the Islamic State was behind the attack.
—At least 36 people were killed and 147 wounded, the prime minister said. Photos from the scene showed bloodied bodies and debris on the pavement outside the terminal.
—We’re live-blogging what’s happening, and you can read how it unfolded below. All updates are in Eastern Standard Time (GMT -5).
Witnesses described the attack and the chaos that followed to reporters in Istanbul. From the AP:
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
The centuries-old ailment was on the brink of elimination before budget cuts helped resurrect it.
In recent months, newspapers around the country have published stories that sound like they could have been written 100 years ago. Indiana’s syphilis cases skyrocketed by 70 percent in a single year. Lubbock County, Texas, was under a “syphilis alert.” Various counties face shortages of the medication used to treat syphilitic pregnant women.
But the headlines are very much modern—and urgent. Syphilis is back, public-health experts say.
For many years, syphilis was considered a practically ancient ailment—a “Great Pox” that, like tuberculosis or polio, Americans just don’t get anymore. There were just 6,000 cases of primary and secondary syphilis in 2000, and the CDC briefly thought the disease’s total elimination was within reach.
There are two basic modes of judgment: criticism and praise. The former consists of identifying a subject’s flaws; the latter, of noting its merits.
In most settings, criticism tends to dominate. For any idea or book or movie or what have you, the question that people discuss is what’s wrong with it, why it didn’t live up to expectations. Often, one gets the feeling that the criticism isn’t dispensed in an effort to engage with the work but as a demonstration of the critic’s smarts, the implicit argument being that he or she is sharper and more discerning than the work’s creator.
Often, the greater intellectual challenge—as a reader, as a viewer, and as a manager—is to recognize when something is truly great.
American society increasingly mistakes intelligence for human worth.
As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”
The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.