He may still be slogging it out in the Republican primary, but he used a speech in Chicago to try to shape his general-election message.
Mitt Romney hasn't yet made it out of the Republican primary, but in a speech on the economy at the University of Chicago Monday, he didn't mention Rick Santorum or Newt Gingrich. Instead, Romney did his best impression of a Republican presidential nominee, contending that President Obama has sought to erode Americans' "economic freedom."
The speech didn't roll out any new policy proposals or introduce any broad new themes for Romney, but it offered a preview of how he'll approach his tricky general-election challenge -- arguing that the president is egregiously mishandling the economy even as, for the moment at least, the economy is improving.
Earlier in the day Monday, Romney acknowledged as much, telling a crowd in Springfield, Ill.: "I believe the economy is coming back, by the way. ... The problem is this [recession] has been deeper than it needed to be and a slower recovery than it should have been, by virtue of the policies of this president."
In the Chicago speech, Romney pointed to the "weak recovery" as "proof" that the current administration has squelched growth. "This administration thinks our economy is struggling because the stimulus was too small," he said. "The truth is we're struggling because our government is too big."
Romney's speech had a highbrow cast, beginning as it did with a hoary anecdote about Milton Friedman, whom he referred to chummily by his first name. (The story: Watching workers on a government project in Asia building a canal with shovels, Friedman wondered why they didn't use machines; he was told it was a jobs program. "If it's jobs you want, then you should give these workers spoons, not shovels," he supposedly said. Though Romney used the story to demonstrate that "government does not create prosperity," this is not necessarily an argument against government's ability to create jobs -- nor is it clear that Friedman is the true source of this well-worn economic anecdote.)
Romney proceeded to cite the Harvard historian David Landes' The Wealth and Poverty of Nations -- a work that seeks to explain how Western Europe, a region whose oppressive socialism Romney routinely laments, became an economic miracle.
In this speech, though, Romney didn't use that particular bit of red meat, another potential sign he's moving on from the GOP base-baiting of the primary. He used Landes' theory that "culture" is the fundamental underpinning of economic success to argue that America's culture of economic freedom is what "drives our economic vitality." Those who would raise taxes or expand burdensome regulation, he said, threaten that fundamental freedom.
Taxes and regulation are bad -- a pretty boilerplate Republican notion -- and Romney didn't go into many specifics about his own plans. Instead, he related folksy anecdotes of suffering Americans: a guitar-amp maker in St. Louis who claims the government skims 65 percent of his business's profits; a couple in Idaho whom the EPA wouldn't allow to build a home on their residential property.
Romney quoted Obama's own words, citing the president's speech last week in which he said that Americans "are inventors, we are builders, we are makers of things, we are Thomas Edison, we are the Wright Brothers, we are Bill Gates, we are Steve Jobs."
Actually, Romney claimed, "the reality is that under President Obama's administration, these pioneers would have found it much, much more difficult, if not impossible, to innovate, invent and create." Regulators, he said, "would have shut down the Wright Brothers for their dust pollution," while "the government would have banned Thomas Edison's light bulb -- oh yeah, they just did." (In fact, legislation increasing light-bulb efficiency standards passed under George W. Bush and didn't ban incandescent bulbs.)
Curiously, Romney didn't mention gas prices, which many Republicans see as Obama's biggest economic vulnerability at the moment. He took three questions. To a query about his proposed tax cuts increasing the deficit, as independent analysts have claimed they would, he argued that he would make up the difference by cutting spending and increasing economic growth. To a question about urban poverty, he vowed to send federal welfare money to states and localities to administer instead, then turned to education, which he said he would fix in part by paying teachers more.
To a question about youth concerns, Romney got a bit flustered. "I don't see how a young American could vote for a Democrat. I apologize for being so offensive in saying that," he said, as if abashed that he just couldn't help being so partisan. Democrats, he said, are threatening future generations' prosperity by piling up debt and undermining the long-term sustainability of entitlement programs.
Not so long ago, it was Obama who was in the unenviable position of arguing a counterfactual: Sure, the economy is bad, but it could have been so much worse! Trust me! Now, it's Romney who is in that position: Sure, the economy is OK, but it could have been so much better! Either way, it's a tough argument to make.
For Romney, the argument is even tougher when he's still taking incoming fire from his own side. In advance of Tuesday's Illinois primary, Santorum was stepping up his attacks on Romney from the right. But as Romney continues his grim slog toward the nomination -- he declined to mention it, but his introducer in Chicago read the tally of his delegate lead over his rivals -- he seems to be figuring that the best way to get the Republican Party to see him as its standard-bearer is to start acting like he already is.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
I traveled to every country on earth. In some cases, the adventure started before I could get there.
Last summer, my Royal Air Maroc flight from Casablanca landed at Malabo International Airport in Equatorial Guinea, and I completed a 50-year mission: I had officially, and legally, visited every recognized country on earth.
This means 196 countries: the 193 members of the United Nations, plus Taiwan, Vatican City, and Kosovo, which are not members but are, to varying degrees, recognized as independent countries by other international actors.
In five decades of traveling, I’ve crossed countries by rickshaw, pedicab, bus, car, minivan, and bush taxi; a handful by train (Italy, Switzerland, Moldova, Belarus, Ukraine, Romania, and Greece); two by riverboat (Gabon and Germany); Norway by coastal steamer; Gambia and the Amazonian parts of Peru and Ecuador by motorized canoe; and half of Burma by motor scooter. I rode completely around Jamaica on a motorcycle and Nauru on a bicycle. I’ve also crossed three small countries on foot (Vatican City, San Marino, and Liechtenstein), and parts of others by horse, camel, elephant, llama, and donkey. I confess that I have not visited every one of the 7,107 islands in the Philippine archipelago or most of the more than 17,000 islands constituting Indonesia, but I’ve made my share of risky voyages on the rickety inter-island rustbuckets you read about in the back pages of the Times under headlines like “Ship Sinks in Sulu Sea, 400 Presumed Lost.”
Heather Armstrong’s Dooce once drew millions of readers. Her blog’s semi-retirement speaks to the challenges of earning money as an individual blogger today.
The success story of Dooce.com was once blogger lore, told and re-told in playgroups and Meetups—anywhere hyper-verbal people with WordPress accounts gathered. “It happened for that Dooce lady,” they would say. “It could happen for your blog, too.”
Dooce has its origin in the late 1990s, when a young lapsed Mormon named Heather Armstrong taught herself HTML and moved to Los Angeles. She got a job in web design and began blogging about her life on her personal site, Dooce.com.
The site’s name evolved out of her friends’ AOL Instant Messenger slang for dude, or its more incredulous cousin, “doooood!” About a year later, Armstrong was fired for writing about her co-workers on the site—an experience that, for a good portion of the ’aughts, came to be known as “getting dooced.” She eloped with her now ex-husband, Jon, moved to Salt Lake City, and eventually started blogging full time again.
ISIS did not merely blast apart old stones—it attacked the very foundations of pluralistic society.
If the ruins of Palmyra could speak, they would marvel at our shock. After all, they have been sacked before. In their mute and shattered eloquence, they spoke for centuries not only about the cultures that built them but also about the cultures that destroyed them—about the fragility of civilization itself, even when it is incarnated in stone. No designation of sanctity, by God or by UNESCO, suffices to protect the past. The past is helpless. Instead these ruins, all ruins, have had the effect of lifting the past out of history and into time. They carry the spectator away from facts and toward reveries.
In the 18th century, after the publication in London of The Ruins of Palmyra, a pioneering volume of etchings by Robert Wood, who had traveled to the Syrian desert with the rather colorful James Dawkins, a fellow antiquarian and politician, the desolation of Palmyra became a recurring symbol for ephemerality and the vanity of all human endeavors. “It is the natural and common fate of cities,” Wood dryly remarked in one of the essays in his book, “to have their memory longer preserved than their ruins.” Wood’s beautiful and meticulous prints served as inspirations for paintings, and it was in response to one of those paintings that Diderot wrote some famous pages in his great Salons of 1767: “The ideas ruins evoke in me are grand. Everything comes to nothing, everything perishes, everything passes, only the world remains, only time endures. ... Wherever I cast my glance, the objects surrounding me announce death and compel my resignation to what awaits me. What is my ephemeral existence in comparison with that of a rock being worn down, of a valley being formed, of a forest that’s dying, of these deteriorating masses suspended above my head? I see the marble of tombs crumble into powder and I don’t want to die!”
Learning to program involves a lot of Googling, logic, and trial-and-error—but almost nothing beyond fourth-grade arithmetic.
I’m not in favor of anyone learning to code unless she really wants to. I believe you should follow your bliss, career-wise, because most of the things you’d buy with all the money you’d make as a programmer won’t make you happy. Also, if your only reason for learning to code is that you want to be a journalist and you think it’s the only way to break into the field, know that it isn’t.
I’m all for people not becoming coders, in other words—as long as they make that decision for the right reasons. “I’m bad at math” is not the right reason.
Math has very little to do with coding, especially at the early stages. In fact, I’m not even sure why people conflate the two. (Maybe it has to do with the fact that both fields are male-dominated.)
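The article stops short of showing any actual code, but a sketch of the kind of first-week exercise it has in mind makes the point concrete. The functions below are my own hypothetical examples, not anything from the piece; the only math they require is counting and a remainder check.

```python
# Typical beginner exercises: lots of logic and trial-and-error, almost no math.

def word_counts(text):
    """Count how often each word appears in a string."""
    counts = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1  # the "math" here is adding 1
    return counts

def fizzbuzz(n):
    """The classic screening drill: nothing harder than checking remainders."""
    for i in range(1, n + 1):
        if i % 15 == 0:
            print("FizzBuzz")
        elif i % 3 == 0:
            print("Fizz")
        elif i % 5 == 0:
            print("Buzz")
        else:
            print(i)

print(word_counts("the cat sat on the mat"))  # {'the': 2, 'cat': 1, 'sat': 1, 'on': 1, 'mat': 1}
fizzbuzz(15)
```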
In continuing to tinker with the universe she built eight years after it ended, J.K. Rowling might be falling into the same trap as Star Wars’s George Lucas.
September 1, 2015, marked a curious footnote in Harry Potter lore: According to the series’s elaborate timeline, rarely referenced in the books themselves, it was the day James S. Potter, Harry’s eldest son, started school at Hogwarts. It’s not an event directly written about in the books, nor one of particular importance, but their creator, J.K. Rowling, dutifully took to Twitter to announce the marginal details: that James was sorted into House Gryffindor, just like his father, to the disappointment of Teddy Lupin, Harry’s godson, apparently a Hufflepuff.
It’s not earth-shattering information that Harry’s kid would end up in the same house his father was in, and the Harry Potter series’s insistence on sorting all of its characters into four broad personality quadrants largely based on their family names has always struggled to stand up to scrutiny. Still, Rowling’s tweet prompted much garment-rending among the books’ devoted fans. Can a tweet really amount to a piece of canonical information for a book? There isn’t much harm in Rowling providing these little embellishments years after her books were published, but even idle tinkering can be a dangerous path to take, with the obvious example being the insistent tweaks wrought by George Lucas on his Star Wars series.
It’s not just Trump: With Ben Carson and Carly Fiorina on the rise, Republicans are loving outsiders and shunning politicians.
For the first time in a long time, Donald Trump isn’t the most interesting story in the 2016 presidential race. That's partly because his dominance in the Republican polls, while still surprising, is no longer novel and has by now been well explored and explained; it’s also because what’s going on with the rest of the GOP field is far more interesting.
The man who made computers personal was a genius and a jerk. A new documentary wonders whether his legacy can accommodate both realities.
An iPhone is a machine much like any other: motherboard, modem, microphone, microchip, battery, wire of gold and silver and copper twisting and snaking, the whole assembly arranged under a piece of glass whose surface—coated with an oxide of indium and tin to make it electrically conductive—sparks to life at the touch of a warm-blooded finger. But an iPhone, too, is much more than a machine. The neat ecosystem that hums under its heat-activated glass holds grocery lists and photos and games and jokes and news and books and music and secrets and the voices of loved ones and, quite possibly, every text you’ve ever exchanged with your best friend. Thought, memory, empathy, the stuff we sometimes shorthand as “the soul”: There it all is, zapping through metal whose curves and coils were designed to be held in a human hand.
Some Republican candidates are promoting a policy change that would hurt workers by disguising it with a pleasant-sounding phrase.
Americans like their Social Security benefits quite a bit: They oppose cuts to them by a margin of two to one. Even Millennials, who won’t be seeing benefits anytime soon, feel protective of Social Security, according to a poll from the Pew Research Center.
One way to effectively cut Social Security benefits is to raise the age at which they kick in. And yet, when asked specifically about raising the retirement age, Americans are divided.
Perhaps the confusion arises because “raising the age of retirement” sounds like a nice jobs program for older Americans, or an end to forced retirement. I sympathize with that sentiment: Anyone who wants to retire later and keep working into old age should be able to hold a job. But that’s not what raising the retirement age would entail. The fact is, raising the Social Security retirement age represents a reduction in benefits: Because monthly payments are scaled to how early or late a person claims relative to the official retirement age, pushing that age back means a smaller check for anyone who claims at a given age, and therefore less money paid out over a lifetime.
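A rough sketch with purely hypothetical numbers can make the arithmetic concrete. The figures below (a $1,500 monthly check at the official retirement age, benefits trimmed by roughly 6.7 percent for each year claimed early and raised about 8 percent for each year of delay, a lifespan to 85) are illustrative assumptions, not the actual Social Security formula, but they show the direction of the effect.

```python
# Illustrative sketch only: hypothetical numbers, not the actual SSA formula.

def monthly_benefit(claim_age, retirement_age, full_benefit=1500.0):
    """Rough rule of thumb: the check shrinks ~6.7% for each year a person
    claims before the official retirement age, and grows ~8% for each
    year of delay after it."""
    years_from_cutoff = claim_age - retirement_age
    if years_from_cutoff < 0:
        return full_benefit * (1 + 0.067 * years_from_cutoff)  # early claim: smaller check
    return full_benefit * (1 + 0.08 * years_from_cutoff)       # delayed claim: bigger check

def lifetime_payout(claim_age, retirement_age, dies_at=85):
    """Total benefits collected from the claiming age until death."""
    months_collecting = (dies_at - claim_age) * 12
    return monthly_benefit(claim_age, retirement_age) * months_collecting

# The same person, claiming at 66 and living to 85, under two retirement ages:
print(lifetime_payout(claim_age=66, retirement_age=66))  # about $342,000
print(lifetime_payout(claim_age=66, retirement_age=68))  # about $296,000: same life, less money
```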