We hear a lot about energy research and development. Perhaps that's because it's the one sort of policy that Republicans and Democrats generally agree on. But there's a different kind of research that I'd like to see get a lot more attention and funding. I'm talking about research into what various kinds of energy policies actually *do* to shape the technical possibilities open to humanity.
In my time researching energy, most of the people who actually care about where we get our energy from have committed to an energy source, be it oil, gas, traditional nuclear, wind, solar, geothermal, or thorium. Then, they go looking for policies that would benefit their technology. I've also run into a lot of people who believe in inexorable laws of change in energy, whether that's decarbonization or the inevitable rise of natural gas or nuclear power. And I've run into a lot of energy experts who believe in a fairly simple relationship between research money going in and technologies coming out.
Unfortunately, none of these three groups of people is likely to produce very good energy policy. To put it in more mainstream terms, we've got a lot of energy pundits and very few energy Nate Silvers, who put reality (i.e. good data) ahead of ideology and intuition. Don't get me wrong: everyone in energy loves them some data, but few people are interested in using it the way Silver does.
Let me introduce you to a scholar who I think embodies the kind of research we need more of. His name is Gregory Nemet. He did his PhD at Berkeley and now teaches at the University of Wisconsin–Madison. I first discovered his work through a 2006 paper in Energy Policy, "Beyond the Learning Curve: factors influencing cost reductions in photovoltaics." Now, you're probably familiar with the neat story that learning curves tell. They say that as you do something, you get better at it, and because it's a curve, the assumption is that this happens at a fairly consistent (and therefore predictable) rate. This is part of the rationale for supporting photovoltaics, after all. They've gotten so much cheaper (orders of magnitude) over the last few decades that proponents suggest they're inevitably going to get cheaper than grid electricity some time in the near future.
But this is just too simple a model for the way the world works. Nemet first demolishes the idea that we can bank on simple learning by experience models that show consistent cost reductions as the amount of solar produced increases. These analyses are super sensitive to small changes in the learning rate or the growth of the market (the number of megawatts of PV production in a given time span). And that's not even taking into account the discontinuities that we know occur in technological development. He raises several other powerful objections based on the literature. All in all, it's a pretty amazing takedown of a common method of analysis.
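To see why these extrapolations are so fragile, here's a minimal sketch (not Nemet's model, and with hypothetical numbers chosen purely for illustration) of the standard learning-curve formula, where unit cost falls by a fixed "learning rate" with every doubling of cumulative production:

```python
# A toy illustration of learning-curve sensitivity. In the standard model,
# unit cost falls by a fixed learning rate LR per doubling of cumulative
# production q:  cost(q) = cost_0 * (q / q_0) ** -b, with LR = 1 - 2**-b.
import math

def projected_cost(cost_0, q_0, q, learning_rate):
    """Project unit cost at cumulative production q under a simple learning curve."""
    b = -math.log2(1 - learning_rate)  # experience exponent implied by LR
    return cost_0 * (q / q_0) ** -b

# Hypothetical starting point: $5/W at 1 GW cumulative production,
# extrapolated out to 100 GW under two nearby learning rates.
for lr in (0.18, 0.22):
    cost = projected_cost(5.0, 1.0, 100.0, lr)
    print(f"learning rate {lr:.0%}: projected ${cost:.2f}/W at 100 GW")
```

Shifting the assumed learning rate by just four percentage points moves the projected cost at 100 GW by roughly 40 percent, which is the kind of sensitivity Nemet is pointing to: small, plausible disagreements about the inputs produce wildly different forecasts.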
But he doesn't stop there. He then uses the history of photovoltaics (from 1976 to 2001) to demonstrate a new way of modeling cost reductions in technology. It's hard to gloss the whole thing, but suffice it to say that his model allows him to identify which of the following factors were important, in different periods of the technology's evolution, for driving down cost: plant size, scaling factor, module efficiency, silicon cost, wafer size, silicon use, yield, polycrystal share, and polycrystal cost.
What kind of policy impact might that have? Well, if increasing the size of photovoltaic plants appears to lead to large cost reductions, then it might be a good idea to have a loan program that helps get these sorts of plants built, much like the one that produced many good outcomes along with a few duds like Solyndra.
But there's a deeper reason to support this kind of research. When people think of technological development as somehow magically proceeding apace, it makes it seem *as if* people's personal and civic interventions don't matter. But of course they do! It's just that when you draw one curve to stick in your PowerPoint, all the decisions that affect the factors above get submerged into a false law of simplistic cost reductions.
Since 2006, Nemet has kept working on important research projects. He's done more work on trying to model the effectiveness of differing government support models, as in this paper on whether subsidies or R&D spending is more likely to bring organic solar cells to market. (In this case, the answer is R&D.)
His most recent work might be his most significant, though I think his current research program is not yet complete. In various ways, he's been trying to get at a very basic question: do demand-side subsidies work to stimulate technological development? Or might better policies exist? This is more than a theoretical question, given the various tax credits both here and abroad that appear to have pushed low-carbon technologies forward. Note the way I framed his project, which I think he would agree with. This is not about whether Nemet believes government should be subsidizing energy projects or not. This is not about whether solar or wind or nuclear *should* be the future of our energy system. No, this is something more basic and more difficult to answer: how much can subsidies enhance the learning (and therefore the cost reductions) that an industry like wind actually achieves?
If you're curious what his final analysis is, here's the conclusion from an excellent forthcoming paper in the Journal of Policy Analysis and Management. You probably won't be surprised to learn that he makes a nuanced judgment:
The magnitude of public funds at stake add some urgency to improving understanding of the extent and characteristics of knowledge spillovers from learning by doing. The main results here imply that policies that enhance demand are necessary to generate sufficient knowledge from experience. Other insights from this case--especially depreciation and diminishing returns--heighten the value of policy instruments with performance-oriented mechanisms and longevity. That experience-derived knowledge appears to be so ephemeral suggests that we should also consider explicit support for codification and transfer of what is learned.