We hear a lot about energy research and development. Perhaps that's because it's the one sort of policy that Republicans and Democrats generally agree on. But there's a different kind of research that I'd like to see get a lot more attention and funding. I'm talking about research into what various kinds of energy policies actually *do* to shape the technical possibilities open to humanity.
In my time researching energy, I've found that most of the people who actually care about where we get our energy from have committed to an energy source, be it oil, gas, traditional nuclear, wind, solar, geothermal, or thorium. Then they go looking for policies that would benefit their technology. I've also run into a lot of people who believe in inexorable laws of change in energy, whether that's decarbonization or the inevitable rise of natural gas or nuclear power. And I've run into a lot of energy experts who believe in a fairly simple relationship between research money going in and technologies coming out.
Unfortunately, none of these three groups of people is likely to produce very good energy policy. To put it in more mainstream terms, we've got a lot of energy pundits and very few energy Nate Silvers, who put reality (i.e. good data) ahead of ideology and intuition. Don't get me wrong: everyone in energy loves them some data, but few people are interested in using it the way Silver does.
Let me introduce you to a scholar who I think embodies the kind of research we need more of. His name is Gregory Nemet. He did his PhD at Berkeley and now teaches at the University of Wisconsin, Madison. I first discovered his work through a 2006 paper in *Energy Policy*, "Beyond the Learning Curve: factors influencing cost reductions in photovoltaics." Now, you're probably familiar with the neat story that learning curves tell. They say that as you do something, you get better at it, and because it's a curve, the assumption is that this happens at a fairly consistent (and therefore predictable) rate. This is part of the rationale for supporting photovoltaics, after all. They've gotten so much cheaper (orders of magnitude) over the last few decades that proponents suggest they're inevitably going to get cheaper than grid electricity sometime in the near future.
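If you haven't seen it written out, the textbook learning curve is just a power law: pick a learning rate, and cost falls by that fraction every time cumulative production doubles. Here's a minimal sketch in Python; every number in it is mine, chosen for illustration, not anything from Nemet's paper.

```python
import math

def unit_cost(cum_production, c0, x0, learning_rate):
    """Textbook experience curve: cost falls by `learning_rate`
    (e.g., 0.20 = 20%) with every doubling of cumulative production."""
    b = -math.log2(1 - learning_rate)  # progress exponent
    return c0 * (cum_production / x0) ** (-b)

# Illustrative only: $5/W at 100 MW of cumulative production,
# extrapolated out to 100 GW at a 20% learning rate.
print(unit_cost(100_000, c0=5.0, x0=100, learning_rate=0.20))  # ~0.54 ($/W)
```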
But this is just too simple a model for the way the world works. Nemet first demolishes the idea that we can bank on simple learning-by-experience models that show consistent cost reductions as the amount of solar produced increases. These analyses are super sensitive to small changes in the learning rate or the growth of the market (the number of megawatts of PV production in a given time span). And that's not even taking into account the discontinuities that we know occur in technological development. He raises several other powerful objections based on the literature. All in all, it's a pretty amazing takedown of a common method of analysis.
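To make that sensitivity concrete, here's a rough extension of the sketch above, again with made-up numbers: invert the curve and ask how much cumulative production it takes to hit a target cost. A four-point swing in the assumed learning rate changes the required deployment by roughly a factor of three.

```python
import math

def production_to_reach(target_cost, c0, x0, learning_rate):
    """Invert the experience curve: cumulative production needed
    before unit cost falls to `target_cost`."""
    b = -math.log2(1 - learning_rate)
    return x0 * (c0 / target_cost) ** (1 / b)

# Illustrative: start at $5/W with 100 MW installed, aim for $1/W.
for lr in (0.18, 0.20, 0.22):
    mw = production_to_reach(1.0, c0=5.0, x0=100, learning_rate=lr)
    print(f"learning rate {lr:.0%}: target reached at ~{mw:,.0f} MW")
# 18% -> ~27,600 MW; 20% -> ~14,800 MW; 22% -> ~8,900 MW
```

That's the trouble with extrapolating: a difference well within the error bars of any estimated learning rate separates a modest subsidy program from a massive one.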
But he doesn't stop there. He then uses the history of photovoltaics (from 1976 to 2001) to demonstrate a new way of modeling cost reductions in technology. It's hard to gloss the whole thing, but suffice it to say that his model allows him to identify which of the following factors were important for different periods of the technology's evolution in driving down cost: plant size, scaling factor, module efficiency, silicon cost, wafer size, silicon use, yield, polycrystal share, polycrystal cost.
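I can't reproduce the model here, but the flavor of the exercise is decomposition: write cost as a function of those factors, then vary one factor at a time to see how much of the observed cost decline each one accounts for. Here's a toy version with a made-up cost function and made-up numbers; nothing in it comes from the paper.

```python
def module_cost(silicon_cost, silicon_use, efficiency, yield_rate):
    """Toy stand-in for module cost: material cost per watt,
    inflated by production losses. Nemet's actual model is richer."""
    return silicon_cost * silicon_use / (efficiency * yield_rate)

# Hypothetical start-of-period and end-of-period factor values.
start = dict(silicon_cost=0.30, silicon_use=16.0, efficiency=0.09, yield_rate=0.80)
end   = dict(silicon_cost=0.10, silicon_use=10.0, efficiency=0.14, yield_rate=0.95)

base = module_cost(**start)
total_drop = base - module_cost(**end)
for factor in start:
    changed = dict(start, **{factor: end[factor]})  # change one factor only
    share = (base - module_cost(**changed)) / total_drop
    print(f"{factor}: ~{share:.0%} of the total cost decline")
# The shares sum to more than 100% because the factors interact,
# which is one reason a real model beats eyeballing a single curve.
```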
What kind of policy impact might that have? Well, if increasing the size of photovoltaic plants appears to lead to large cost reductions, then it might be a good idea to have a loan program that helps get these sorts of plants built, much like the one that produced many good outcomes along with a few duds like Solyndra.
But there's a deeper reason to support this kind of research. When people think of technological development as somehow magically proceeding apace, it makes it seem *as if* people's personal and civic interventions don't matter. But of course they do! It's just that when you draw one curve to stick in your PowerPoint, all the decisions that affect the factors above get submerged into a false law of simplistic cost reductions.
Since 2006, Nemet has kept working on important research projects. He's done more work on modeling the effectiveness of differing forms of government support, as in this paper on whether subsidies or R&D spending are more likely to bring organic solar cells to market. (In this case, the answer is R&D.)
His most recent work might be his most significant, though I think his current research program is not yet complete. In various ways, he's been trying to get at a very basic question: do demand-side subsidies work to stimulate technological development? Or might better policies exist? This is more than a theoretical question, given the various tax credits both here and abroad that appear to have pushed low-carbon technologies forward. Note the way I framed his project, which I think he would agree with. This is not about whether Nemet believes government should be subsidizing energy projects or not. This is not about whether solar or wind or nuclear *should* be the future of our energy system. No, this is something more basic and more difficult to answer: how much can subsidies enhance the learning (and therefore the cost reductions) that an industry like wind actually achieves?
If you're curious what his final analysis is, here's the conclusion from an excellent forthcoming paper in the *Journal of Policy Analysis and Management*. You probably won't be surprised to learn that he makes a nuanced judgment:
The magnitude of public funds at stake adds some urgency to improving understanding of the extent and characteristics of knowledge spillovers from learning by doing. The main results here imply that policies that enhance demand are necessary to generate sufficient knowledge from experience. Other insights from this case--especially depreciation and diminishing returns--heighten the value of policy instruments with performance-oriented mechanisms and longevity. That experience-derived knowledge appears to be so ephemeral suggests that we should also consider explicit support for codification and transfer of what is learned.
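That "ephemeral" finding is worth dwelling on, because it breaks the simple cumulative-production story entirely: if knowledge gained from experience depreciates, what drives costs down isn't total deployment but a decaying stock of recent deployment. Here's a minimal sketch of the idea; the 80 percent retention rate is my placeholder, not Nemet's estimate.

```python
def knowledge_stock(annual_output, retention=0.80):
    """Depreciated experience: each year, 20% of the accumulated
    knowledge stock evaporates (placeholder rate), so only sustained
    deployment keeps the stock, and hence the learning, alive."""
    stock = 0.0
    for output in annual_output:
        stock = retention * stock + output
    return stock

steady    = [100] * 10            # sustained policy support
boom_bust = [500, 500] + [0] * 8  # same 1,000 MW total, front-loaded

print(round(knowledge_stock(steady)))     # ~446: most knowledge retained
print(round(knowledge_stock(boom_bust)))  # ~151: most knowledge has decayed
```

That's the intuition behind the paper's plug for instruments with "longevity": a boom-and-bust subsidy buys the same megawatts but far less durable knowledge.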