Worst Case Scenarios, Ctd

Steinglass proffers a unified theory of twenty-first-century catastrophes:

Leonhardt concentrates on the unfortunate human tendency to discount the highly unlikely. This is certainly a factor, but as advice, it’s only partially useful. If the lesson of the catastrophes of the noughties is to pay attention to tail-end risk, then we should all be running around building nuclear fallout shelters and working out deflection strategies for massive asteroid strikes. And that’s not going to happen. (Though in the case of climate change, one of Leonhardt’s examples, it is useful: we should be paying more attention to the risk that the global temperature rise by 2100 will be near the catastrophic 6-degree-Celsius high-end estimate, not the merely awful 2-degree median estimate.)

But I don’t think that is the main lesson. The main lesson is simpler and more concrete: government regulations need to be more restrictive; regulators need to be more aggressive, better paid, and more powerful; and they need to stop people and corporations more often from doing things that may be profitable but pose unacceptable risks to the public. We had this theory for a while that economic self-interest would prove a sufficient disincentive to foolish risk-taking. But now the Gulf of Mexico is on fire, so I’m afraid we need to go back to the old-fashioned system with the rules and the monitors carrying sticks. Sorry.

This sounds good, but what do you do when regulators have sufficiently large sticks yet remain unwilling to swing them?