A trade group's set of best practices is a start, but unmanned vehicles -- like the car and airplane before them -- will require revising old laws.
The last several months were supposed to be good times for the makers of unmanned aerial systems, popularly known as "drones." Business is booming and theirs is one of the few parts of the aerospace industry not shaking in its boots at impending defense budget cuts. And the $2.3 million spent on lobbying Congress finally seems to have paid off. In February, Congress ordered the FAA to figure out an action plan to open up the national air space to unmanned systems (currently, only those with special agreements, such as for Border Patrol, are allowed) by 2015, as well as set up six experimentation locales.
This will put an already strong business on steroids, akin to what the development of the Internet did for the computer industry. Rather than just selling to the Pentagon, the new clients might range from the more than 21,000 state and local law enforcement departments to farmers, journalists, and more, as they find new and innovative uses for unmanned systems, from overhead surveillance to crop-dusting.
The problem is that what appeared to be good news for the industry instead turned into a public-relations nightmare. Both left and right came together in condemnation and worry over the implications of this move, especially on privacy rights. Perhaps the most extreme example was when Charles Krauthammer, the right-wing columnist and Fox News commentator, reacted to the news by saying, "I'm going to go hard left on you here, I'm going ACLU" and calling for an absolute ban: "I don't want regulations, I don't want restrictions, I want a ban on this." He then swung back to the traditional right, adding, "The first guy who uses a Second Amendment weapon to bring a drone down that's been hovering over his house is going to be a folk hero in this country."
Even worse for the industry, policymakers began to take seriously some of the stories about drones run amok that went viral on the Internet. Some of those stories turned out to be false, such as when Nebraska's congressional delegation complained to the EPA about federal government drones illegally "spying" on farmers, a story with no factual basis. Soon, legislators at the state and federal levels were racing to submit bills to roll back the planned domestic drone boom.
Faced with the backlash, the trade group for the industry, the Association for Unmanned Vehicle Systems International (AUVSI), which had originally taken credit for writing the exact language used in the FAA bill, tried to stem the bleeding with a classic move from the bad-press playbook. Last week, it issued an industry "code of conduct."
That move was a positive step, showing that the industry finally recognizes that drones hold great potential for the nation (and great potential for profits as well), but only if it successfully navigates the deep concerns the technology evokes. The code smartly acknowledged that "as with every revolutionary technology, there will be mishaps and abuses," but that the key to winning "public acceptance and trust" is openness and transparency. This would seem to be common sense, but it's a new turn for an industry that has largely been Pollyannaish about public concerns. Just a few years ago, a survey of AUVSI key stakeholders found that 60 percent believed there would be "no" social, ethical, or moral problems emerging from the advancement of unmanned systems.
In this change of course, the code of conduct took on many of the concerns circulating, grouping them into three core themes of Safety, Professionalism, and Respect. It laid out how the industry and users would "commit" to not operating drones "in a manner that presents undue risk to persons or property;" to planning for "all anticipated off-nominal events;" and to sharing such contingency plans with "all appropriate authorities." It made great sense and was widely reported.
The challenge for the robotics code of conduct is much the same as other industries' attempts at self-regulation, ranging from banking to the private military industry. It's a laudable start, but it doesn't change the underlying issues and concerns. Like such other would-be "codes of conduct," it lacks one key ingredient: consequences.