A trade group's set of best practices is a start, but unmanned vehicles -- like the car and airplane before them -- will require revising old laws.
The last several months were supposed to be good times for the makers of unmanned aerial systems, popularly known as "drones." Business is booming, and theirs is one of the few parts of the aerospace industry not shaking in its boots at impending defense budget cuts. And the $2.3 million spent on lobbying Congress finally seems to have paid off. In February, Congress ordered the FAA to come up with an action plan to open the national airspace to unmanned systems by 2015 (currently, only operators with special agreements, such as the Border Patrol, are allowed), as well as to set up six experimentation locales.
This will put an already strong business on steroids, akin to what the development of the Internet did for the computer industry. Rather than just selling to the Pentagon, the industry's new clients might range from the more than 21,000 state and local law enforcement departments to farmers, journalists, and beyond, as they find new and innovative uses for unmanned systems, from overhead surveillance to crop-dusting.
The problem is that what appeared to be good news for the industry instead turned into a public-relations nightmare. Both left and right came together in condemnation and worry over the implications of this move, especially on privacy rights. Perhaps the most extreme example was when Charles Krauthammer, the right-wing columnist and Fox News commentator, reacted to the news by saying, "I'm going to go hard left on you here, I'm going ACLU" and calling for an absolute ban: "I don't want regulations, I don't want restrictions, I want a ban on this." He then swung back to the traditional right, adding, "The first guy who uses a Second Amendment weapon to bring a drone down that's been hovering over his house is going to be a folk hero in this country."
Even worse for the industry, policymakers began to take seriously some of the stories about drones run amok that went viral on the Internet. Some of those stories turned out to be false, such as when Nebraska's congressional delegation complained to the EPA about federal government drones illegally "spying" on farmers -- a claim with no factual basis. Soon, legislators at the state and federal levels were racing to submit bills to roll back the planned domestic drone boom.
Faced with the backlash, the trade group for the industry, the Association for Unmanned Vehicle Systems International (AUVSI), which had originally taken credit for literally writing the exact language used in the FAA bill, tried to stem the bleeding with a classic move from the bad-press playbook. Last week, it issued an industry "code of conduct."
That move was a positive step, showing that the industry finally recognizes that drones have great potential for the nation (and great potential for profits as well), but only if it successfully navigates the deep concerns that the technology evokes. The code smartly discussed how "as with every revolutionary technology, there will be mishaps and abuses," but that the key to winning "public acceptance and trust" is openness and transparency. This would seem to be common sense, but it's a new turn for an industry that has largely been Pollyannaish about public concerns. Just a few years ago, a survey of AUVSI key stakeholders found that 60 percent believed that there would be "no" social, ethical, or moral problems to emerge from the advancement of unmanned systems.
Changing course, the code of conduct took on many of the concerns circulating, grouping them into three core themes of Safety, Professionalism, and Respect. It laid out how the industry and users would "commit" to not operating drones "in a manner that presents undue risk to persons or property"; to planning for "all anticipated off-nominal events"; and to sharing such contingency plans with "all appropriate authorities." It made great sense and was widely reported.
The challenge for the robotics code of conduct is much the same as other industries' attempts at self-regulation, ranging from banking to the private military industry. It's a laudable start, but it doesn't change the underlying issues and concerns. Like such other would-be "codes of conduct," it lacks one key ingredient: consequences.
As it stands now, the golfer who violates his country club's code of conduct risks stiffer punishment than a drone maker or user who violates the terms of this new code. Golfers might lose a point, or even be kicked out of the club, if they violate their agreement. The new robotics code doesn't include a single potential sanction, such as expelling violators from the trade group. Indeed, much of what it lays out is actually a restatement of responsibilities that the firms and users must already abide by, regardless of any code. For example, the code says that the firms "will comply with all federal, state and local laws." So, before the code, they could violate the law at will? Of course not. Saying one will follow the law is one of those things that sounds meaningful but is ultimately meaningless; it illustrates the importance of the law, not of the code.
Similarly, the code is quite vague on a variety of legitimate concerns. It says that "we will ensure that UAS [unmanned aircraft systems] are piloted by individuals who are properly trained and competent to operate the vehicle or its systems." Who will determine this, and what does "trained and competent" mean in a world where some believe drones should only be operated by rated pilots, even though new versions can be flown by teens using iPhone apps? Likewise, the code pledges to "respect the privacy of individuals," a bold statement that says nothing about what it would actually mean in practice. "Respect" could be anything from avoiding the monitoring of individuals without their express permission to showing them "respect" only in the public-relations sense.
Of course, these are thorny issues. Indeed, their very thorniness is why an industry self-regulatory code -- especially one that emerged in the context of bad press and was built around lowest-common-denominator agreement within a trade group -- would sensibly want to avoid them for now. But the irony is that resolving these problems is what actually matters to the industry's overall goal of "gaining public trust and acceptance." The same need for resolution goes for the pressing concerns that the code completely ignored. For example, despite purporting to cover "those who design, test and operate UAS," it avoids stating any specific intent or concern about those we'd rather not see involved in the field. What can we do not just to promote a powerful technology for good, but also to stop the illicit use by, or unintended transfer of the technology to, dangerous actors? Much like the technology, such worries are not science fiction. Everyone from terrorists to jewelry thieves to vigilante groups has already used UAS technology.
Likewise, while the code outlines how weather conditions and other potential causes of accidents are to be included in risk assessments, there is no mention in the Safety section of whether or how the industry might work to address hostile man-made threats, including criminal or adversarial efforts at UAS communications interference or hacking. Here again, this scenario is not science fiction; it was recently demonstrated in a test in Texas, where a university team hacked the navigation system of a drone. Part of the absence may be explained by another omission: how does the code view the potential responsibilities (and hence liabilities) of its members in the event of the "off-nominal" events it now believes are possible?
As with revolutionary inventions of the past, like the horseless carriage and manned airplanes, no amount of handwringing by pundits late to the game will see a technology of such great promise banned. That said, new technologies bring with them the need for revising old laws. Early cars and planes, for instance, led to the creation of newfangled things like "traffic laws" and the Federal Aviation Administration. The drone industry's code of conduct points toward key issues that effort will have to tackle, so it must be viewed as something more than just public relations. But it also shows the long way we have to go in establishing actual policy and real laws.