Pain Rays and Robot Swarms: The Radical New War Games the DOD Plays

From an ethics standpoint, if we want to avoid provocation, we need to consider the unintended effects of an e-bomb. Because a warship is a complex technical system, cutting its power could indirectly harm or kill sailors aboard, and this could escalate the crisis into an actual armed conflict. But assuming that war has begun and we are engaged in self-defense, an e-bomb seems preferable to a kinetic attack that would certainly harm property and persons, if both get the same job done: stopping the warships. It may even be preferable to a cyberweapon, whose effects we can't be certain of, including their scope, immediacy, and possible proliferation "into the wild" or to civilian systems such as our own. Unlike an e-bomb, a cyberattack would also give adversaries a blueprint, or at least ideas, for designing a similar weapon to use against our systems.

From a policy standpoint, the U.S. may be worried about the proliferation of e-bombs more than that of any other technology, because we'd have the most to lose with the world's most wired military. We have a stronger self-interest in not setting a precedent here than with perhaps any other weapon system. Further, adversaries might begin to anticipate our e-bomb campaigns and implement a "dead hand" deterrence system to protect their assets--a system that automatically launches (or ceases to hold back) an attack on us in the event of a total power loss. And this co-evolution of hunter and prey makes war even more dangerous than it already was.

Untangling ethics, policy, and law

From the above scenarios, based on the NeXTech wargames, we can see that ethics, policy, and law may come to radically different conclusions. When they do converge on a solution, they often focus on different issues. Perhaps in an ideal world, there's a syzygy or alignment of the three areas: Policies and law should be ethical. The real world, though, is messy. It's difficult to pin down and integrate analysis from the three disciplines, each an art and science unto itself.

In wargaming, we saw substantial disagreements not just at the intersections of ethics, policy, and law but also within each community, adding to the complexity of the exercise. These areas of contention are important to explore: they give decision-makers a broader perspective and more options, which is crucial in a dynamic, complex world unlikely to be captured by any single viewpoint.

So it's encouraging that the U.S. and other militaries are showing more interest in these areas. War is one of the most ethically problematic areas of human life. As such, there is much humanitarian and practical value in accounting for ethics, policy, and law--especially around emerging military technologies that give rise to novel scenarios and issues. Beyond sparing civilians from harm and safeguarding human rights, a commitment to ethics and the rule of law is what sets an honorable, professional military apart from a band of mercenaries.

As we have learned from the U.S.-Vietnam War and, arguably, current drone-strike campaigns, superior technology by itself is not enough for victory. Winning "hearts and minds" matters for a lasting peace, and this is difficult to achieve if a war is prosecuted unethically or illegally. Failing to think ahead about ethics, policy, and law could also deal serious blows to national reputations and key military programs, from pain rays to drones to cyberweapons and more, all presently controversial and under debate.

What's next?

The short analyses I presented above are far from complete. The NeXTech wargames were meant to kickstart a conversation, helping us understand the work in front of us rather than attempting to anticipate every scenario and offer clear solutions.

We still need to examine the issues more fully and methodically in a "whole-systems" approach. In wargaming with law professors, JAG lawyers, policy advisers, philosophers, theologians, and other domain experts, we saw the value that their different perspectives brought to the conversation. We also saw the need to have scientists, technologists, futurists, journalists, and military officers, as well as cadets and midshipmen (who will be on the frontlines of these next-generation weapons), at the table to ensure the conversation is grounded in realism.

Noetic's NeXTech is unique in its wargaming methodology, but other efforts exist as well--by the National Research Council, the Naval Academy, and the Chautauqua Council, among others--to cross-pollinate expertise and to engage the broader public on these weighty issues, a vital part of democracy. So we already have a nice head start and, with that momentum, now just need to keep running.

Not only do ethics, policy, and legal experts believe these issues are urgent, but cultural and religious communities also want to participate. And all of these stakeholders will engage the debate with or without the participation of the defense establishment, whether government or industry. Without that participation, decision-makers lose a valuable opportunity to help frame the debate, address public fears, and make better-informed calls about a new technology and its risks. The Active Denial System, again, is just one poster child for this lesson.

It will take work to integrate ethics, policy, and law into national security planning and military technology development, particularly since many of these emerging technologies aren't here yet. But the future may arrive sooner than we think, and it always surprises us. We hope that the U.S. and other governments have the foresight and commitment to stay with this challenge. It's not a bridge too far, but one worth the effort.


Acknowledgements: Some of this research has been supported by the NeXTech wargames, The Greenwall Foundation, the U.S. Naval Academy, the Office of Naval Research, and California Polytechnic State University. I thank Keith Abney, Brad Allenby, Ben Fitzgerald, George R. Lucas, Jr., Peter W. Singer, Wendell Wallach, and John Watts for reviewing this essay. The statements expressed here are the author's alone and do not necessarily reflect the views of the aforementioned persons or organizations.

Patrick Lin is the director of the Ethics + Emerging Sciences Group at California Polytechnic State University, San Luis Obispo; a visiting associate professor at Stanford's School of Engineering; and an affiliate scholar at Stanford Law School. He is the lead editor of Robot Ethics and the co-author of What Is Nanotechnology and Why Does It Matter? and Enhanced Warfighters: Risk, Ethics, and Policy.
