Articles  |  April 1, 2017

More than a Game: Decision Support Systems and Moral Injury

Research Paper by Chaplain (Colonel) James L. Boggess, United States Army Reserve, United States Army War College.

Abstract:

Computers are becoming an ever-increasing part of the decision-making process. From managing data to help humans make informed decisions to decision support systems that develop and recommend or select courses of action, computer-enhanced decision-making is already shaping the way America fights her wars. As computers carry more and more of the decision-making burden, humans are left to wrestle with the ethical and moral issues. The potential for psychological and moral injury remains and may even grow as a result of computer-enhanced decision-making. As decision support systems make decision-making seem more like a game, humans will have less time and may be less likely to fully review courses of action for ethical and moral compatibility. In some instances, especially if collateral damage results in civilian casualties, the human “in, on, or over” the decision loop may feel personally responsible for the action and, as a result, find that their ethical and moral core has been violated, leading to psychological or moral injury. As the military continues to pursue artificial intelligence, automated, and autonomous systems, equal care must be taken to ensure these systems operate within approved ethical and moral boundaries and that their operators are properly trained in ethical decision-making.

About the Author

Chaplain (Colonel) Boggess holds a Doctor of Ministry from Erskine Theological Seminary. He has both a Master of Arts in Theological Studies and a Master of Divinity from Assemblies of God Theological Seminary, as well as a Master of Strategic Studies from the US Army War College. He also has four units of Clinical Pastoral Education.