System Characteristics and Human Involvement with Lethal Autonomous Weapons Systems (LAWS)
The following represents a summary of my contributions to the UN CCW GGE on Lethal Autonomous Weapons Systems (LAWS) and thinking regarding the characteristics of and human involvement with these systems. Information on this GGE can be found here.
For information regarding my participation in the November 2017 LAWS discussion, where I discussed the drivers for the deployment of lethal autonomous weapons systems, here is the link. Official information on the November 2017 meeting can be found on the Chairman’s website here.
April 2018 GGE LAWS Meeting
While I was not able to attend the April meeting, I followed the discussion, which moved on from understanding how LAWS would be deployed (November 2017) to understanding the characteristics of such weapon systems.
The tables below, which I created, attempt to consolidate the thinking across the member nations and participating organizations that submitted reports on their perspectives of the characteristics of LAWS. I also made a deliberate effort to organize them in line with the majority of the perspectives and to use vocabulary similar to that of the reports.
The notes on the sides refer to some member nation reports and their wording, as well as to some organization reports that contributed to the discussion.
August 2018 GGE LAWS Meeting
In August 2018, the GGE on Lethal Autonomous Weapons Systems gathered for its final meeting of the year, picking up where it left off in April, when the discussions had focused on the characteristics of weapons systems.
In this meeting, the Group built on the April discussions by focusing on human involvement in LAWS throughout their development and use.
To kick off the week of discussion, the GGE Chairman, Ambassador Amandeep Gill, opened with a panel of experts to shed light on the human involvement aspect of LAWS. The panel included Dr. Knut Dörmann, Lt. Col. Christopher Korpela, Dr. Gautam Shroff, Professor Anthony Gillespie, and Dr. Lydia Kostopoulos, as well as written input from Dr.-Ing. Konstantinos Karachalios, who was not able to be physically present.
The tables below are what I presented to the GGE on this panel. Just like the tables I prepared as a thought exercise for the April focus on characteristics, these tables consolidate the categorizations and language in the GGE reports submitted by member nations and the NGOs contributing to the discussions.
The second table below aims to highlight the importance of the security of the weapon system itself, as it remains susceptible to cyber attacks, data manipulation, and physical intrusions.
Possible Guiding Principles Affirmed by the GGE on August 31, 2018
The GGE concluded late in the evening on Friday August 31st, 2018 and agreed upon the following possible guiding principles (for emerging technologies in the area of lethal autonomous weapons systems). The full report can be accessed here.
It was affirmed that international law, in particular the UN Charter and international humanitarian law as well as relevant ethical perspectives should guide the continued work of the Group. Noting the potential challenges posed by emerging technologies in the area of lethal autonomous weapons systems to international humanitarian law, the following were affirmed without prejudice to the result of future discussions:
1. International humanitarian law continues to apply fully to all weapons systems, including the potential development and use of lethal autonomous weapons systems.
2. Human responsibility for decisions on the use of weapons systems must be retained since accountability cannot be transferred to machines. This should be considered across the entire life cycle of the weapon system.
3. Accountability for developing, deploying and using any emerging weapons system in the framework of the CCW must be ensured in accordance with applicable international law, including through the operation of such systems within a responsible chain of human command and control.
4. In accordance with States’ obligations under international law, in the study, development, acquisition, or adoption of a new weapon, means or method of warfare, determination must be made whether its employment would, in some or all circumstances, be prohibited by international law.
5. When developing or acquiring new weapons systems based on emerging technologies in the area of LAWS, physical security, appropriate non-physical safeguards (including cyber-security against hacking or data spoofing), the risk of acquisition by terrorist groups and the risk of proliferation should be considered.
6. Risk assessments and mitigation measures should be part of the design, development, testing and deployment cycle of emerging technologies in any weapons systems.
7. Consideration should be given to the use of emerging technologies in the area of lethal autonomous weapons systems in upholding compliance with IHL and other applicable international legal obligations.
8. In crafting potential policy measures, emerging technologies in the area of lethal autonomous weapons systems should not be anthropomorphized.
9. Discussions and any potential policy measures taken within the context of the CCW should not hamper progress in or access to peaceful uses of intelligent autonomous technologies.
10. CCW offers an appropriate framework for dealing with the issue of emerging technologies in the area of lethal autonomous weapons systems within the context of the objectives and purposes of the Convention, which seeks to strike a balance between military necessity and humanitarian considerations.
---
Dr. Lydia Kostopoulos’ (@LKCYBER) work lies at the intersection of people, strategy, technology, education, and national security. She addressed United Nations member states on the military effects panel at the Convention on Certain Conventional Weapons Group of Governmental Experts (GGE) meeting on Lethal Autonomous Weapons Systems (LAWS). Her professional experience spans three continents, several countries, and multicultural environments. She speaks and writes on disruptive technology convergence, innovation, tech ethics, and national security. She lectures at the National Defense University and the Joint Special Operations University, is a member of the IEEE-USA AI Policy Committee, and participates in NATO’s Science for Peace and Security Program; during the Obama administration she received the U.S. Presidential Volunteer Service Award for her pro bono work in cybersecurity. In an effort to raise awareness of AI and ethics, she is working on a reflectional art series. www.lkcyber.com