Statement by the ICRC (15 April 2015)

[…]

Challenges to respect for IHL posed by predictability and reliability of the weapon

As weapon systems become more autonomous, are increasingly able to determine their own actions, and are deployed for lengthier periods and in more complex environments, the results/outcomes of their use will necessarily become less predictable.

The degree of predictability (knowing precisely what the weapon will do; the results/outcomes of its use), which also depends on the factors I have listed, is a crucial consideration both at the legal review stage and, if the weapon passes the legal review, at the operational stage, for military commanders when planning and deciding to launch attacks using AWS. The key questions at the review stage are: how can the review determine whether the AWS will operate in accordance with IHL if its performance is unpredictable? How can the risks of using such a system be evaluated, and how can those risks be mitigated?

A new weapon system with autonomy in its critical functions, like any new weapon system, must undergo a rigorous legal review, involving the requisite multidisciplinary expertise, to determine whether it can be used in accordance with IHL rules. In other words, the obligation to carry out a legal review of new weapons creates an onus on the State developing or acquiring the new weapon to show that it can be used in accordance with IHL. The ICRC encourages States that have not yet done so to establish weapons review mechanisms and stands ready to advise States in this regard. In this respect, States may wish to refer to the ICRC’s Guide to the Legal Review of New Weapons, Means and Methods of Warfare.

Reviewers must assess the weapon’s lawfulness in relation to the normal or expected circumstances of its use. This requires foreseeing how the weapon will perform in the circumstances and environment in which it is intended to be deployed (the factors I have just mentioned), based on the weapon’s design and how it actually functions.

This will enable the review to set clear parameters on the weapon’s operational use and deployment. If it is found that the weapon could be used lawfully only in limited circumstances, the limits imposed on its use must then be incorporated into the instructions and rules of engagement applying to the weapon, to ensure it is not misused.

I should stress that the permitted circumstances of use may in some cases be so limited and complex, and therefore unrealistic to apply in real-world scenarios, that it may be more appropriate to prohibit the weapon’s use altogether.

A significant challenge in reviewing the legality of an AWS will be how to test it to be sure that it will do what the human operator wants it to do; in other words, how to test the weapon’s predictability.

Predictability about the way an AWS will interact with its environment must be sufficiently high to allow an accurate legal review. As I said before, the greater the autonomy in the tasks of the weapon, the less the ability to determine whether the system will function in accordance with IHL. If, at the legal review stage, it is not possible to accurately predict, through testing or otherwise, whether or not the weapon will respect IHL, then surely it cannot pass the legal review. A key challenge in this respect is how to properly test the AWS to ensure it can be used in accordance with IHL. As we have heard, there are no standards for testing autonomous systems.

Operational decisions to use AWS

Predictability of the weapon system — again, meaning its foreseeable effects — is also relevant at the operational stage, assuming the weapon has passed the legal review. Indeed, not all legal questions will have been resolved at the legal review phase. The rules of distinction, proportionality and precautions in attack are applied contextually, on a case-by-case basis, by commanders and soldiers in the field. It is the commander who plans and decides upon the attack, using the means of warfare at his/her disposal and taking into consideration the conflict environment.

[…]

Questions for consideration of States at multilateral level

If some specific usages of a specific AWS are problematic under IHL, there may be a need for technical fixes, or limits on use, or outright prohibition of the specific weapon system.

Some argue that the faithful implementation of IHL’s general rules — including through the legal reviews of new weapons and proper training of the users of these weapons — should be sufficient to address any legal concerns regarding autonomy in the critical functions of weapon systems.

However, history has shown that when using certain weapons, the specific characteristics of the weapon, combined with the inconsistent application of the general rules of IHL to that weapon (be it at the legal review stage or at the operational stage), may reveal a need to clarify the law and ultimately develop weapon-specific rules. Regarding AWS in particular, leaving it up to each State to determine the lawfulness of the specific AWS it is developing or acquiring risks inconsistent application of IHL, with, for example, some States applying limits and others not.

What is clear is that there are too many questions regarding the interpretation of IHL rules as they apply to AWS, including issues of accountability for violations (which I do not have time to address here), to leave solely to national legal reviews. The many questions about AWS raised under IHL (not to mention the ethical and other questions) underscore the need for continued discussions in multilateral forums such as the CCW, with a view to considering the various policy and other options available to States to address the legal (and other) challenges raised by AWS.

[…]

Ms Kathleen Lawand, Panel on possible challenges to IHL due to increasing degrees of autonomy (15 April 2015)