We wish to begin by saying that we agree with you and with the other delegations that have spoken here this morning. We believe that there is considerable convergence on this issue. It is true that current legal review obligations under IHL apply, generally speaking, to all types of armaments, including those using emerging technologies such as autonomous functionalities.
Nonetheless, we have to recognise that there are specific challenges when it comes to legal reviews of AWS. There are technical problems with guaranteeing the reliability of such reviews — questions of automation and of the black box, for instance — and these issues are touched upon in the US paper. There are also issues relating to the functioning of these systems: they will increasingly be taking decisions, selecting and engaging targets, deciding whether or not to apply force — decisions that were previously taken by a human being, by a human operator. So we must ensure that the principle of proportionality is always respected. These, then, are the challenges that we face in terms of legal reviews.
The purpose of the reviews is to ensure the predictability, reliability, understandability and explainability of these weapons systems. That being so, it is important that we recognise that legal reviews will have to be conducted at the national level and respect confidentiality and intellectual property very strictly indeed. In the operational context of these reviews and assessments, it is important that we apply procedures — and we will have to decide in advance what these procedures are — that allow us to overcome the technical challenges within that specific context.
LAWS will clearly require that those conducting the reviews are appropriately trained and qualified. We will also have to decide how regularly such reviews should be conducted. We need to take due account of what has been suggested by Palestine and Pakistan, of the elements put forward by the US and others, including the Philippines; and indeed I would also include the proposals made here in 2022.
Indeed, throughout our work we have heard a great deal about national practice in this area: the US, Australia and the UK recalled this in their statements here this morning. As we can see from the relevant working papers, legal reviews of autonomous functionalities within armament systems must be based on a fundamentally preventive approach, and no green light should be given unless there is absolute certainty about all of the issues I have just outlined.
However, as the ICRC has already stated, it is difficult to believe that legal reviews per se will allow us to overcome all of the challenges posed by autonomous weapons systems, both now and in the future. As I say, decisions to select a target, or to decide whether or not to use force, will increasingly be taken by machines, whereas in the past they were taken by a human operator, and that is going to change the whole context of these reviews. Distinction, precaution and proportionality are clearly the key principles that will always have to be respected, and we believe that we need regulation to guarantee that within the framework of a legally binding instrument.
Statement by Mexico under agenda item 5, topic 5 (GGE LAWS, 9 March 2023) 11:24:15 (interpretation)