This intervention, like that of our Swedish colleague, is in response to your questions. Regarding the first question, my delegation agrees with what seems to be the consensus: legal reviews do not ensure compliance, but they are a very important measure.
Regarding the second question, we do see a risk of divergent outcomes in legal reviews and of the fragmentation posed in your question. We definitely want to avoid this kind of fragmentation, and we agree with our Algerian colleague’s desire for unified and universal rules. This is precisely what animates the concerns that we have tried to explain with the label “meaningful human control.” We see the effort to develop a new standard of meaningful human control as leading to fragmentation and even conflict with existing norms. Instead, as our Dutch colleagues have emphasised, all states have already accepted IHL as universal and applicable norms. The issue, of course, is that it is not perfectly clear how existing IHL norms apply to this novel context. From our perspective, the ideal vehicle for avoiding the risk of fragmentation of IHL and divergent understandings would be an international instrument that clarifies how universally accepted IHL norms apply in the specific context of weapons with autonomous functions. This more detailed articulation of the primary rules of IHL as applied in this context, together with strengthened legal reviews, would promote consistency across legal reviews and national practice and avoid fragmentation.
Regarding the third set of questions, I would make three points. First, the legal reviews themselves do not depend on a definition. In US military practice, at least, we seek to review all weapons regardless of whether they would be characterised as LAWS or weapons with autonomous functions. Second, for the purposes of international exchanges of good practices on legal reviews, we think it would be helpful to have a clearer understanding of the scope of this exchange. We should focus on weapons systems with autonomous functions in target identification, selection and engagement, and we should ensure that the scope includes such weapons systems with AI (artificial intelligence) capabilities. Third, we wanted to endorse the point that a number of delegations (the Philippines, the Netherlands) have made about understanding technology. It is not so much that national definitions will help the review process at the national level; rather, a better understanding of the technical characteristics and the technology is critical. We need to know and understand the technology to conduct the legal review effectively, and it benefits all of our discussions to be informed about the technical and military aspects.
Statement by the United States under Agenda item 5(5) (9 March 2023) (2nd intervention, transcript)