Gendering the Legal Review of New Means and Methods of Warfare

This piece is part of a series on Gender, Conflict, and International Humanitarian Law (IHL), co-hosted by the ICRC and Just Security. In the coming months, the series will feature contributions from a range of experts exploring the humanitarian, legal, and military implications of – and challenges raised by – the integration of a gender perspective into the interpretation and application of IHL.

It has been well established that conflict exacerbates previously existing gender inequalities while often entrenching gender roles (see e.g., other articles in this series here). For instance, research has shown that women have less access to health services during armed conflict than men, especially in protracted conflicts. The lack of access is particularly notable in the context of maternal mortality, as more than 50 percent of maternal deaths worldwide occur in settings affected by armed conflict or natural disasters.

This example highlights how women, men, boys, and girls are impacted differently in times of armed conflict, when hostilities kill or injure civilians, damage or destroy essential services, and contribute to population displacement. Therefore, as outlined in the International Committee of the Red Cross (ICRC)'s recent report, Gendered Impacts of Armed Conflict and Implications for the Application of IHL, we need to take a gendered view of international humanitarian law (IHL) in order to better reduce civilian harm in times of armed conflict.

This article focuses on how a gender perspective should apply to the legal review of new means and methods of warfare during their development and use (in particular, those that use artificial intelligence and/or machine learning), as required by Article 36 of Additional Protocol I to the Geneva Conventions (AP I). Different weapons are used for different purposes and have differential impacts on combatants and civilians alike, so gender dimensions play a role when considering which type of weapon or system to develop.

The Application of Article 36 of AP I through a Gender-Sensitive Perspective

The Geneva Conventions and their Additional Protocols establish the principle of non-discrimination, which prohibits any adverse distinction between individuals on the basis of sex. This principle can be found in Common Article 3 of the Geneva Conventions, Article 12 of the First and Second Geneva Conventions, and Article 2(1) of Additional Protocol II, among other provisions. However, scholars note that only adverse distinctions are prohibited, which can produce formal equality without equity. Organizations like the ICRC therefore stress that IHL requires substantive, not merely formal, equality (in the context of the interpretation of Common Article 3, see paras. 610-616).

In IHL, the role and status of women has been traditionally relegated to one of victims and child-bearers, putting an emphasis on the protection of women’s honor and relying on assumptions about inherent vulnerability, which promote outdated, stereotyped notions of women as fundamentally weak (see, for example, Article 27(2) of the Fourth Geneva Convention, which specifically invokes women’s “honour” as a ground for protection – historically, a coded reference to sexual purity). This traditional view of gender roles, in turn, centers men as combatants and warriors, while potentially under-protecting civilian men and boys.

In relation to the development and use of new technologies, Article 36 of Additional Protocol I obliges States to review new weapons, means, or methods of warfare, which includes physical and digital systems. In order to comply with Article 36, States must ensure that new weapons systems are capable of being used in compliance with IHL, particularly in the context of the conduct of hostilities.

However, the article makes no explicit reference to the need to undertake a gender-sensitive review of the employment of the new weapons, means, or methods of warfare which the State in question studies, develops, acquires, or adopts. Unsurprisingly, the Commentaries of 1987 also do not contain any specific references to the need for considering gender aspects. Regrettably, even the Guide to the Legal Review of New Weapons, Means and Methods of Warfare, published by the ICRC in 2006, does not include any references to gender – despite acknowledging that “the reviewing authority will have to take into consideration a wide range of military, technical, health and environmental factors.”

The consideration of gender factors as part of the legal review of new weapons, means, or methods of warfare is crucial both to minimize civilian casualties and for States to meet their humanitarian law obligations related to the general prohibitions or restrictions on weapons, means, and methods of warfare under Article 36. As the new ICRC report notes, “sex-specific harms arise from the use of certain weapons, and gender analysis of civilian casualty data indicates that women, men, boys and girls are killed at different rates depending on the context and weapon used.” Examples of this reality have been studied in particular with respect to explosive weapons, nuclear weapons, mines, and chemical and biological weapons. In the case of explosive weapons, while men make up the highest proportion of casualties, women, who often become the main income providers for their households as an indirect effect of these weapons, are exposed to greater violence and exploitation. In the case of nuclear weapons, it has been demonstrated that women are not only biologically more vulnerable than men to the harmful health effects of ionizing radiation, but are also the most affected in terms of psychological health, displacement, social stigma, and discrimination.

The Legal Review of Artificial Intelligence Systems

The modern battlefield has seen an increase in the autonomy of weapons, with escalating aspirations toward “full autonomy”, combined with “merged heteronomy” or human-machine partnerships. Examples of such technological developments include defensive and offensive weapons. Beyond automation, the use of artificial intelligence (AI) and machine learning (ML) in the modern battlefield is on the rise, for instance to classify images from surveillance drones.

The term “artificial intelligence” describes the use of computer systems to perform tasks normally requiring human intelligence, such as visual perception or decision-making. Machine learning consists of the use of AI systems which employ large amounts of data to develop their functions without following explicit instructions, using algorithms to draw inferences from patterns in data and “learning” from experience.
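As a concrete illustration of “learning from data without explicit instructions,” the sketch below trains a minimal nearest-centroid classifier in Python. All data, labels, and function names here are invented for illustration; real military ML systems are vastly more complex, but the principle is the same: the decision rule is inferred from examples rather than programmed by hand.

```python
# Minimal sketch of machine learning: the classification rule is never
# written explicitly; it is inferred from labelled examples.
# All data below are invented for illustration.

def train(examples):
    """Compute one centroid (average feature vector) per label."""
    sums, counts = {}, {}
    for features, label in examples:
        if label not in sums:
            sums[label] = [0.0] * len(features)
            counts[label] = 0
        sums[label] = [s + f for s, f in zip(sums[label], features)]
        counts[label] += 1
    return {label: [s / counts[label] for s in sums[label]] for label in sums}

def predict(centroids, features):
    """Assign the label whose centroid is closest (squared Euclidean distance)."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, features))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Toy training data: two clusters in a two-dimensional feature space.
examples = [([1.0, 1.0], "A"), ([1.2, 0.8], "A"),
            ([5.0, 5.0], "B"), ([4.8, 5.2], "B")]
centroids = train(examples)
print(predict(centroids, [1.1, 0.9]))  # falls near cluster A
print(predict(centroids, [5.1, 4.9]))  # falls near cluster B
```

The key point for Article 36 purposes is that everything the “model” knows comes from its training examples: whatever biases or gaps those examples contain are inherited by every prediction.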

The increasing use of AI and ML tools on the battlefield, together with the fact that these tools can be applied to many different tasks, including the collection and processing of information used in targeting, means that States must include these tools – and their underlying training data – in their Article 36 review processes. And, as outlined above, to be complete these reviews of AI- and ML-enabled means and methods of war must incorporate gendered impacts. The following paragraphs highlight considerations for reviewing the development and use of such tools with a gender-based approach in the context of AP I Article 36.

Gendering AI-Enabled Weapons

Gender-related considerations must be included starting from the study and development phases of new technologies of war, because technology is not neutral. Rather, it is political, and exists in a relationship of co-production with culture, politics and law. This means that implicit assumptions about which populations are worthy of (and legally entitled to) protection can be incorporated (intentionally or inadvertently) into the data used to train an algorithm, which can result in gendered outcomes, including the deepening of gender biases. Even more insidiously, cultural assumptions about the supposed objectivity, superiority, and infallibility of algorithmic decision-making can further entrench these biases, creating a feedback loop of inequity. These impacts have been seen in the context of unmanned aerial vehicles used for targeting, where especially in signature strikes (which target a person based only on patterns of life), it has been deemed sufficient that the persons targeted “fit into the category ‘military-aged males’, who live in regions where non-State armed groups operate, and ‘whose behavior is assessed to be similar enough to those of terrorists to mark them for death.’” In the aftermath of these strikes, data on the gender, age, and assessed status – whether combatant or civilian – of the casualties can be reincorporated into algorithmic training datasets, reinforcing gendered disparities. This example highlights the dangerous practice of conflating the identification of combatants with gender. When developing AI-supported systems to aid targeting decision-making, it is crucial to assess how the algorithm will impact genders differently.
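The feedback loop described above can be sketched in a few lines of Python. All numbers, group names, and the simple frequency-based “model” are invented for illustration; real targeting systems are far more complex, but the mechanism by which biased labels harden through retraining is the same in kind.

```python
# Illustrative sketch (invented numbers): a labelling practice that conflates
# "military-aged male" with "combatant" propagates through a frequency-based
# model and is amplified when the model's own outputs feed back into training.

from collections import Counter

def fit(records):
    """Estimate P(label='combatant' | group) from (group, label) records."""
    totals, combatant = Counter(), Counter()
    for group, label in records:
        totals[group] += 1
        combatant[group] += label == "combatant"
    return {g: combatant[g] / totals[g] for g in totals}

# Initial dataset: casualties labelled under a biased convention that
# presumes military-aged males are combatants.
records = (
    [("military-aged male", "combatant")] * 18
    + [("military-aged male", "civilian")] * 2
    + [("other", "combatant")] * 3
    + [("other", "civilian")] * 17
)
model = fit(records)
print(round(model["military-aged male"], 2))  # 0.9 under the biased labels

# Feedback loop: new casualties are labelled by the model itself
# (threshold 0.5), then appended to the training data.
for _ in range(3):
    new = [("military-aged male",
            "combatant" if model["military-aged male"] > 0.5 else "civilian")] * 10
    records += new
    model = fit(records)

print(round(model["military-aged male"], 2))  # the bias has hardened, not corrected
```

Each retraining round pushes the estimated probability higher, which is exactly the “feedback loop of inequity” the text describes: the system’s own classifications become the evidence for its next round of classifications.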

In situations of armed conflict, gendered impacts of hostilities on civilians remain under-documented. These data gaps, unless corrected for, are translated into training datasets that lack a gender perspective. Bridging this data gap is important because, for instance, knowing the patterns of movement or the duties women or men perform in different areas or under different circumstances can enable an estimate of how exposed these different categories are to a particular kind of weapon. In this sense, the collection of sex- and gender-disaggregated data, as well as of information on the expected behavior, roles, capacities, and needs of men and women, plays a crucial role.
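To make the point about disaggregation concrete, the sketch below (with entirely invented figures) summarises the same incident log twice: once in aggregate, where no gendered pattern is visible, and once disaggregated by sex and age group, where differential impacts appear.

```python
# Sketch of why disaggregation matters (all figures invented): the same
# incident log, summarised with and without sex/age disaggregation.

from collections import Counter

incidents = [
    {"sex": "female", "age_group": "adult", "harm": "injured"},
    {"sex": "male",   "age_group": "adult", "harm": "killed"},
    {"sex": "female", "age_group": "child", "harm": "killed"},
    {"sex": "male",   "age_group": "adult", "harm": "killed"},
    {"sex": "female", "age_group": "adult", "harm": "injured"},
    {"sex": "male",   "age_group": "child", "harm": "injured"},
]

# Aggregate view: totals only, with no gendered pattern visible.
print(Counter(rec["harm"] for rec in incidents))

# Disaggregated view: harm broken down by sex and age group.
disaggregated = Counter((rec["sex"], rec["age_group"], rec["harm"])
                        for rec in incidents)
for key, count in sorted(disaggregated.items()):
    print(key, count)
```

A dataset recorded only in the aggregate form can never be disaggregated afterwards; the fields must be captured at collection time, which is why the text stresses sex- and gender-disaggregated data collection.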

Accurate data are essential for the State to effectively consider the environment in which any particular weapon is intended to be used. Indeed, a proper analysis of the specific context of each armed conflict or operation, including its gender dimensions, should be the starting point for the planning and conduct of all military operations. Any algorithm supporting the targeting process should incorporate data on how the characteristics, expected use, and foreseeable effects of the weapon can impact the civilian population – data which must be disaggregated by sex and gender in order to capture all relevant impacts. For example, in some contexts, due to gender inequalities, women can be much more affected by the harmful health effects of certain weapons, especially the ones which have effects requiring long-term after-care, due to their reduced access to health services. To consider these aspects, data collection and analysis processes must ask “how/if gendered differences in status and functions in society create different vulnerabilities to specific types of weapons or to specific use of weapons”, and “how/if gendered differences in, for example, freedom of movement, and access to resources and services create different vulnerabilities to specific types of weapons or to specific use of weapons.” Afterwards, the gender-based assessment should be integrated into the rules of engagement or operating procedures associated with the weapon which the AI system supports.

The steps above are necessary, but not sufficient, for States to meet their Article 36 responsibilities when developing new AI-enabled means and methods of war. In addition to data quality and integration, considering who carries out the review is crucial. Currently, the majority of new technologies of war are developed in the Global North, in industries dominated by white men. Likewise, the military sector is a predominantly male space. These demographics impact how priorities are weighed, obligations met, and data collected, as outlined in the ICRC report. Therefore, conducting effective Article 36 reviews requires active efforts to correct for, and ultimately reverse, these realities and to promote gender parity in the military and technological sectors. Moreover, experts who have particular training on gender dynamics in military contexts, such as gender advisors and gender focal points, must be included in the legal review, since gender parity in itself does not guarantee the inclusion of a gender perspective. It is crucial that all key stakeholders and decision-makers (civilian and military) who need to apply IHL understand how gender factors impact the application of the law and ensure that this perspective is embedded in the legal review of new means and methods of warfare.

Posted in Just Security, 23/08/2022