
Autonomous Weapon Systems: Understanding the Potential Human Rights Violations

The evolution of artificial intelligence (AI) over the years has led to the realisation of the dream of robot-human interaction. The idea of robot-human interaction on a whole new level has been a staple of science fiction novels and series since the early 1950s. The most noted work of that period was Isaac Asimov’s I, Robot[1]. The book is famous for introducing the three laws of robotics, which state that:

(i) A robot may not injure a human being or, through inaction, allow a human being to come to harm.

(ii) A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

(iii) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

Though the idea may have seemed far-fetched at the time, the rapid development of technology in recent years means that fully working autonomous machines already exist. With debates sparking globally over the legal and ethical issues raised by autonomous weapon systems[2] (“AWS” hereafter), it becomes pertinent to reconsider the importance and viability of the aforementioned three laws of robotics. This article analyses the existing conundrum surrounding AWS and their usage while envisioning a robust regulatory framework to govern their functioning and deployment.

Autonomous weapon systems: The UN Convention on Certain Conventional Weapons

AWS have already been declared a success[3], and it is only a matter of years before we see their active deployment and use on battlefields across the globe. However, concerns regarding their potential use in armed conflicts have raised many questions, not just from experts[4] and scholars[5] but also from civil society bodies[6] and other related organisations. Owing to these rising concerns, a discussion on AWS was held at the 5th Review Conference of the Member States of the United Nations (“UN” hereafter) Convention on Certain Conventional Weapons (CCW). Following the Conference, an expert body was established to deliberate upon the legal issues surrounding the potential use of AWS. These issues included questions of morality and ethics as well as the compatibility of AWS with international humanitarian law (“IHL” hereafter) and international human rights law (“IHRL” hereafter)[7]. Some believe that a complete ban on AWS is the only answer, a position easily deducible from the rise in popularity of the “Stop Killer Robots” movement. A more rational understanding, however, is that AWS are merely a technological advancement and cannot simply be “abandoned”. The better way to deal with their potential ill effects is to bring their deployment and usage strictly under international regulation.

Human rights and AWS: Understanding the legal implications

Though the legal debate surrounding AWS is primarily focused on international humanitarian law[8], it is the need of the hour to steer the discussion towards the human rights dimension of the issue. The assault that AWS could unleash on victims can be seen as a use of force by humans through AWS. It is a settled principle that human rights law applies to the use of force at all times: it is complementary to IHL during armed conflict, and where there is no armed conflict it applies to the exclusion of IHL. Thus, even though the IHL implications indeed need to be deliberated upon, the human rights angle must be the pivotal point under consideration. To understand this, we need to distinguish between three paradigms and the application of human rights law in each of them.

1. Armed conflict

The right to life and the right to dignity are two of the most important human rights to be considered in this context. As discussed above, human rights law complements IHL; however, in times of armed conflict, IHL, being “more specialised”, comes into application.[9] Yet even when IHL comes into effect, both sides retain their human rights, and the change is merely contextual. This means that during such times the provisions of IHL can be interpreted with reference to human rights.

2. Non-armed conflict

Situations can arise where the conflict in question does not meet the definition of an armed conflict because it does not take place on an established battlefield, e.g. anti-insurgency or anti-terrorism operations. In such situations IHL does not apply, and the issues must be assessed under human rights law alone. For example, the use of drones and other similar systems may fall in the same context as AWS if employed in anti-terrorist operations, and should therefore be governed by IHRL and not IHL, as an armed conflict is absent.[10]

3. Domestic law enforcement

While AWS currently have no prospect of being used domestically, the possibility of a toned-down model equipped with less lethal weapons being used for domestic law enforcement cannot be ruled out altogether. This would include deploying AWS not merely for guarding prisons but across the wider sphere of everyday law enforcement. As such usage of AWS would constitute a use of “force”, it can easily be brought within the ambit of IHRL.

Thus, three major paradigms exist, and what is common to all of them is that the use of AWS would fall within the ambit of IHRL. It is this hypothesis that helps us understand how AWS have a significant impact on human rights and why demands for international regulation are being made.

AWS and the weapon laws: The Martens Clause and its implications

International human rights law is extremely stringent when it comes to the use of force and firearms; however, it poses no restrictions or limitations on what kinds of weapons may be manufactured. The answer lies instead in IHL, which has a separate and specialised branch of weapons law that specifies which weapons may permissibly be used in armed conflicts and which may not. Article 36 of Additional Protocol I to the Geneva Conventions[11] makes it mandatory for State parties to subject new weapons to a review:

to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party.

It is important to note that the words “or by any other rule of international law” indicate that, to pass scrutiny under an Article 36[12] review, the weapon in question must also comply with international human rights law, which prominently includes the rights to life and dignity.

The Martens Clause has formed part of the laws of armed conflict since its first appearance in the Preamble to the 1899 Hague Convention (II) with Respect to the Laws and Customs of War on Land[13], and states that:

in cases not covered by the law in force, the human person remains under the protection of the principles of humanity and the dictates of the public conscience.

The wording clearly places special emphasis on the “principles of humanity” and “the public conscience”. At a point in history where people have finally begun to understand the importance of human rights, it is only fitting to reason that weapons above a certain level of autonomy would offend the “principles of humanity” and “the public conscience”, since for an AWS these two parameters would likely be reduced to a mere matrix of factual data. Further, the clause also implies, among other things, that the absence of an explicit prohibition in no way means that the use of such weapons is permitted.

Right to life and dignity: Analysing the impact

Article 6(1) of the International Covenant on Civil and Political Rights (ICCPR)[14] states that “every human being has the inherent right to life. This right shall be protected by law. No one shall be arbitrarily deprived of his life”. While IHL has certain parameters such as “collateral damage” or “combatant’s privilege” to justify the use of force, such concepts are entirely alien to IHRL. In their place, IHRL relies on parameters such as “necessity” and “proportionality”. The former indicates that the use of force shall be a last resort, while the latter limits the force used to the maximum necessary to achieve a specific legitimate purpose. Further, Basic Principle 9 of the UN Basic Principles on the Use of Force and Firearms by Law Enforcement Officials, which deals specifically with firearms, clearly states that the “use of firearms may only be made when strictly unavoidable in order to protect life”.

The doctrine of self-preservation, which allows police officers to use lethal force in situations of grave danger, is not applicable to AWS: a machine does not qualify as a human, and hence a danger to it poses no danger to “human life”. The very fact that the kill list is prepared by a machine is highly incongruous with IHRL, which clearly points towards the premise that “the final decision to use lethal force must be reasonable and taken by a human”.

Article 1 of the Universal Declaration of Human Rights[15] (UDHR) provides that:

All human beings are born free and equal in dignity and rights. They are endowed with reason and conscience and should act towards one another in a spirit of brotherhood.

Though the ICCPR[16] does not contain the right to dignity as a separate right, it is a constitutive part of a number of the rights contained in that treaty. The point made in the preceding paragraph shows that death by data matrix treats people as interchangeable, inanimate entities rather than as human beings possessing inherent dignity. Critical decisions, especially those to deploy deadly force, must be taken only after due consideration by a “human being” who has rationally analysed the situation, concluded that there is no alternative in the specific case, and therefore holds responsibility for the outcome of that final decision. Thus, there exists no doubt that both the right to life and the right to dignity are gravely violated by the use of AWS.[17]

AWS and human rights: The future

After the above deliberations, we are compelled to agree with the views of Dr Akbar Nasir Khan[18]. He has often advocated the view that sustainability is a key yardstick for assessing the effectiveness of a policy. As far as AWS and related systems are concerned, they have been on the receiving end of endless criticism, primarily on account of human rights violations. The UDHR has become part of customary international law and is therefore applicable in both wartime and peacetime. The rights to life and dignity have, moreover, become part of jus cogens over time. These considerations and developments pose a serious question about the legitimacy of AWS mechanisms. Technology will keep developing and will keep encroaching on the boundaries of human control. It is up to us to decide whether to retain human control over life-and-death decisions or to relinquish it; what must be understood is that, once lost, such human control will be impossible to regain. The international community needs to appreciate this peril of AWS and come together, after weighing the potential and the drawbacks, to deliberate peacefully and tactfully on the future of AWS, for any decision made will have a huge impact on human life in the years to come.


† Student, Bachelor of Law at Integral University, India; Research Analyst, Centre for New Economics Studies, O.P. Jindal Global University, India, e-mail: rameezrazaofficial@gmail.com.

†† BA LLB at National University of Study and Research in Law (NUSRL), Ranchi, and the Coordinator of Think India, Ranchi, e-mail: raj.nusrl@gmail.com.

[1] Isaac Asimov, I, Robot (Bantam Books 2004).

[2] Lethal Autonomous Weapons Systems, Future of Life (4-3-2021, 09:09 P.M.) <https://futureoflife.org/lethal-autonomous-weapons-systems/?cn-reloaded=1>.

[3] Alcides Eduardo dos Reis Peron and Rafael de Brito Dias, “No Boots on the Ground”: Reflections on the US Drone Campaign through Virtuous War and STS Theories, 40(1) Contexto Internacional, (2018) 53-71.

[4] Ibid.

[5] Akbar Nasir Khan, The US Policy of Targeted Killings by Drones in Pakistan, 12(1) IPRI Journal (2011) 21-40.

[6] Human Rights Watch, Q&A: US Targeted Killings and International Law (7-3-2021, 06:16 P.M.) <http://www.hrw.org/news/2011/12/19/q-us-targeted-killings-and-international>.

[7] K.J. Heller, One Hell of a Killing Machine, Signature Strikes and International Law, 11 JICJ (2013) 91.

[8] Ibid.

[9] Christopher Drew, Drones are Weapons of Choice in Fighting Qaeda, The New York Times (11-3-2021, 10:43 P.M.) <http://www.nytimes.com/2009/03/17/business/17uav.html>.

[10] Peter Bergen and Katherine Tiedemann, The Year of the Drone: An Analysis of US Drone Strikes in Pakistan, 2004-2010, Pak Tea House (12-3-2021, 09:33 A.M.) <http://pakteahouse.wordpress.com/2010/03/02/the-year-of-the-drone-bypeterbergen-and-katherine-tiedemann/>.

[11] Protocol Additional to the Geneva Conventions of 12-8-1949, and Relating to the Protection of Victims of International Armed Conflicts (Protocol I), 8-6-1977, 1125 UNTS 3.

[12] Ibid.

[13] International Committee of the Red Cross, Hague Convention (II) with Respect to the Laws and Customs of War on Land, 29-7-1899.

[14] UN General Assembly, International Covenant on Civil and Political Rights, United Nations, Treaty Series, Vol. 999, p. 171.

[15] UN General Assembly, Universal Declaration of Human Rights, 10-12-1948, General Assembly Resolution 217 A (III) <http://www.scconline.com/DocumentLink/BNEbfDpI>.

[16] Supra note 14.

[17] US Drone War Delivers Results, But at What Price?, Dawn (Print) – Islamabad (10-1-2010) <https://www.dawn.com/news/911327/us-drone-war-delivers-results-but-at-what-price>.

[18] Supra note 5.
