Understanding Lethal Autonomous Weapon Systems (LAWS): What they are, and what makes them dangerous
Military conflicts across the globe have shown that artificial intelligence (AI) has become a game-changer and force multiplier on the battlefield. From drones and robot dogs to mosquito-sized unmanned aerial vehicles, AI-powered weapons have changed the way wars are fought, information is collected and analysed, and defence equipment of various kinds is deployed or destroyed.
While drones and robots require some level of human intervention, there is a class of weapons that can conduct military operations without human guidance once switched on. These are Lethal Autonomous Weapon Systems (LAWS)—military technologies capable of independently searching for, selecting, and engaging targets after activation, with no further human input.
While there is no recorded use of LAWS on any battlefield, there is a widespread belief, especially in countries like the United States, that such weapons are important for enhancing conventional military deterrence. According to the US Congress, the country's policy does not prohibit the development or deployment of LAWS.
Countries such as the UK, China, Israel, Russia, South Korea, and Türkiye are also reportedly investing in building these systems.
Why these could be dangerous
According to the International Committee of the Red Cross (ICRC), autonomous weapons are those able to "independently select and attack targets, i.e., with autonomy in the 'critical functions' of acquiring, tracking, selecting and attacking targets." While these weapons could be extremely useful and could be deployed in various environments—air, land, sea, underwater, and even space—they also present serious ethical and humanitarian concerns.
When machines act without direct human oversight, assigning responsibility for mistakes or unlawful actions becomes extremely complicated, especially in cases of unintended civilian casualties.
There is also the problem of adhering to the principles of international humanitarian law: distinguishing between combatants and civilians, using force proportionally, and attacking only legitimate military targets. Beyond that, there is the fear of these systems falling into the hands of non-state actors.
Where does India stand?
Lieutenant General Manjinder Singh, South Western Army Commander, said during a seminar on 'Next Generation Combat-Shaping Tomorrow's Military Today' at Jaipur Military Station on Wednesday that strong ethical standards, human oversight, and adherence to international humanitarian law are essential when developing LAWS.
During discussions on the December 2023 UNGA resolution on LAWS, India said the Convention on Certain Conventional Weapons is the appropriate forum to discuss issues relating to lethal autonomous weapons, with a view to striking a balance between military necessity and humanitarian imperatives. "Noting that a substantial body of work has been done and continues to be done by the Group of Governmental Experts associated with that Convention on emerging technologies in lethal autonomous weapons, he said that work must be built upon, including to deepen a common understanding of definitions and characterization of lethal autonomous weapons," a statement from the UN read.