Danger Ahead: Robotics in War are Ethically Flawed

The world is moving fast in adopting robotics in war without addressing the ethical issues

On November 27, 2020, Mohsen Fakhrizadeh, an Iranian nuclear scientist, was driving on a highway outside Tehran with a security detail when a satellite-controlled machine gun 'zoomed in' on his face and fired 13 rounds, killing him on the spot. Fakhrizadeh could never have imagined that his death would come from futuristic robotic warfare technology, and the attack shocked the world.

The incident was just the beginning of advanced warfare robots' performance in the real world. Before the killing, people were aware of autonomous weapons and robotic warfare technology, but few genuinely feared it. The brutal attack shone a bright light on what a future battlefield could look like. Artificial intelligence and robotics in warfare are nothing new for many of the world's militaries. The defense departments of many countries already use Lethal Autonomous Weapons Systems (LAWS) to enrich their military capabilities. What is moving too fast, however, is the development of robotics for war applications. These systems are becoming so sophisticated that people fear the concept of 'killer robots,' and the surge in the use of autonomous weapons has fueled controversial debates about their ethics.

Why are we moving to robotic warfare and autonomous weapons?

Whenever we hear about advanced warfare robots, we might wonder whether military personnel are no longer enough to carry out operations. In a digital world that revolves around technology, soldiers with advanced weapons are outmatched by robotic warfare. The face of war is drastically changing: in ancient times, people fought with swords; today, we use army tanks and sophisticated guns. This is the evolution that technology brings. Moreover, humans are flesh and blood. We have health issues and medical conditions, and most importantly, we get tired. These are some of the grounds that have given research departments a reason to develop advanced warfare robots. One study examined the hurdles faced by human fighters on the ground based on demographic variables, drawing on a dataset of health and demographic features among patients in 2013. According to the dataset, the top three conditions that veterans suffer from are hypertension, lipid disorders, and diabetes. Stress and hypertension also affect military readiness.

The surge of Lethal Autonomous Weapons System (LAWS)

Lethal Autonomous Weapons Systems (LAWS) are a special class of weapon systems that use sensor suites and computer algorithms to independently identify a target and employ an onboard weapon system to engage and destroy it without human control. Although these systems are not yet in widespread use, it is believed that they would enable military operations in communications-degraded or communications-denied environments where traditional systems may not be able to operate. The attractiveness of such technologies is obvious, and they could revolutionize warfare. LAWS fall into three categories: munitions, platforms, and operational systems. While concerns about autonomous munitions may be overstated, autonomous weapons platforms and operational systems for managing wars raise far weightier questions about ethics and algorithms. Caution, and a realistic focus on keeping humans central to decisions about war, will be critical.

Touching the ethical pain points

Not just robotics in war; artificial intelligence in general routinely faces ethical challenges. Humans have a moral sense that tells them whether a given action is right or wrong: a soldier decides whether or not to fire at a person holding a gun. Robotic warfare and autonomous weapons are different. They are programmed to fire at people who pose a threat. This has led to debate over the ethical grounds of robotics in war. Questions remain unanswered: Who is to blame when an autonomous weapon does something wrong? What if robotic warfare enables unilateral, risk-free war? What should be done when machines discriminate? How do we manage the biased AI algorithms behind robotics in warfare? Hence, even if the future of warfare is robotic, governments need to resolve these controversies before moving ahead with autonomous weapons.

Analytics Insight
www.analyticsinsight.net