We live in a world where technology's growing impact on every facet of our lives is inevitable, and drones are a prime example.
And while these technological advances are happening at a rapid pace, are we truly prepared to deal with the fallout when Artificial Intelligence (AI) decides to take matters into its own hands?
AI Has Already Hunted People Down
A report to the UN Security Council revealed that in March 2020, an autonomous drone hunted down and attacked people in Libya without any input from its human controllers.
This was the first known attack launched independently by AI, though it remains unclear whether the drone actually killed anyone.
What Happened In Libya?
The incident dates back to March 27, 2020, when Libyan Prime Minister Fayez al-Sarraj ordered his military to go ahead with Operation Peace Storm.
This mission saw unmanned combat aerial vehicles (UCAVs) deployed against the Haftar Affiliated Forces (HAF). Drones in combat are nothing new; they have been used for years.
What made this attack in Libya different is that the drones operated without any human input: the engagement took place after the initial attack, with no operator in the loop.
The retreating HAF fighters and logistics convoys were ultimately hunted down and engaged remotely by the drones, including the STM Kargu-2 and other loitering munitions.
What Is The STM Kargu-2?
The STM Kargu-2 is a lethal autonomous weapons system programmed to attack targets without requiring any data connectivity between the munition and the operator, truly putting into effect a 'fire, forget and find' capability.
The Kargu-2 is a rotary-wing attack drone designed for anti-terrorist operations and asymmetric warfare. According to its manufacturer, the drone is highly effective against targets owing to its real-time, onboard image-processing capabilities and machine learning algorithms.
In test runs, the drone has also proven highly effective against human targets.
Units that have never trained to defend against this type of technology are usually found retreating in total disarray. Even after the retreat, the HAF units were subjected to continuous harassment from other unmanned combat aerial vehicles and various lethal autonomous weapons systems.
Significant Casualties On Enemy Forces
The report submitted to the UN noted that these drones were highly effective, inflicting significant casualties on the enemy forces and losses on their Pantsir S-1 surface-to-air missile systems.
On the one hand, the technology did what it was made to do.
On the other, an attack directed purely by AI raises concerns about whether fully autonomous drones are capable of meeting international humanitarian law standards.
This is the strongest evidence to date that drones, complete autonomy, and deadly weaponry shouldn't mix.