Lethal artificial intelligence: what is the danger of “weapons of the future”?


Humanity has been at war since the beginning of time. At first, the conflicts were primitive: tribes of ancient people fought with sticks and stones over land and food.

In modern wars, we already use drones, missiles with homing systems, electronic warfare systems and other high-tech weapons. But even this is not the limit.



Right now, dozens of the world's largest defense companies are working on autonomous systems capable of carrying out military tasks without human intervention. No, we are not talking about anthropomorphic robots like the Terminator from the film of the same name. Engineers are trying to create a software algorithm that will eventually be integrated into existing systems and remove humans from their control.

For example, drones are already widely used in armed conflicts. Today, each individual drone is controlled by an operator, and it is a human who decides how the UAV's capabilities are employed on the battlefield.

An AI-based system, by contrast, could perform the same task autonomously, relying on preloaded data about a potential target and its own analysis of the surrounding situation.

This may sound like science fiction, yet some countries are already close to fielding such systems in their armed forces.

The only question is where such an evolution of armies will lead.

It is clear that AI-based systems have a number of advantages. In particular, autonomous weapons completely remove the human factor, which is often the cause of additional civilian casualties and destruction. In addition, robotic systems would spare the lives of military personnel, who would no longer have to take a direct part in hostilities.

However, “lethal artificial intelligence” also has significant drawbacks. One of them is the risk of such a system being hacked, with utterly unpredictable consequences. In addition, mass production of autonomous weapons will inevitably make them accessible to non-state actors and terrorist organizations.

The UN is currently discussing the possibility of limiting or even banning such weapons on ethical and other grounds. However, the process is moving so slowly that, before any such decision is made, more than one army in the world may already possess AI-enabled weapons.

2 comments
  1. 15 December 2023 11:31
    Labor made a man out of a monkey.
    Artificial intelligence will make a monkey out of a person.
  2. DO
    15 December 2023 19:03
    The main advantage of autonomous drones over operator-controlled ones is that they can perform combat missions without external control and without the GLONASS positioning system, and therefore:
    - can operate hundreds of kilometers deep behind enemy lines, where it is difficult or impossible to maintain radio channels for operator control;
    - are invulnerable to enemy radio jamming.

    The risk of such a system being hacked is part of the risk of autonomous weapons becoming accessible to non-state actors and terrorist organizations: illegal use of autonomous weapons requires overcoming the organizational and/or technical protections against unauthorized access to the programming and activation of autonomous drones.