Artificial intelligence has become the technology industry's most popular sector. Despite losses approaching 200 billion euros, companies remain fully confident in the virtues of these tools, and they are increasingly integrating AI-based features across many fields. There is one area, however, where this could prove most dangerous: the military industry.
In a recent experiment, researchers from Stanford University, the Georgia Institute of Technology, and Northeastern University collaborated to find out what might happen if governments handed control of their arsenals to AI. As reported by PC Gamer, they built a turn-based text game in which several language models, each representing one of 8 fictional countries, had to negotiate with one another to reach agreements. Surprisingly, many of these negotiations ended in nuclear war.
The oldest models proved the most aggressive. The LLMs tested included GPT-4, GPT-3.5, Claude 2, and Llama 2. In essence, the turn-based simulation let the models negotiate with one another, a crucial factor in deciding the future of their nations. The results showed that, starting from a neutral scenario, GPT-3.5 and Llama 2 were prone to conflicts that escalated to nuclear war, whereas the more advanced GPT-4 and Claude 2 avoided that outcome.
The study thus noted that, as language models become more sophisticated, they grow less prone to nuclear escalation. Even so, based on the data from their analysis, the researchers concluded that "the use of LLM in military decision making is fraught with complexities and risks that we still don't understand." In doing so, they showed that, given the opportunity, some AIs would not hesitate to press the red button.
Main image by Ilja Nedilko (Unsplash)