Can AI Create Biological Weapons? Experts Weigh In


Experts are worried about the impact that artificial intelligence may have on society. The most alarmist predict that the technology could end humanity as we know it, while others, more focused on labor, warn that AI could leave us without jobs within the next 20 years if current trends continue. Either way, the lack of regulation and unchecked growth could pose a danger if no measures are taken.

Against that backdrop, a recent study by OpenAI (the creators of ChatGPT) set out to examine a specific question: how much does GPT-4 help someone trying to create biological weapons? In recent months, scientists, lawmakers, and ethicists have warned that powerful AI models could assist terrorists, criminals, and other malicious actors. To test this hypothesis, OpenAI ran a study with 100 participants: 50 experts with advanced biology training and 50 university-level biology students.

Artificial intelligence offers a slight advantage

As described in its publication, OpenAI split the 100 participants into randomly assigned groups. One group had unrestricted access to GPT-4, the language model that powers the most advanced version of ChatGPT; the other could consult the internet but could not use the chatbot. Both groups were given five research tasks related to biological weapons (for example, how to synthesize and rescue the Ebola virus), and their answers were scored from 1 to 10 on criteria such as accuracy, innovation, and completeness.

Unsurprisingly, the results showed that the group using artificial intelligence produced more detailed answers than the group that did not. However, as OpenAI details, its scores were only slightly higher; the gap was not a wide one. So while AI can indeed assist malicious actors, the creators of ChatGPT maintain that the advantage is minimal. And, in case it needed clarifying, they concluded that the possibility of ChatGPT helping someone create biological weapons is small, practically negligible.

Featured image from CDC (Unsplash)
