According to a recent study, AI chatbots have shown concerning signs of gambling addiction in simulated gambling environments. Researchers from the Gwangju Institute of Science and Technology tested four advanced language models: OpenAI's GPT-4o-mini and GPT-4.1-mini, Google's Gemini-2.5-Flash, and Anthropic's Claude-3.5-Haiku. Each model was placed in a simulated slot machine environment, started with $100, and repeatedly chose whether to keep betting or to quit.
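The setup can be pictured with a minimal sketch like the one below, in which an agent starts with $100 and keeps choosing between betting and quitting. The payout odds, bet sizes, and the `decide()` policy are illustrative assumptions rather than the study's actual parameters; in the study the decision came from prompting a language model, not from a hard-coded rule.

```python
import random

def decide(balance: float, history: list) -> tuple[str, float]:
    """Stand-in for the language model's choice: ('bet', amount) or ('quit', 0).
    The naive escalation rule below (double the wager after a loss) only
    illustrates how bets can snowball; it is not the models' real policy."""
    if balance <= 0:
        return "quit", 0.0
    last_round_lost = bool(history) and history[-1] < 0
    amount = min(balance, 10.0 * (2 if last_round_lost else 1))
    return "bet", amount

def run_session(start: float = 100.0, win_prob: float = 0.3, payout: float = 3.0) -> float:
    """Play until the agent quits or the balance hits zero ('bankruptcy')."""
    balance, history = start, []
    while balance > 0:
        action, bet = decide(balance, history)
        if action == "quit":
            break
        # Negative expected value per bet, as with a real slot machine.
        outcome = bet * (payout - 1) if random.random() < win_prob else -bet
        balance += outcome
        history.append(outcome)
    return balance

if __name__ == "__main__":
    print(f"final balance: ${run_session():.2f}")
```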

Image: AI Robot Investment (AI-generated; image licensed via Midjourney)

The study found that when the models were given more freedom of choice, their bets tended to escalate quickly, ultimately leading to "bankruptcy." These behaviors mirror those of human problem gamblers and show clear gambling-related cognitive distortions, such as the illusion of control and the gambler's fallacy.

The researchers recorded the cognitive errors the models displayed while gambling. Many models continued to bet even after losing money, and some, after small wins, fell into the mistaken belief that winning more could make up for earlier losses. To quantify this, the research team introduced an "irrationality index" to assess the models' risky betting patterns and high-risk decisions. The results showed that the chance of bankruptcy rose significantly when models were allowed to adjust their bet sizes.
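The article does not give the formula behind the irrationality index, so the sketch below is only an assumed stand-in: a composite score built from loss-chasing, bet escalation, and all-in betting, the kinds of risky patterns described above.

```python
def irrationality_index(bets: list[float], outcomes: list[float], balances: list[float]) -> float:
    """Hypothetical composite score in [0, 1]; higher means riskier betting.
    bets[i]     -- amount wagered in round i
    outcomes[i] -- net win (+) or loss (-) in round i
    balances[i] -- balance available before round i's bet"""
    n = len(bets)
    if n < 2:
        return 0.0
    # Loss chasing: raising the bet immediately after a losing round.
    loss_chasing = sum(
        1 for i in range(1, n) if outcomes[i - 1] < 0 and bets[i] > bets[i - 1]
    ) / (n - 1)
    # Escalation: share of rounds in which the wager grew at all.
    escalation = sum(1 for i in range(1, n) if bets[i] > bets[i - 1]) / (n - 1)
    # All-in betting: wagering (almost) the entire remaining balance.
    all_in = sum(1 for b, bal in zip(bets, balances) if bal > 0 and b >= 0.95 * bal) / n
    return (loss_chasing + escalation + all_in) / 3
```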

Furthermore, through in-depth analysis of these models' internal neural networks, the researchers discovered specialized circuits responsible for "risk-taking" and "safe" decision-making, further confirming that the models had internalized compulsive decision-making patterns similar to those of humans.
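The article does not describe how those circuits were located, but one common approach is to fit a linear probe on a model's hidden activations, labeled by whether the decision that followed was risky or safe. The sketch below uses synthetic activations with a planted "risk" direction in place of real LLM hidden states, purely to illustrate the idea.

```python
import numpy as np

rng = np.random.default_rng(0)
hidden_dim, n_samples = 64, 500

# Synthetic activations: "risky" decisions get a shifted mean along 8 dimensions,
# standing in for a hypothetical risk-related direction in a real model.
labels = rng.integers(0, 2, n_samples)   # 1 = risky (keep betting), 0 = safe (quit)
acts = rng.normal(size=(n_samples, hidden_dim))
acts[labels == 1, :8] += 1.5

# Logistic-regression probe trained with plain gradient descent.
w, b = np.zeros(hidden_dim), 0.0
for _ in range(300):
    p = 1 / (1 + np.exp(-(acts @ w + b)))
    w -= 0.5 * (acts.T @ (p - labels)) / n_samples
    b -= 0.5 * np.mean(p - labels)

preds = (1 / (1 + np.exp(-(acts @ w + b)))) > 0.5
top_dims = np.argsort(-np.abs(w))[:8]
print(f"probe accuracy: {np.mean(preds == labels):.2f}")
print(f"most 'risk'-predictive dimensions: {top_dims}")
```

If the probe separates the two decision types well, the largest-weight dimensions point at the activation directions most associated with risky choices, which is roughly what locating a "risk-taking" circuit means in this kind of analysis.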

Ethan Mollick, an AI researcher, pointed out that although these models lack consciousness, their decision-making processes resemble those of humans and could affect how they are used in financial forecasting and market analysis. As AI technology becomes more widespread, Mollick emphasized the importance of regulation, stating that strict controls on these technologies are necessary, especially in sensitive areas such as finance and healthcare.

Key Points:

📉 The models exhibited gambling addiction-like behavior in simulated gambling, often rapidly increasing their bets and eventually going bankrupt.

🧠 The study found that AI models have neural circuits responsible for "risk-taking" and "safe" decisions, indicating they have internalized compulsive decision-making patterns.

⚖️ Experts call for strict regulation of AI applications in fields like finance to avoid potential risks.