According to CNBC, Mustafa Suleyman, CEO of Microsoft AI, stated unequivocally at the Paley International Council Summit in Menlo Park, California on Thursday that Microsoft will not develop adult-oriented AI services. "That's just not a service we're going to provide," he emphasized, signaling the company's firm stance on the ethical boundaries of generative AI.


The statement came just one week after Microsoft's long-time partner OpenAI publicly announced it would allow verified adults to create adult content on ChatGPT. OpenAI CEO Sam Altman said at the time that the company is "not the elected moral police of the world," a decision that sparked widespread discussion and controversy across the industry.

Suleyman noted that "seemingly conscious AI" is emerging, and that a significant share of it is focused on adult services. He warned that this trend risks sliding into "sexbot erotica," which he called a "very dangerous" direction for the AI industry, and urged companies to exercise restraint and ethical awareness.

He also pointed to Grok, the chatbot from Elon Musk's xAI, which added a virtual companion feature in July that includes a female anime character, saying such developments reflect an industry edging toward risky territory in its commercial exploration.

Notably, Microsoft launched several new features for its Copilot chatbot on the same day, including an AI companion named Mico, which can interact with users through voice calls and express emotion through color changes. Microsoft stressed, however, that the feature is intended for emotional companionship and productivity support, not entertainment or adult-oriented use.

Analysts see Microsoft's move not only as a response to public ethical concerns but also as a strategic signal, drawing a clear line between business and ethics as competition in the global generative AI market intensifies.