Microsoft blocked controversial tooltips in Copilot after an engineer's complaint

By: Bohdan Kaminskyi | 11.03.2024, 18:15

Microsoft has restricted certain types of queries in its generative AI tool Copilot after a company engineer complained that the tool systematically generated images containing violence, sexual content and other inappropriate material.

Here's What We Know

Copilot now blocks prompts containing terms like "pro choice," drug names, or mentions of guns in the hands of children. The service also warns that repeatedly breaking the rules may result in account restrictions.

The changes come shortly after Shane Jones, who has worked at Microsoft for six years, wrote a letter to the US Federal Trade Commission (FTC) complaining about the lack of proper security measures in the company's AI-powered products.

The engineer had been testing Copilot Designer since December and found that even relatively neutral prompts led the system to generate shocking images, including demons eating babies and characters in violent scenes.

While the new filters block many of the controversial prompts, Copilot users can still access some violent scenarios, as well as generate images of copyrighted characters.

Microsoft said it continues to improve controls over AI-generated content and strengthen security filters to prevent abuse.

The company previously faced heavy criticism over explicit images of celebrities generated with Copilot Designer.

Source: CNBC