Microsoft employee complains to the FTC about the safety of the Copilot Designer AI image generator

By: Bohdan Kaminskyi | 07.03.2024, 16:34


Shane Jones, an artificial intelligence engineer at Microsoft, has filed a complaint with the US Federal Trade Commission (FTC) over the safety of the Copilot Designer AI image generator.

Here's What We Know

According to Jones, who has worked at Microsoft for six years, the company "refused" to remove Copilot Designer despite his repeated warnings that the tool could create harmful content.

In his letter to the FTC, the engineer pointed out that during testing, the AI generated "demons and monsters along with terminology related to abortion rights, teenagers with assault rifles, sexualised images of women in violent scenes, and underage drinking and drug use".

In addition, Jones claims, the tool generated images of Disney characters against a backdrop of destruction in the Gaza Strip and wearing Israeli army uniforms.

The engineer has been trying to warn Microsoft about problems with the DALL-E 3 model that powers Copilot Designer since December of last year. He published an open letter on LinkedIn, but the company's lawyers demanded that it be deleted.

"Over the last three months, I have repeatedly urged Microsoft to remove Copilot Designer from public use until better safeguards could be put in place. Again, they have failed to implement these changes and continue to market the product to 'Anyone. Anywhere. Any Device'," Jones stated.

Microsoft said it is committed to addressing employee concerns in accordance with company policy and has conducted an internal investigation into the issues the engineer raised.

Earlier, Microsoft CEO Satya Nadella called Copilot Designer-generated explicit images of singer Taylor Swift "alarming and terrible", promising better control systems.


Source: The Verge