Discord has banned the distribution of child abuse images generated by artificial intelligence
Discord has updated its policy to explicitly prohibit child sexual abuse material (CSAM) generated by artificial intelligence.
Here's What We Know
Discord's vice president of trust and safety, John Redgrave, said the platform is expanding its rules on generative AI that can create fake content. Specifically, the updated policy prohibits the creation and distribution of AI-generated CSAM images, as well as the sexualisation of children in text chats.
Last month, The Washington Post reported that AI-generated CSAM images had been spreading across the internet in recent months, and that Discord was one of the platforms involved. Discord hosted several integrations that let users generate pictures directly in chats, and users frequently used them to produce sexually themed images.
However, a Discord spokesperson said that work on the policy update had been underway since the last quarter of 2021, and that the new rules are not a reaction to any recent reports.
Source: NBC