Dating app Bumble to ask users to report AI-generated images

By: Nastya Bobkova | 10.07.2024, 02:13

Bumble is aiming to make it easier for its users to report profiles created using artificial intelligence (AI). The dating and social networking platform has added a new option called "Use AI-generated photos or videos" to its fake profile reporting menu.

Here's What We Know

According to a survey of Bumble users, 71% of Gen Z and millennial respondents want to see restrictions on the use of AI-generated content on dating apps. The same share, 71%, also find it unacceptable to see AI-generated photos of people in places they have never been or doing activities they have never done.

Why It Matters

Beyond being off-putting, fake profiles can lead to significant financial losses. In 2022, the US Federal Trade Commission received reports of romance scams from nearly 70,000 people, with losses from these scams totalling $1.3 billion. Many dating apps are stepping up security measures to protect their users from fraud and other dangers, and the use of AI to create fake profiles is emerging as another threat.

Earlier this year, Bumble released a tool called "Deception Detector" that uses AI to detect fake profiles. It has also introduced an AI-based tool that shields users from unwanted nude images. This year, Tinder rolled out its own approach to profile verification in the US and the UK.

Source: StartupNews.fyi