Bumble’s ‘Private Detector’ is finally here to protect users from unwanted nudes.
After announcing the feature earlier this year, Bumble is now rolling it out to users.
For those who missed the initial announcement, Private Detector is an artificial intelligence-powered feature that automatically detects nude images and blurs them. It then alerts the recipient to the potentially inappropriate image. Users can decide whether to view or block it, and they can also report the image to Bumble.
The dating platform claims the number of nude images shared through it is "minimal," but acknowledges that users appreciate the feature. Further, Bumble says users reported over 15 percent of all nude photos sent to women and over 5 percent of those sent to men, and that it banned the users responsible for sending them.
Bumble co-founder Whitney Wolfe Herd has worked with Texas state lawmakers over the last year to develop a bill that makes sharing unsolicited lewd pictures a crime. The bill recently passed the Committee on Criminal Jurisprudence, and the governor of Texas signed it into law, with the law taking effect on September 1st, 2019.
Private Detector joins a suite of safety features on Bumble, including a ban on guns and other weapons in profile pictures, a ban on hate speech in profiles, the ability to call and video chat without sharing phone numbers or personal data, and photo verification to eliminate catfishing. Additionally, Bumble has a global moderation team to help handle user needs.