Google releases dataset of 3,000 deepfakes to help researchers


In an effort to help researchers working on deepfake detection, Google has released a massive dataset of deepfake videos.

Deepfake videos can be used to manipulate people and spread disinformation during elections, as they often look strikingly authentic. They have also been used to create convincing fake pornographic videos featuring celebrities.

The video below is a deepfake. Actress Amy Adams appears on the left in the original video, while on the right, actor Nicolas Cage's face has been superimposed on Adams to create the deepfake.

The dataset includes 3,000 deepfakes that Google generated by filming actors in a variety of scenes and then applying publicly available deepfake generation methods to the footage, Google said in a recent blog post.

Researchers will now be able to use the dataset to train automated detection tools, making them more effective and accurate at spotting these kinds of videos.

Google said in the post that it hopes to add more videos to the dataset.

“Since the field is moving quickly, we’ll add to this dataset as deepfake technology evolves over time, and we’ll continue to work with partners in this space,” Google said.

“We firmly believe in supporting a thriving research community around mitigating potential harms from misuses of synthetic media, and today’s release of our deepfake dataset in the FaceForensics benchmark is an important step in that direction.”

Source: Google Via: Engadget
