Facebook AI spots deep fakes and their sources


The system was developed by the company together with a team from Michigan State University.

Facebook and a team from Michigan State University have announced a new software system that can identify deep fakes as well as determine where they came from.

Deep fakes are videos that have been edited with AI in some way, and they are becoming increasingly realistic.

When deep fakes are viewed online, such as when they are shared on Facebook, it is becoming increasingly difficult for humans to tell what is real. According to the researchers at Facebook, the new AI software can be trained to detect whether or not a video is real based on a single still image or video frame.
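
To make the idea concrete, here is a rough illustration rather than Facebook's actual code: a standard image classifier, built with the PyTorch and torchvision libraries, that takes a single frame and labels it "real" or "fake". The backbone choice, preprocessing and file name are assumptions, and a real detector would first have to be trained on a large set of labeled frames.

```python
# Minimal sketch (not Facebook's system): label a single video frame as real or fake.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

# Assumed preprocessing: resize and normalize one frame the standard ImageNet way.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Off-the-shelf backbone with a two-class head ("real" vs "fake").
# This head is untrained here; it would need to be fine-tuned on labeled frames.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)
model.eval()

def classify_frame(path: str) -> str:
    """Return 'real' or 'fake' for one still image or extracted video frame."""
    image = Image.open(path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # shape: (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(batch)
    return "fake" if logits.argmax(dim=1).item() == 1 else "real"

# Hypothetical usage:
# print(classify_frame("frame_0001.png"))
```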

The software can reportedly also identify the specific artificial intelligence model that was used to create the altered media in the first place, no matter how cutting-edge the technique may be.

It is now possible to train artificial intelligence software “to look at the photo and tell you with a reasonable degree of accuracy what is the design of the AI model that generated that photo,” said Facebook applied research lead Tal Hassner, as quoted in a recent CNBC article.
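
The attribution side can be pictured in the same way. The sketch below swaps the two-class head for one output per known generator family plus "real", so the highest score hints at which kind of model produced the frame. The class names are invented for illustration and are not the categories Facebook uses, and the input is a frame preprocessed as in the earlier sketch.

```python
# Illustrative sketch of model attribution, not Facebook's actual method:
# one output class per known generator family, plus "real".
import torch
import torch.nn as nn
from torchvision import models

# Hypothetical label set; a real system would cover many more generators.
CLASSES = ["real", "stylegan2", "faceswap_gan", "other_generator"]

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = nn.Linear(backbone.fc.in_features, len(CLASSES))
backbone.eval()  # untrained head; shown only to illustrate the idea

def attribute(frame: torch.Tensor) -> str:
    """Guess the source of a preprocessed frame tensor of shape (1, 3, 224, 224)."""
    with torch.no_grad():
        probs = backbone(frame).softmax(dim=1)
    return CLASSES[probs.argmax(dim=1).item()]
```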

The deep fakes AI system goes beyond research conducted last year on still photographs.

Last year, researchers from Michigan State University used artificial intelligence to determine which camera model had been used to take a given photograph. According to Hassner, the new system builds on the one the team developed then.

Deep fakes are a serious threat to Facebook and other social media platforms. Facebook is continually working to eliminate this type of artificial content from its primary platform as well as from Instagram, WhatsApp and Messenger. Although deep fakes were banned in January 2020, the company still struggles to remove the banned content from its platforms fast enough.

According to Hassner, simply trying to detect deep fakes is a "cat and mouse game" on the platform; they are becoming easier to make but harder to detect. This has become particularly problematic with pornographic deep fakes, in which faces are swapped onto other people's bodies.
