
Microsoft’s New Video Authenticator Could Help Weed Out Dangerous Deepfakes

The tool believes the image on the right is fake with 100 percent confidence.
Credit: Microsoft

Deepfakes can be fun. Those videos that perfectly inserted Jim Carrey into Jack Nicholson’s role in The Shining were endlessly entertaining. As the upcoming U.S. election closes in, however, analysts expect that deepfakes could play a role in the barrage of misinformation making its way out to potential voters.

This week, Microsoft announced new software called Video Authenticator. It’s designed to automatically analyze videos to determine whether or not algorithms have tampered with the footage.

The software analyzes video in real time, breaking it down frame by frame. In a way, it works much like familiar photographic forensic techniques: it looks for inconsistencies along edges, which can manifest as subtle changes in color or as small pixel clusters (called artifacts) that lack color information. You would be hard-pressed to notice them with your own eyes, especially when dozens of frames are zipping by every second.
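To make the frame-by-frame idea concrete, here is a minimal sketch in Python of how such a scan could be wired up, assuming OpenCV is available. The heuristic (strong luminance edges that carry almost no color gradient) is purely illustrative and the function names are hypothetical; this is not Microsoft's actual Video Authenticator algorithm.

```python
# Illustrative only: a crude frame-by-frame artifact scan, not Microsoft's method.
import cv2
import numpy as np

def frame_artifact_score(frame: np.ndarray) -> float:
    """Fraction of strong luminance edges with almost no chroma gradient,
    a rough stand-in for the 'pixel clusters that lack color information'
    described above."""
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    luma = ycrcb[:, :, 0]
    cr = ycrcb[:, :, 1].astype(np.float32)

    edges = cv2.Canny(luma, 100, 200) > 0
    if not edges.any():
        return 0.0
    chroma_grad = (np.abs(cv2.Sobel(cr, cv2.CV_32F, 1, 0)) +
                   np.abs(cv2.Sobel(cr, cv2.CV_32F, 0, 1)))
    return float((chroma_grad[edges] < 1.0).mean())

def scan_video(path: str, sample_every: int = 5) -> list:
    """Return one artifact score per sampled frame of the video at `path`."""
    cap = cv2.VideoCapture(path)
    scores, i = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if i % sample_every == 0:
            scores.append(frame_artifact_score(frame))
        i += 1
    cap.release()
    return scores
```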

As the software performs its analysis, it spits out a percentage, or confidence score, indicating how sure it is about an image’s legitimacy. Faking an individual frame is relatively simple at this point with modern AI techniques, but movement introduces an extra level of challenge, and that’s often where the software can glean its clues.
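Continuing the sketch above, per-frame scores could be rolled up into the kind of single confidence figure the article describes. The weights below are arbitrary assumptions; the frame-to-frame "jump" term simply reflects the idea that movement between frames is where manipulation tends to give itself away.

```python
# Illustrative aggregation of per-frame scores into a video-level confidence.
import numpy as np

def video_confidence(frame_scores) -> float:
    """Combine per-frame artifact scores into a 0-100 'likely manipulated'
    confidence for the whole clip."""
    s = np.asarray(frame_scores, dtype=np.float32)
    if s.size == 0:
        return 0.0
    mean_score = float(s.mean())                                # average per-frame evidence
    jumps = float(np.abs(np.diff(s)).mean()) if s.size > 1 else 0.0
    # Temporal instability (sudden changes between frames) is weighted in
    # alongside the per-frame average; both weights are assumptions.
    combined = 0.7 * mean_score + 0.3 * min(jumps * 5.0, 1.0)
    return float(np.clip(combined, 0.0, 1.0)) * 100.0
```

Chained with the earlier sketch, `video_confidence(scan_video("clip.mp4"))` would yield one score for the whole clip.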

Research has shown that errors typically crop up when subjects appear in profile, quickly turn more than 45 degrees, or have another object rapidly pass in front of their face. While those situations are relatively common in the real world, they rarely arise during candidate speeches or video calls, which are prime targets during an election season.

To train the Video Authenticator, Microsoft relied on the FaceForensics++ dataset, a collection of manipulated media built specifically to help train people and machines to recognize deepfakes. It contains 1.5 million images drawn from 1,000 videos. Once the software was built, Microsoft tested it on the DeepFake Detection Challenge Dataset, which Facebook AI created as part of a contest to build automated detection tools. Facebook paid 3,426 actors to appear in more than 100,000 video clips manipulated with a variety of techniques, from deep-learning methods to simple digital face swaps.
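For a sense of what training on a frame-level dataset like this can look like, here is a compact sketch using PyTorch and torchvision. The "real/" and "fake/" folder layout, the ResNet-18 backbone, and every hyperparameter are assumptions for illustration; Microsoft has not described Video Authenticator's architecture or training recipe in this detail.

```python
# Illustrative frame-level deepfake classifier training loop (not Microsoft's setup).
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

def train_frame_classifier(frames_dir: str, epochs: int = 1) -> nn.Module:
    """Train a binary real-vs-fake classifier on extracted frames.

    Assumes a hypothetical layout: frames_dir/real/*.jpg and frames_dir/fake/*.jpg.
    """
    tfm = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])
    data = datasets.ImageFolder(frames_dir, transform=tfm)
    loader = DataLoader(data, batch_size=32, shuffle=True)

    model = models.resnet18(weights=None)
    model.fc = nn.Linear(model.fc.in_features, 2)   # two classes: real, fake
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()

    model.train()
    for _ in range(epochs):
        for images, labels in loader:
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            opt.step()
    return model
```

Evaluating a classifier like this on clips from a different dataset than it was trained on, as Microsoft did with the DeepFake Detection Challenge data, is what exposes the generalization gap discussed further below.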

Facebook’s challenge ended earlier this year and the winning entrant correctly identified the deepfakes 82 percent of the time. The company says it’s already using internal tools to sniff out deepfakes, but it hasn’t publicly given any numbers about how many have shown up on the platform.

For now, average users won’t have access to Microsoft’s Video Authenticator. It will be available exclusively to the AI Foundation as part of its Reality Defender 2020 program, which allows candidates, press, campaigns, parties, and social media platforms to submit suspected fake media for authentication. Down the road, though, these tools could become more widely available to the public.

Another big tech company, Google, has been hard at work on ways to detect face swaps; last year it built its own dataset using paid actors, much as Facebook did. While Google doesn’t have public plans for a specific deepfake-detection site for everyday users, it has already implemented some image-manipulation tools as part of its Image Search function, which is often the first step in trying to figure out whether a photo is fake.

Microsoft didn’t share specific numbers about the success rate its AI-driven tool achieved, but the benchmark isn’t all that high when you survey the top-performing players. The winner of Facebook’s challenge achieved its 82 percent success rate on familiar data; once it was applied to new clips taken from the real world with fewer controlled variables, its accuracy dropped to around 60 percent. Canadian company Dessa saw a similar pattern: it did well on the Google-produced videos shot under controlled conditions, but with videos pulled from other places on the web it struggled to hit the 60 percent mark.

We still don’t know how big a role deepfakes will play in the 2020 election, and with social media platforms doing their own behind-the-scenes detection, we may never know how bad it could have been. Maybe by the next election, computers will be better at recognizing the handiwork of other automated manipulators.

Source: https://www.popsci.com/story/technology/microsof-video-authenticator-deepfakes/
