From Instagram to TikTok, social media has been flooded with videos from Doublicat / REFACE, a new app that allows users to place their faces onto the image of someone in an existing GIF or video. Want to be Britney Spears in her “Toxic” music video? How about Robert Downey Junior in Iron Man? No problem — it takes just a few seconds to get a vid that will impress and amuse your friends. But is the Doublicat / REFACE app safe?
Doublicat / REFACE Privacy Concerns
First, let’s clear up some confusion. While the general public refers to the app as Doublicat, the URL is Doublicat.com, and many of the videos are watermarked with the Doublicat name and logo, it’s now officially called “REFACE: face swap videos,” and the Doublicat URL redirects to reface.app.
“Doublicat is now REFACE!” it announces in the Apple and Google stores, where the app is currently ranked #17 and #14, respectively. “We believe now it’s much easier to search and remember. And our new name communicates better what the app stands for. Enjoy refacing & stay tuned!”
Here’s an example of the new watermarked version, from Instagrammer Robbie Tursi-Masick (known as WonderRobbie on Instagram) embracing his Wonder Woman and Princess Leia fantasies.
Writer Andy Moser documented his journey with the app on Mashable, back when it was still called Doublicat. Here he is as Jennifer Aniston and Henry Cavill’s Superman.
Super fun (pardon the pun), but are there legitimate reasons to be concerned about online security, privacy, and the threat of deepfake videos? Here’s the breakdown.
Concerns about Doublicat / REFACE actually stem from last year’s worries over the Russian-owned FaceApp, which aged a person’s photo with startling results. FaceApp required access to a user’s entire camera roll, and there was concern this would allow bad actors to see sensitive information, use the image to trigger a Face ID login, and more.
The fear stemmed mostly from the fact that this was a Russian-owned company, and though FaceApp released statements saying “most” photos uploaded to the app are held on “remote servers” for about 48 hours before being deleted, some experts remained concerned. However, as with most things in tech, FaceApp soon faded from the spotlight when the next big app appeared.
Similar fears surround Doublicat / REFACE, however REFACE is registered in the United States. It’s for ages 13 and older, and if a parent has concerns about their child using the app they can easily contact the company at email@example.com.
The company also collects your facial feature data separately from your photos “only to provide you with the face-swapping functionality of REFACE. Please note, we collect the facial feature data that is not biometric data.” In other words, the data it collects can’t be used for Face ID-style authentication.
Photos you upload are kept on REFACE for up to 24 hours after the editing session before being deleted. The facial feature data “is stored on the REFACE’s server for a limited period of 30 calendar days after your last use of the REFACE application.”
They also make a point of calling out what they won’t use your image for:
REFACE does not use your photos and facial features for any reason other than to provide you with the face-swapping functionality of REFACE. In no way will REFACE use your photos and facial features for face recognition, as REFACE does not introduce the face recognition technologies or other technical means for the processing of biometric data for the unique identification or authentication of a user.
However, there is one more concern that Doublicat / REFACE raises among tech and security watchdogs: the threat of deepfake videos.
Risks of Doublicat / REFACE Deepfake Videos
There is a concern that Doublicat / REFACE can be used to create deepfake videos. This is where the person in an original video has their face or lips replaced with someone else’s, making it appear as though they said or did something that never happened. Deepfakes can also alter the original video footage slightly, such as when a doctored video of House Speaker Nancy Pelosi made her appear drunk or impaired. This video garnered millions of views and brought the concept of deepfakes to public attention.
“Deepfakes tend to promote conspiracies,” Caroline Knorr, Senior Parenting Editor for Common Sense Media, told Parentology. “Topics that tend to be lightning rods, controversial issues, and anything that’s a popular topic of discussion could be a potential subject for a deep fake.”
The use of altered videos has also led US intelligence officials to issue a warning ahead of the 2020 elections. According to CBS News, this year’s Worldwide Threat Assessment said “adversaries and strategic competitors would likely attempt to use deepfakes” to influence campaigns in the US, as well as for other nefarious reasons.
“What if somebody creates a video of President Trump saying, ‘I’ve launched nuclear weapons against Iran, or North Korea, or Russia?’ We don’t have hours or days to figure out if it’s real or not,” Hany Farid, a professor of computer science at the University of California Berkeley, told CBS News. “The implications of getting that wrong are phenomenally high. What you have to understand about this technology, is that it’s not in the hands of few, it’s in the hands of many.”
The risk of that happening with Doublicat / REFACE videos is low. The app only provides videos of well-known pop culture moments, not official press conferences or political events. However, if REFACE can create something this good for a phone app, it wouldn’t be hard for foreign powers or radical groups to either hijack this technology or build their own.
There are some tricks to spotting a deepfake that everyone should know.
- The audio doesn’t sync perfectly with the speaker’s lips.
- Shadows and lighting look unusual.
- The content doesn’t match what you know to be true.
This last point is probably the most important. If you suspect a video is fake — whether or not it fits your own beliefs about the person in the video — do some research before reposting. Google “Is the [insert name here] [insert topic here] video real?” and look for recent news coverage. Also check legitimate news outlets and fact-checking sites like Snopes.com. Two minutes of extra research can often stop a deepfake video before it goes viral.