Child sexual abuse imagery (CSAI) is a growing issue — child exploitation reports spiked during the pandemic as young users spent more time online. In a new petition, parents are demanding that Snapchat, owned by Snap Inc., use technology to scan for abusive videos to protect young users from child predators.
ParentsTogether, a national parent organization, has gathered more than 100,000 signatures from parents across the United States. The group reports that Snapchat is used by 92% of social media users ages 12-17 and receives 1.4 billion video views every day. The petition’s site shows their message to Snapchat:
“Snapchat must do better to protect children from grooming, sexual abuse, and exploitation on Snapchat. Snapchat must immediately commit to proactively using PhotoDNA to look for both photos and videos of child sexual abuse material and reporting all material to law enforcement and the National Center for Missing and Exploited Children.”

— ParentsTogether, Snapchat Petition
There is legitimate cause for concern. The petition lists seven incidents from 2020 alone in which videos of sexually exploited children were posted to Snapchat (links below). These include a high school coach in New Mexico who extorted sexual videos from several girls as young as 14, a Cleveland man who posed as a therapist and blackmailed a 13-year-old girl into sending him sexual videos and photos, and a Nebraska man who posted a video of himself having sex with a teenage girl.
But Snapchat isn’t the only social media app where trouble lurks. New York authorities arrested 16 men last year — including a police officer, minister, and NYC school teacher — who targeted kids between the ages of 14 and 15 on social media and gaming apps like Grindr, Tinder, MeetMe, Adam4Adam, Fortnite, Minecraft, Kik, Skout, and Hot or Not.
So, what is Snapchat doing to protect its young users? Turns out, a lot, with more on the way.
Contrary to what ParentsTogether says in their petition, Snapchat already uses PhotoDNA technology to scan for inappropriate images.
A Snapchat spokesperson tells Parentology that Snapchat images are scanned on Snap’s own servers against a “hash bank” provided by the National Center for Missing and Exploited Children (NCMEC). A “hashed” photo or video has been given a unique digital fingerprint that can be used to identify matching images or videos.
The hash bank is a database of CSAI that’s shared industry-wide by NCMEC and other companies, according to the Snapchat spokesperson. The database is downloaded periodically and matched against Snap media. Snapchat doesn’t scan all activity but focuses on the areas where they believe CSAI could appear.
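The matching process described above can be sketched in a few lines of Python. This is only an illustrative stand-in: PhotoDNA itself is a proprietary *perceptual* hashing technology that tolerates resizing and re-encoding, whereas the cryptographic hash used below matches only byte-identical files. All file names and hash-bank contents here are hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Compute a digital fingerprint for a media file's bytes.

    SHA-256 stands in for PhotoDNA's perceptual hash; a real
    perceptual hash would also match visually similar copies.
    """
    return hashlib.sha256(data).hexdigest()

def scan(media_files: dict, hash_bank: set) -> list:
    """Return the names of files whose fingerprints appear in the hash bank."""
    return [name for name, data in media_files.items()
            if fingerprint(data) in hash_bank]

# A hash bank like NCMEC's is, conceptually, a set of known fingerprints.
hash_bank = {fingerprint(b"known-flagged-content")}

# Hypothetical uploads to be checked against the bank.
uploads = {
    "video_a.mp4": b"known-flagged-content",  # matches the bank
    "photo_b.jpg": b"benign-content",         # does not match
}

print(scan(uploads, hash_bank))  # ['video_a.mp4']
```

The key design point is that the platform never needs the original abusive media on hand: the hash bank distributes only fingerprints, and each upload's fingerprint is compared against that set.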
That said, PhotoDNA does not work on Snapchat videos. For video, Google has developed a technology called CSAI Match, already in use on Google-owned YouTube. It identifies videos containing CSAI, as well as users soliciting CSAI through comments or other communications.
Using CSAI Match, a flagged video is reported to NCMEC, which works with global law enforcement agencies. This year alone, YouTube reports removing 1,482,109 videos using the technology.
“Ensuring the safety of our Snapchat community is our top priority,” the Snap spokesperson tells Parentology. “It’s long been on our roadmap to adopt Google’s CSAI Match technology, and we expect to have it in place this fall. We appreciate and welcome parents’ feedback and look forward to continuing to strengthen our efforts to protect Snapchatters’ safety and privacy.”
“ParentsTogether is heartened by how seriously Snapchat takes these concerns and is encouraged by their commitment to implement technology to find and report videos of child sexual abuse material (CSAM) by Fall 2020,” said Amanda Kloer, Campaigns Director at ParentsTogether, in a press release.
Kloer said, “This type of detection and reporting is critical to keeping kids safe online, but sadly is not widespread across the tech industry. We strongly encourage all platforms that allow users to upload, share, or store images or videos to take these same steps to find and report all known CSAM.”
Snapchat Child Predators — Sources
- 3 Ohio men raped an unconscious teen girl in a hotel room and shared the video on Snapchat.
- New Mexico high school coach used Snapchat to extort sexual videos from several girls as young as 14.
- A Cleveland man posed as a therapist and blackmailed a 13-year-old girl into sending him sexual videos and photos.
- A Virginia man was arrested for running a “sextortion” ring on Snapchat, coercing children into sending sexually explicit material.
- A Nebraska man posted a video of himself having sex with a teen girl on Snapchat.
- A Florida man was arrested for sending CSAM videos to children on Snapchat.
- A Pennsylvania man filmed a sexual video with a 15-year-old without her consent and shared it with her friends, some as young as 13, on Snapchat.