Does Snapchat Enable Child Predators?

Child sexual abuse imagery (CSAI) is a growing issue — child exploitation reports spiked during the pandemic as young users spent more time online.

ParentsTogether, a national parent group, has gathered more than 100,000 signatures from parents across the United States. The group reports that Snapchat is used by 92% of social media users ages 12-17 and receives 1.4 billion video views every day. The petition’s message to Snapchat reads:

“Snapchat must do better to protect children from grooming, sexual abuse, and exploitation on Snapchat. Snapchat must immediately commit to proactively using PhotoDNA to look for both photos and videos of child sexual abuse material and reporting all material to law enforcement and the National Center for Missing and Exploited Children.”

Some recent examples of child sexual abuse incidents that have taken place on Snapchat include:

  • Three Ohio men raped an unconscious teen girl in a hotel room and shared the video on Snapchat.
  • A high school coach in New Mexico used Snapchat to extort sexual videos from several girls as young as 14.
  • A Cleveland man posed as a therapist and blackmailed a 13-year-old girl into sending him sexual videos and photos.
  • A Virginia man was arrested for running a “sextortion” ring on Snapchat, coercing children into sending sexually explicit material.
  • A Nebraska man posted a video of himself having sex with a teen girl on Snapchat.
  • A Florida man was arrested for sending CSAM videos to children on Snapchat.
  • A Pennsylvania man filmed a sexual video with a 15-year-old without her consent and shared it with her friends, some as young as 13, on Snapchat.

Snapchat Pushes Back

Contrary to what ParentsTogether says in their petition, Snapchat already uses PhotoDNA technology to scan for inappropriate images.

A Snapchat spokesperson tells Parentology that Snapchat images are scanned on Snap’s own servers against a “hash bank” provided by the National Center for Missing and Exploited Children (NCMEC). A “hashed” photo or video has been given a unique digital fingerprint that can be used to identify matching images or videos.

The hash bank is a database of CSAI hashes that’s shared industry-wide by NCMEC and other companies, according to the Snapchat spokesperson. Snap periodically downloads the database and matches it against media on its platform. Snapchat doesn’t scan all activity, but focuses on the activity where it believes CSAI could appear.
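To illustrate the general idea of hash-bank matching, here is a minimal, hypothetical sketch in Python. This is not Snap’s or Microsoft’s actual PhotoDNA code; PhotoDNA uses a proprietary perceptual hash that still matches resized or re-encoded copies, and the hash_bank entry, function names, and SHA-256 fingerprint below are stand-ins chosen purely to show the lookup pattern.

    import hashlib

    # Hypothetical hash bank: fingerprints of known CSAI, as would be
    # periodically downloaded from a clearinghouse such as NCMEC.
    # (This entry is a made-up placeholder, not a real hash.)
    hash_bank = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def fingerprint(media_bytes: bytes) -> str:
        # Compute a digital fingerprint for an uploaded photo or video.
        # SHA-256 only matches byte-identical files; a perceptual hash
        # like PhotoDNA also matches altered copies of the same image.
        return hashlib.sha256(media_bytes).hexdigest()

    def check_upload(media_bytes: bytes) -> bool:
        # True means the media matches a known entry in the hash bank
        # and would be flagged and reported rather than delivered.
        return fingerprint(media_bytes) in hash_bank

    if check_upload(b"...uploaded media bytes..."):
        print("Match: flag and report to NCMEC")
    else:
        print("No match: deliver normally")

The key design point is that only fingerprints are compared, so a platform can check uploads against known abuse material without storing or redistributing the material itself.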

That said, PhotoDNA does not work on Snapchat videos. Google has created a technology called CSAI Match, which is already in use on YouTube (which Google owns). The technology identifies videos containing CSAI, as well as users soliciting CSAI through comments or other communications.

When CSAI Match flags a video, it is reported to the NCMEC, which works with law enforcement agencies around the world. This year alone, YouTube reports removing 1,482,109 videos using that technology.

“Ensuring the safety of our Snapchat community is our top priority,” the Snap spokesperson tells Parentology. “It’s long been on our roadmap to adopt Google’s CSAI Match technology, and we expect to have it in place this fall. We appreciate and welcome parents’ feedback and look forward to continuing to strengthen our efforts to protect Snapchatters’ safety and privacy.”

“ParentsTogether is heartened by how seriously Snapchat takes these concerns and is encouraged by their commitment to implement technology to find and report videos of child sexual abuse material (CSAM) by Fall 2020,” said Amanda Kloer, Campaigns Director at ParentsTogether, in a press release.

Carlos Garcia

Carlos is a new dad, an old son, and a youth sports coach. He's also a freelance writer who loves to buy and try the latest tech products.
