As platforms get better at detecting child abuse videos, they’re finding more of them
Five years ago, fewer than 350,000 videos were reported. Last year, a record-breaking 70 million images and videos were reported to the National Center for Missing and Exploited Children. Many of these images and videos were reported multiple times as they spread across various platforms.
Facebook, the largest social media platform, reported close to 60 million photos and videos. However, in an interview with The Verge, Facebook said that not all of that content is considered “violating” and that only about 29.2 million items met the official criteria.
- Google reported 3.5 million total photos and videos across 449,000 reports.
- Imgur reported 260,000 photos and videos based on about 74,000 reports.
- Notably, the number of reports and the number of images discovered weren’t always proportional, as the following figures show.
- Dropbox only made about 5,000 reports in 2019 but found more than 250,000 photos and videos.
- Apple was one of the lower-reporting big companies, flagging 3,000 images and no videos.
- Amazon, meanwhile, was almost entirely absent from the list.
Facebook has shared the algorithms it uses so that content portraying child abuse can be removed more quickly. “The size and expertise of our team, together with our sophisticated technology, have made us industry leaders in detecting, removing and reporting these images, and thwarting people from sharing them,” Antigone Davis, Facebook’s global head of safety, said in an emailed statement to The Verge.
Cloud services are a significant part of why it’s so difficult to catch everything.
Because of this, even with better detection, it’s still not possible to fully map the problem of online videos of child sexual abuse. According to the Times, some cloud storage services, including Amazon, don’t scan for illegal content. And content on Apple’s messaging app is encrypted, so Apple can’t scan it for illegal material.
With privacy a growing concern for the general public, platforms face a difficult balancing act: detecting and removing this content accurately without introducing unnecessary friction for users. Facebook is considering a move toward encryption, and it is taking a lot of flak for it as a result.
A bill is being drafted to create a National Commission on Online Child Exploitation Prevention, which would reduce legal protections for websites while establishing rules for detecting and removing content that exploits children — potentially including limits on encryption.
Reports from Bloomberg and The Information say that Sen. Lindsey Graham (R-SC) is behind the bill, currently dubbed the Eliminating Abusive and Rampant Neglect of Interactive Technologies (or EARN IT) Act. It would amend Section 230 to make companies liable for state prosecution and civil lawsuits over child abuse and exploitation-related material unless they follow the commission’s best practices. They wouldn’t lose Section 230 protections for other content like defamation and threats.
Techdirt founder Mike Masnick also notes that Section 230 doesn’t cover federal crimes — so the Justice Department could already prosecute companies if they’re enabling abuse. This bill would just let it write a new set of rules by threatening much broader liability.
While platforms have made progress in staying vigilant about content spreading across their services, no long-term solution appears to be in sight. The progress being made seems minimal compared to the volume of abusive content being shared.
Technology has been a double-edged sword in the battle against abusive imagery of children. The way human experts interact with these algorithms will change as well. “Right now, the algorithm spits out results, and a detective checks if these images depict abuse. But their feedback isn’t shared with the algorithm, so it doesn’t learn anything. In the future, the process will be a two-way street where humans and computers help each other improve,” says Duin.