Bailey Calfee
Sep 12, 2023

ChildFund PSA calls out AI’s role in the proliferation of child sexual abuse material online

The campaign features a predator who describes how “there’s never been a better time to be a monster.”

(Photo credit: ChildFund International / YouTube)
AI offers endless possibilities for working more efficiently and uncovering new ideas. It has also made it much easier to create child sexual abuse material (CSAM).
 
In a disturbing new campaign titled #TakeItDown, the nonprofit ChildFund International illustrates just how easy it is for predators to “hide in plain sight” thanks to the internet and tech companies’ limited responsibilities when it comes to reporting and taking down CSAM. 
 
As reported by the Washington Post in June, AI-generated CSAM has increased month-over-month since AI tools became widely available to the public in fall 2022. And while attorneys general from every U.S. state have signed a petition asking Congress to study the issue and create safeguards against the proliferation of this content, few barriers currently prevent it from being generated or distributed.
 
A video spot, created in partnership with social impact agency Wrthy, features a predator whose unassuming face, shown as he speaks with his kids and colleagues, quickly transforms into that of a pale monster when he sits down at his computer.
 
 
“If you ask me, there’s never been a better time to be a monster,” he says after describing how technology has allowed predators the increased ability to review, rate and recommend CSAM. 
 
The video is part of a multimedia campaign that also includes a widget allowing viewers to add their voices to demands for action from policymakers, as well as a mini-documentary. The short film features Sonya Ryan, whose daughter was murdered by an online predator, and Jim Cole, a retired Homeland Security Investigations special agent with firsthand knowledge of the technology tools that exist but go unused.
 
“Instead of being a place for learning, playing and connecting with friends and family, the internet has become a place rife with ways to exploit and abuse children,” said Erin Kennedy, ChildFund International’s vice president of external affairs and partnerships, in a press release.
 
Tech companies are not legally required to search for CSAM shared or stored on their platforms. While they are required to report it once made aware of its existence, they are rarely punished for failing to remove it quickly.
 
The National Center for Missing and Exploited Children noted in a report earlier this year that its CyberTipline receives around 80,000 reports each day, with the majority of reported CSAM residing on the open web (as opposed to the dark web, which is much harder to access).
 
This campaign puts the onus on tech companies, whose platforms serve as hosts for this media. “We want technology companies to recognize their responsibility,” noted Kennedy. “Profit should not come before the protection and well-being of children.”

 

Source:
Campaign US