Google and X lag peers in addressing non-consensual explicit images, lawmakers say


WASHINGTON — Google, X and Discord are among a group of tech giants that could be doing more to address the rising threat of non-consensual explicit images online, according to a letter that a group of senators sent to the companies Friday.

The letter criticizes nearly a dozen tech firms for their lack of participation in two programs that make it easier for people to request the removal of non-consensual explicit images and videos from the internet.

The programs are voluntary, but they already count other internet giants, such as Meta, Snap, TikTok and PornHub as participants. And the letter comes as both lawmakers and tech leaders face pressure to do more to combat non-consensual sexual images, sometimes known as revenge porn, especially as artificial intelligence makes it easier to create and spread such content.

This year alone, women and girls around the world, from pop star Taylor Swift to high school students, have been targeted with AI-generated pornographic images. And while nine US states currently have laws against the creation or sharing of non-consensual deepfake images, no such law exists at the federal level — limiting the options for victims of this form of harassment who wish to seek help or accountability.

Friday’s letter, shared exclusively with CNN, is addressed to the chief executives of 11 tech companies: X, Google parent company Alphabet, Amazon, Match, Zoom, Pinterest, Discord, OpenAI, Twitch, Microsoft and Patreon.

It urges them to join the National Center for Missing and Exploited Children’s “Take It Down” program, which helps people remove nude or sexually explicit images or videos of children from online platforms, as well as the Revenge Porn Helpline’s “StopNCII” initiative, which helps adults remove explicit images that were shared online without their consent. Both programs let users create a unique numerical code for an image they want removed, which participating platforms can then use to search their sites and take down the image.

“By increasing participation in these programs, companies can take actionable steps to stop the life-altering impact that the (non-consensual intimate imagery) has on the life, career and family of those affected,” the letter states. The letter was spearheaded by Democratic Sen. Jeanne Shaheen and Republican Sen. Rick Scott, and co-signed by eight other senators.

Most of the companies named in the letter have policies against the creation or sharing of non-consensual, explicit images, and in some cases offer their own ways for users to report or request the removal of such content. Google also recently announced changes that aim to keep such content from appearing near the top of search results.

But the benefit of joining the programs is that users need to submit only one removal request, which is directed to all participating platforms, rather than having to contact each company one by one.

The fight to address non-consensual explicit images and deepfakes has received rare bipartisan support. A group of teens and parents who had been affected by AI-generated porn testified at a hearing on Capitol Hill, where Republican Sen. Ted Cruz introduced a bill — supported by Democratic Sen. Amy Klobuchar and others — that would make it a crime to publish such images and require social media platforms to remove them upon notice from victims.
