Artificial intelligence (AI) may already be so out of control that it poses a growing threat to women’s safety. Non-consensual deepfake porn, like the images of Taylor Swift that made headlines, is only part of the story. Sexual exploitation and the normalization of violent imagery are on the rise, with AI opening up new ways to exploit and harass women.
My work as the second Black President of the National Organization for Women (NOW), the country’s largest intersectional women’s rights organization, as well as my personal background as a licensed social worker and mental health professional, makes it clear to me what must be done to stop the mental anguish and long-term damage happening online.
Congress and state legislatures are responding–but the questions remain: Are they acting fast enough? And how enforceable will their actions be?
There’s a new kind of thievery preying on women today–the theft of our bodily autonomy. When it happens to people in the spotlight, it becomes a media sensation, but 99.9% of the victims of deepfakes and online sexual abuse can’t fight back the way a superstar can or wait for the damage to die down. Fake imagery causes real harm to reputations and self-esteem and shatters privacy.
Women are the first to be exploited, attacked, and abused online in the most invasive ways possible–and with AI, the threat changes every day. That’s why it’s so vital that we push lawmakers both in Washington, D.C. and the states to protect women from the dark side of AI.
The women-led dating and social networking app Bumble surveyed its community and found that one in three women has been sent unsolicited lewd images, and a 2021 study by the Pew Research Center similarly found that 33% of women under 35 reported being sexually harassed online.
There are no federal laws that make it illegal to create or distribute deepfake pornography, although legislation to address this danger has been introduced. The Taylor Swift AI-generated images prompted a new wave of proposals and calls for action–but none have become law so far.
Congressman Joseph Morelle (D-NY-25), along with 40 co-sponsors, has introduced the Preventing Deepfakes of Intimate Images Act, which would make the nonconsensual sharing of altered or deepfake intimate images online illegal, allow wrongdoers to be prosecuted, and make them pay.
In the U.S. Senate, a bipartisan bill called the DEFIANCE Act would be the first federal law to prevent nonconsensual deepfake pornography. Introduced by Senate Majority Whip Dick Durbin (D-IL), along with Sens. Lindsey Graham (R-SC), Amy Klobuchar (D-MN), and Josh Hawley (R-MO), it builds on a provision of the Violence Against Women Act’s most recent reauthorization to prevent and prosecute cybercrimes.
We must build public support for laws like these–and for state legislation as well.
According to Axios, nearly every state legislature in session this year has taken up AI-related legislation, and nearly half of those bills deal with deepfakes. As of early February, there were 407 bills relating to AI, up from 67 one year earlier.
At least 10 states have passed deepfake-related laws, including Georgia, Hawaii, Texas and Virginia. California and Illinois have given victims the right to sue.
The conversations we’re having today among elected officials, government agencies, tech companies, and consumers will determine the impact AI has on people and society for generations to come. It took over a decade for government to understand the changes brought by social media platforms, but we can’t wait another 10 or even five years to take the next steps.
We need to update the rules and strengthen the online guardrails that were created when AI was a science-fiction movie prediction. Now that it’s here, AI is fueling a culture of toxic masculinity, misogyny, and abuse. We may not have long to seize this moment–and protect women from online abuse.
Christian F. Nunes, MBA, MS, LCSW, is the president of the National Organization for Women.
The opinions expressed in Fortune.com commentary pieces are solely the views of their authors and do not necessarily reflect the opinions and beliefs of Fortune.
