Artificial Intelligence and Sexual and Domestic Violence

Dear Friends of SafeHouse,


As we move further into a world shaped by rapid technological change, many of us are asking the same question: What does the rise of artificial intelligence mean for survivor safety? AI has already become a part of everyday life for many people. And increasingly, it has become part of how people seek help, information, and connection. For those experiencing sexual and domestic violence, this shift brings new opportunities—and new risks.


24/7, Anonymous Support, No App Required


On the hopeful side, AI has opened doors for safer, more discreet access to support. Confidential, expert-reviewed chatbots like Sophia and the Hope Chat AI tool from DomesticShelters.org allow individuals to seek information anonymously, without the digital footprint that traditional apps or browsing may create. For someone being monitored by an abusive partner, the ability to quietly ask questions, explore safety planning, or document abuse without downloading an app can be life-changing. Access matters. And meeting survivors where they are, day or night, anywhere in the world, can save lives.


The Risk: When Generative AI Becomes Dangerous


But as with all technology, the benefits come alongside real concerns. Recent reports show that generative AI (like ChatGPT) can sometimes mirror a user’s emotional state in unhealthy ways. When someone is isolated, distressed, or struggling with violent thoughts, certain chatbots may unintentionally reinforce those thoughts instead of de-escalating them. This is not because the technology is malicious, but because it lacks the human discernment, grounding, and accountability that crisis support requires.

Read More: “When AI Chatbots Encourage Violence” from Psychology Today

Read More: “AI and Domestic Violence: Boon, Bane – Or Both?” from Forbes


Deepfakes: A Growing Threat


Another growing concern is the rise of AI-generated deepfakes. Deepfakes use artificial intelligence to place a person’s face onto an existing pornographic image or video, or to generate entirely new sexual images from just a few photos. We’re also seeing a disturbing trend involving “nudify apps,” which digitally strip clothing from people in photos and which minors have used to create nude images of classmates.


Deepfakes disproportionately impact women and girls across demographics, and survivors of domestic and sexual violence face unique and heightened risks. Abusive partners are beginning to misuse this technology to create fake images or videos designed to shame, threaten, or control their partners. These deepfakes can be convincing enough to damage reputations, threaten employment, or isolate survivors from friends and family. They can also be used as tools of coercion: “if you leave, I’ll send this to everyone you know.” Even when the content is fabricated, the fear and harm it can cause are very real. This new digital form of abuse adds yet another layer to the already complex barriers survivors face when trying to stay safe.

Read More: “Deepfakes and Domestic Violence: Perpetrating Intimate Partner Abuse Using Video Technology” from Montreal AI Ethics Institute


Weaponizing Evidence


That’s not all. For years, survivors have relied on digital evidence (screenshots, messages, video recordings, photographs) to document abuse. Deepfakes complicate this in troubling ways. Abusers can fabricate texts or videos to discredit a survivor, or claim that real evidence is faked. Courts, police, and workplaces may struggle to determine authenticity. Survivors, who already contend with widespread disbelief, may fear they won’t be believed and hesitate to come forward. When anything can be faked, perpetrators gain a powerful new tool for coercive control, and survivors face new barriers to justice.

Read More: “How AI and Deepfakes Can Impact Domestic Violence Cases” from New Jersey Lawyer


Deepfakes and the Law


Some US states have taken steps to regulate non-consensual deepfake images (Alabama is one of them), but no federal law exists. Revenge-porn laws often don’t apply because the depicted nudity “isn’t real.” Creating a deepfake is sometimes not illegal unless the image is distributed, and inconsistent statutory language leaves many victims unprotected. Until federal legislation addresses the creation, possession, and distribution of deepfake sexual images, survivors will remain vulnerable.

Read More: “Deepfakes Are Spreading – Can The Law Keep Up?” from Forbes


Where Do We Go From Here?


AI isn’t going away, and neither are the harms. But with awareness, policy change, and survivor-centered safety planning, we can reduce risk and ensure that technology helps more than it hurts.


For Community Members:

  1. Verify information before sharing
  2. Practice recognizing deepfakes
  3. Report deepfake images when encountered
  4. Support legislation that protects survivors from digital abuse

For Survivors

If you’re worried that someone may be misusing AI or technology to harm, monitor, or intimidate you, you are not alone—and what you’re experiencing is real and valid. Technology-facilitated abuse is abuse. Consider these options:

  1. Create a digital safety plan
  2. Check out this Technology Safety & Privacy Toolkit for Survivors
  3. Report to local law enforcement and/or the FBI
  4. Use StopNCII.org to have your intimate images removed from participating websites
  5. Contact the CCRI Crisis Helpline, which helps victims of nonconsensual pornography or “revenge porn”
  6. Document abuse safely
  7. Access shelter, advocacy, and support services. For survivors in Clay, Coosa, Chilton, and Shelby counties in Alabama, reach out to SafeHouse.

For Minors or Those Depicted in Images/Videos as Minors

  1. Report child sexual exploitation to the CyberTipline
  2. Take It Down: a service to remove nude or sexually explicit photos taken before you were 18


A Safer Future Requires All of Us


AI can be an incredible tool for support, safety planning, and expanding access. But it can also be weaponized in ways our communities are not yet prepared for. As AI grows, our response must grow with it: stronger laws, more digital literacy, better platform accountability, and survivor-centered safety strategies.


SafeHouse remains committed to staying informed and advocating for protections that reflect the realities survivors face in an AI-powered world. Thank you for standing with us as we continue to adapt and to ensure that every survivor has access to support that is not only innovative, but trustworthy. And if you would like help navigating technology-facilitated abuse, or if you have concerns about your safety, please reach out. We’re here. Always.


With Gratitude,

Janelle Sierra

Executive Director

SafeHouse