Domestic Abusers Exploiting Smart Technology to Attack and Control Women
Women's safety organisations are issuing urgent calls for technology developers to prioritise women's safety in their design processes, following alarming new data from the domestic abuse charity Refuge. The charity has revealed that perpetrators are increasingly weaponising artificial intelligence, wearable devices and smart home technology to attack and control their victims.
Sharp Increase in Technology-Facilitated Abuse Cases
Refuge's specialist services saw record numbers of women referred for technology-facilitated abuse during the final quarter of 2025. The most complex cases surged by 62% to a total of 829 women, while referrals of women under thirty rose by 24%. The figures point to a disturbing trend of abusers exploiting digital tools to extend coercive control beyond physical spaces.
Emma Pickering, who leads Refuge's tech-facilitated abuse team, emphasised the severity of the situation. "Time and again, we see what happens when devices reach the market without proper consideration of how they might be used to harm women and girls," she stated. "It is currently far too easy for perpetrators to access and weaponise smart accessories, and our frontline teams are witnessing the devastating consequences of this abuse."
How Technology Is Being Weaponised
Recent cases documented by Refuge demonstrate the varied methods abusers employ:
- Using wearable technology including smartwatches, Oura rings and Fitbits to track and stalk women
- Remotely manipulating smart home devices, such as lighting and heating systems, to disrupt victims' lives
- Employing AI spoofing applications to impersonate individuals and manipulate situations
One survivor, Mina, experienced this abuse firsthand when she fled her abuser with her smartwatch still on her wrist. He used the linked cloud accounts to track her to her emergency accommodation. "It was deeply shocking and frightening," Mina recalled. "I felt suddenly exposed and unsafe, knowing my location was being tracked without consent. It created constant paranoia; I couldn't relax, sleep properly, or feel settled anywhere because I knew my movements weren't private."
Systemic Failures and Legal Gaps
Even after police returned the device to Mina, her abuser hired a private investigator, who located her at her next refuge through suspected technological tracking. When she reported these breaches, police told her no crime had been committed because she had "not come to any harm."
"I was repeatedly asked to move for my safety, rather than the technology being dealt with directly or the smart watch being confiscated from him," Mina explained. "Each move made me feel more unstable and displaced. The experience left me feeling unsafe, unheard, and responsible for managing a situation completely out of my control."
Emerging Threats from Artificial Intelligence
Pickering warned that abusers are increasingly turning to AI tools to manipulate survivors. Examples include altering videos to make survivors appear intoxicated, enabling perpetrators to falsely claim to social services that women have drinking problems or are unfit mothers. "We'll see more and more of that as these videos and applications advance," Pickering cautioned.
She also described cases where AI is used to generate authentic-looking fraudulent documents, such as job offers or legal summonses, sent to survivors to convince them they are in debt or to lure them to locations where their abusers lie in wait. Looking ahead, Pickering expressed concern that medical technology could be misused, for instance by interfering with insulin levels through diabetes trackers, with potentially fatal consequences.
Calls for Regulatory Action and Industry Accountability
Pickering urged the government to address digital technology-enabled crimes more effectively, including by increasing funding to develop and train digital investigation teams. "They want short-term wins, they don't want to think about longer-term investment in this area, but if we don't do that we'll never get ahead," she argued.
She further called for the technology industry to be held accountable for failing to ensure devices and platforms are designed and function safely for vulnerable users. "Ofcom and the Online Safety Act don't go far enough," Pickering asserted.
A government spokesperson responded: "Tackling violence against women and girls in all its forms, including when it takes place online or is facilitated by technology, is a top priority for this government. Our new VAWG strategy sets out how the full power of the state will be deployed online and offline. We are working with Ofcom to set out how online platforms tackle the disproportionate abuse women and girls face online."
Refuge maintains that women's safety must become a foundational principle shaping both wearable technology design and the regulatory frameworks governing it, rather than being treated as an afterthought once technologies have been developed and distributed.