Refuge's Tech Safety Newsletter September 2025
Deaf Awareness Month: Shining a Light on Hidden Abuse in partnership with SignHealth

This month, we mark Deaf Awareness Month: a chance to better understand, and remind ourselves of, the challenges faced by Deaf survivors, especially when it comes to safety in a fast-evolving digital world.
In 2023, the Technology-Facilitated Abuse and Economic Empowerment team led the two-year national Home Office-funded Tech Partnership project. The team delivered training and workshop sessions, facilitated consultation and engagement meetings with led-by-and-for partner agencies, and made the service’s resources accessible for survivors who face additional barriers. The scheme reached over 45 led-by-and-for specialist organisations, upskilling over 700 professionals. The project created stronger inter-agency cooperation and influenced national policy.
On that note, we are very pleased to have Vicki Trapps, Service Manager at SignHealth, as our special guest writer, who explores the important and often overlooked experiences of the Deaf community around technology-facilitated abuse. Vicki’s insights highlight the ways technology can both empower and harm, and why accessibility matters more than ever.
The experience of technology-facilitated abuse can be even more complex and isolating for survivors who are Deaf. As awareness of technology-facilitated and economic abuse grows, it is crucial that we also understand how these forms of abuse uniquely impact Deaf people.
Deaf people can often live in increased isolation, especially when key services, support networks, and information are not accessible in British Sign Language (BSL) or other visual formats. For a Deaf person experiencing abuse, the isolation can be used against them. They may rely on their partner or others to interpret calls, access services, or even attend medical appointments. It gives perpetrators more control and reduces opportunities to seek help.
When professionals such as police officers, healthcare workers, or social workers rely on family members or the perpetrator to interpret for a victim/survivor, it can lead to serious risks. The perpetrator or family member may control what is said and how it is translated, filtering or distorting information to hide abuse or manipulate the situation. In some cases, the perpetrator may lie about what the victim says and speak on their behalf instead of translating their actual words, convincing professionals that everything is fine when it is not. As a result, the victim/survivor may be too afraid to speak openly, especially if the abuser is present. They may feel trapped and silenced, unable to ask for help, which can lead to further isolation or punishment after the interaction.
Technology is often seen as a tool for connection, but for some survivors, it becomes a tool of control used by perpetrators. For Deaf people, technology-facilitated abuse may involve tampering with visual communication tools such as video relay services or captioned calls, monitoring conversations through hacked devices or shared apps, or withholding access to phones, tablets, or Wi-Fi, which cuts off essential communication.
Technology is a lifeline for many Deaf people, and losing access can mean losing connection to the outside world. Deaf people already face barriers in employment and education, often due to a lack of accessible opportunities or interpretation and translation support. Perpetrators may exploit this vulnerability by controlling finances or withholding disability benefits, blocking access to work or training to create financial dependence, or damaging or withholding tools needed for employment, such as phones or laptops.
Without financial independence, it becomes significantly harder for survivors who are Deaf to leave abusive situations or rebuild their lives. They are also navigating abuse that others cannot see or hear. By raising awareness of how technology-facilitated and economic abuse specifically affect Deaf people, we can begin to dismantle barriers and build a safer, more inclusive support system for all.
To do this, we must make our systems more inclusive. This includes funding Deaf-aware support services and ensuring websites and helplines are accessible in BSL. It also means training mainstream services to understand the specific risks and barriers Deaf people face, and raising awareness within the Deaf community of accessible educational resources on healthy relationships and digital safety. It is vital never to use family members or perpetrators as interpreters, even if they offer their assistance, and to always ensure that the survivor has the opportunity to speak privately and safely.
As Deaf Awareness Month reminds us, inclusion means more than just recognising differences; it means actively working to remove barriers and protect everyone’s right to communicate safely. Technology can be a powerful tool, but only when it is used responsibly and with everyone’s needs in mind. We hope this month’s article sparks reflection and conversation.
A big thank you to Vicki for sharing her expertise and helping us all better understand the distinctive experiences of the Deaf community in this digital age.
How to contact SignHealth Domestic Abuse Services
Text ONLY: 07800 003421
Email: da@signhealth.org.uk
Groundbreaking lawsuit against Google over image-based sexual abuse
In a disturbing story of intimate image-based abuse, a German woman has filed a lawsuit against Google after private, intimate photos and videos of her were leaked from her personal cloud, along with her identity documents. Not only were these private materials exposed, but they also appeared in Google image searches under her name, despite numerous takedown requests. Though Google has de-listed specific URLs, the content has been uploaded repeatedly on pornographic sites, highlighting how search engines can unintentionally spread abuse rather than contain it.
This case illustrates how intimate image abuse disproportionately and significantly impacts women. The victim was forced to change jobs and relocate. She will also experience the ongoing psychological effects of stress and stigma.
HateAid, a German nonprofit helping victims of cybercrime, is covering her legal costs and emphasises the need for legal clarity. Should search engines be required by law to permanently remove content even when it resurfaces?
From a VAWG perspective, this is not just a privacy breach; it is a gendered attack. It weaponises technology to bypass consent. Search engines become amplifiers of a survivor’s trauma, especially when content is manipulated using AI and deepfake technologies, which can make it harder to locate. We need to ensure tech firms are responsible for mitigating harm, and that the law reflects this.
The outcome of this case could set a legal precedent. If courts confirm that platforms like Google must act proactively, even against repeated re-uploads, it could mark a huge shift in tech safety, protecting survivors from ongoing digital abuse and affirming that technologies should preserve dignity, not destroy it.
From Lifesaving to Life-Threatening: The Hidden Dangers of Assistive Tech in Domestic Abuse
Evidence shows that technology is now routine in cases of domestic abuse. Whilst technology can be a lifeline for many, improving independence and convenience and removing barriers to support, it is also increasingly misused to cause harm. The Technology-Facilitated Abuse and Economic Empowerment team’s recent report found that cases related to technology abuse have risen by 58% since 2018, with new methods including Wi-Fi hacking, GPS tracking, and the misuse of AI. SafeLives reported that Disabled survivors typically experience domestic abuse for a longer period before accessing support (3.3 years’ average length of abuse vs 2.3 years), and that Disabled women are twice as likely to experience domestic abuse as non-disabled women.
Assistive and medical technologies introduce additional risks. Many hearing aids, fall-detectors and telecare systems can be configured via companion apps or online portals. If a perpetrator controls the account, they can disable alerts or listen through home hubs connected to pendants. The UK government recently warned that around 2 million people rely on telecare alarms, highlighting how widespread, safety-critical and potentially abusable these systems are if account access is compromised.
Some telecare or GPS-enabled devices, such as pendants, trackers, or smartwatches, can be set up to notify a carer if the person leaves a safe zone (so-called “wandering alerts”). These are usually designed for people with dementia, learning disabilities, or other conditions where safety monitoring is needed. A perpetrator with access to the account or app could send false alerts to create distress, make the survivor seem unsafe, or justify restricting their movement. They could also suppress real alerts, meaning that if the person genuinely needed help, no one would know. It is essential that all front-line staff are trained to spot the signs of tech-facilitated abuse.
Improving accessibility should also mean tech-safety by design:
• Access to support when devices are compromised. Pharmacies and banks with trained staff, free webchat from library PCs, and BSL access can all provide additional options for survivors when personal devices are compromised.
• Trauma-informed tech checks. Frontline services should ask who controls logins to hearing-aid apps, telecare dashboards, smart locks, and similar tools, with support to reset account credentials and remove shared access as part of safety planning (where this is deemed safe to do).
• Safety-by-design. Prioritising the risks of tech-facilitated abuse and involving tech-safety experts and end-users during development.
Accessible domestic abuse support saves lives. But for disabled and chronically ill survivors in particular, safety depends on recognising that assistive and medical devices can double as instruments of coercive control, and on building services, products and protocols that prevent this.
The Technology-Facilitated Abuse and Economic Empowerment team regularly conducts consultations and reviews of apps, devices and other services to highlight any potential risks of tech abuse. We also produce guides and resources on securing accounts and devices. If this is of interest, please get in touch at techabuse@refuge.org.uk.