Refuge's Tech Safety Newsletter August 2025
Uncovering Intimate Image Abuse – Where Do We Go From Here?
This month’s edition focuses on a trend that is frighteningly prevalent within our team’s work and in society at large. Intimate Image Abuse (IIA) has been an insidious form of abuse for many years, but advances in technology are producing new iterations of tech-facilitated abuse. IIA refers to sharing, or threatening to share, sexually intimate images or videos of another person without their consent, often with the aim of causing them distress or harm. Sharing an intimate image in this way is a criminal offence under section 33 of the Criminal Justice and Courts Act 2015.
Following Refuge’s successful Naked Threat Campaign, launched in 2020, the Domestic Abuse Act 2021 also made it a criminal offence to threaten to share an intimate image. While caseloads across specialist services continue to grow in response to IIA, notable underreporting persists, so the true scale of the problem remains unknown (Mortimer, Phippen, and Wright, 2025).
Advances in AI-generated imagery also mean that deepfakes constitute a newer threat to those experiencing tech-facilitated abuse. A deepfake is a video or image in which a person’s face or body has been digitally altered so that they appear to be someone else. Much of the discourse around this issue has focused on the threat to democracy from the spread of misinformation. However, a 2023 report by Security Hero found that sexual content makes up 98% of all deepfake videos online, whilst 99% of the individuals targeted in these videos are women. These statistics demonstrate how vital it is for domestic abuse support services to lend our voices to this debate, as deepfake abuse within the context of domestic abuse is expected to proliferate further in the coming years unless we take significant action.
It is also evident that marginalised communities and individuals are often more vulnerable to the harms of IIA. For instance, research by Glitch (2024) and other organisations starkly illustrates that Black women are disproportionately harmed by online abuse at large, whilst LGBTIQ+ survivors of intimate image abuse can face increased risks because of the threat of being ‘outed’. That is why our team advocates for a more expansive definition of what constitutes an ‘intimate’ image, one which factors in a survivor’s individual identity and experiences. For example, an image of a woman without her hijab posted online may not traditionally be considered an ‘intimate’ image, but it could very much represent one for that individual and could lead to an increased risk of harm.
The negative impacts of IIA cannot be overstated: it can lead to severe emotional distress, economic exploitation, and even self-harm and suicide. Survivors within our service often report feeling isolated, distressed and crippled by fear when they have been subjected to IIA. The rise in ‘sextortion’, which is often targeted at children and young people, also demonstrates this impact. Sextortion is the shortened term for financially motivated sexual extortion, often based on an abuser possessing intimate images of a victim. In just one of the tragic cases that illustrate this, Murray Dowey, a 16-year-old boy from Scotland, took his own life in December 2023 after being subjected to sextortion tactics. IIA is an abhorrent form of abuse which we must act now to tackle, so that more victims are not subjected to the wide array of harms it causes.
So how exactly should we seek to tackle IIA and its impact?
• Encourage a focus on education-based interventions, particularly within school settings.
• Increase awareness of IIA and work to deconstruct all victim-blaming narratives and stigmas which continue to act as barriers to accessing support.
• Utilise legal mechanisms and advocate for improved responses to IIA reports by police and other professionals.
• Upskill professionals to respond with sensitivity and specialist insight to IIA.
• Hold social media companies to account when they fail to tackle IIA perpetrated on their platforms.
• Signpost to specialist services and tools such as the Revenge Porn Helpline, StopNCII.org, and the Internet Watch Foundation’s Report Remove tool.
• Apply an intersectional lens to IIA, and focus on restorative forms of justice alongside legal mechanisms.
• Invest in specialist services – this should also include a focus on vital by-and-for services.
Intimate image abuse is a pervasive and harmful issue in our society, and it is only by working together with the above goals in mind that we can tackle it.
Training Opportunities with Refuge
We are launching our brand-new training 'Power and Control in the Age of AI' on Wednesday 27th August.
Online: 10am - 12pm - 1 hour 45 minutes of content with a 15-minute Q&A
This session will take a deep dive into all things AI, supporting you to understand the risks to survivors and children, best practice and how AI is reshaping the VAWG landscape.
There are free tickets available for charity professionals and £10 tickets for statutory and for-profit organisations. Please select the option that works best for your organisation.
Access Half Day (3.5 Hour) CPD Sessions
We are also launching our second round of online half-day open-access training sessions. All sessions are CPD accredited and provide a deeper insight into key aspects of technology-facilitated abuse and how to support survivors and their children.
Introduction to Technology Facilitated Abuse:
Tuesday 7th October 9.30am-1pm Eventbrite Link
Image Based Abuse:
Thursday 9th October 9.30am - 1pm Eventbrite Link
Stalking and Harassment:
Tuesday 28th October 9.30am - 1pm Eventbrite Link
Identifying and Responding to Economic Abuse:
Wednesday 5th November 9.30am - 1pm Eventbrite Link