Archives

“United to End Sextortion”: Launching a New Animation for the 16 Days of Activism

By Dr Nadia Aghtaie, Associate Professor in Criminology, University of Bristol

Sextortion is one of the fastest-growing – and least understood – forms of abuse facing young people today. Yet it is still rarely discussed in schools, families or policy debates.

Our new animation, launched as part of the UN 16 Days of Activism against Gender-Based Violence, aims to start those conversations in a way that is accessible, sensitive and grounded in research.

What do we mean by “sextortion”?

We use sextortion to describe situations where someone threatens to share explicit, intimate or embarrassing sexual images without consent in order to force a person to do something – often to send more images, carry out sexual acts, hand over money, or provide other favours (see Ray & Henry 2024; Wolak et al. 2017).

There is still no consistent terminology. Different organisations talk about “image-based sexual abuse”, “online blackmail”, “sexual extortion”, or “sexual corruption”. This lack of shared language makes it harder for young people, parents, teachers and professionals to recognise what is happening and to know where to turn for help.

Why a sextortion animation for young people?

Sextortion doesn’t affect all young people in the same way. Research by the NSPCC points to gendered patterns in both who is targeted and what is demanded: boys are often targeted by organised cybercrime gangs demanding money, while girls are more likely to face pressure from people they know – peers, partners or ex-partners – to share more nudes or agree to unwanted sexual acts. Whatever the context, sextortion can be devastating, combining sexual abuse, psychological control, financial exploitation and, for many, intense shame and fear.

These are also experiences that are very hard to talk about. Shame, fear of being blamed, worries about family reactions or community honour, and concerns around immigration status can all create silence. Animation gives us a different way in: it lets us tell a story that feels recognisable without identifying any individual, show clearly that victims are not to blame, signpost routes to support, and open up honest but non-graphic conversations in classrooms, youth groups and families. By launching this animation during the 16 Days of Activism, we place sextortion firmly within global efforts to end violence against women and girls, while recognising that boys and gender-diverse young people are affected too.

From Iran-focused campaign to a wider audience

The initial animations (Coercive Control, Economic Abuse, Active Bystander and Technology Facilitated Abuse) grew out of a wider project on violence against women and girls (VAWG) that began with a focus on Iran. The aim was to design a research-based, evidence-informed animated campaign, United to End Violence Against Women and Girls, to raise awareness of different forms of VAWG and to support intergenerational conversations that might help reduce violence over time.

Although the early work centred on Iranian contexts and the Middle East and North Africa (MENA) region, our previous studies, together with conversations across NGO networks and a school in the UK, quickly showed that these issues are relevant beyond national borders.

Collaboration, creativity and cultural sensitivity

The animations have been funded by the VISION Consortium and a Bristol ESRC Accelerator Award, and shaped by a multidisciplinary team including academic colleagues from City St George’s, University of London (Dr Ladan Hashemi and Professor Sally McManus); the University of Bristol (Associate Professor Nadia Aghtaie); Leeds Beckett University (Dorreh Khatibi-Hill); Goldsmiths (Dr Atlas Torbati); and the University for the Creative Arts (Professor Birgitta Hosea). We have worked closely with two NGOs as advisory partners and an animation team:

  • An Iran-based NGO working with women from diverse backgrounds and running specialist programmes for survivors of gender-based violence. We do not name the organisation here for security reasons, as this could limit how widely the animations can be shared.
  • IKWRO, a London-based charity supporting women and girls from Middle Eastern and North African communities who are facing “honour”-based abuse, forced marriage and other forms of VAWG.
  • “Resilient Anonymous Creators”: An animation team based in Iran. For security reasons, we cannot name them publicly. The name “resilient anonymous creators” is a reflection of both their creativity and the structural barriers they are forced to navigate. We are deeply grateful for their courage, commitment and artistry; this work is only possible because of them.

Ultimately, our goal is simple: to give young people, and the adults who support them, a starting point – a shared language, a shared story and a shared commitment to challenging sextortion and other forms of digital abuse wherever they occur.

Link to Women’s Research Hub YouTube Channel 

Link to Women’s Research Hub Instagram Account  

For further information, please contact Ladan at ladan.hashemi@citystgeorges.ac.uk

Webinar: Using animation to campaign against VAWG

As part of VISION’s campaign to support the 2025 United Nations’ 16 Days of Activism Against Gender-Based Violence, “End Digital Violence Against Women and Girls”, we invite you to a lunchtime webinar on Monday, 8 December.

We will showcase a series of animations created to raise awareness about digital violence, including technology-facilitated abuse and sextortion (image-based abuse).

The webinar will explore the research behind the animations and how animation can be used as a creative, accessible tool to engage audiences, share lived experiences, and promote safer digital spaces for all.

Feel free to bring your lunch and join us!

8 December 2025, 12:00 – 13:00, online 

To register for the event and receive the Teams link, please email: VISION_Management_Team@city.ac.uk

For further information on the animation project, funded by VISION, City St George’s, University of London and the University of Bristol, please see the VISION blog: United to End Violence Against Women and Girls: An Online Animated Campaign

 

What is tech abuse and how can we tackle it?

Drs Leonie Tanczer and Madeleine Janickyj of the University College London (UCL) Gender and Tech Research Lab and the VISION research consortium developed a policy briefing, What is tech abuse and how can we tackle it?, with colleagues at the Gender and Tech Research Lab and the UCL Policy Impact Unit.

Technology-facilitated abuse (tech abuse) refers to the deliberate (mis)use or repurposing of digital systems to coerce, harass, or abuse others. While it is most commonly associated with domestic abuse and stalking, it also occurs in professional and institutional contexts and can be perpetrated by strangers.

It is a widespread problem: in the UK, 1.4 million women experienced domestic abuse in 2023-24. In abusive intimate relationships, tech abuse can extend and intensify existing patterns of coercive control, leading to greater levels of harm. Abusers may, for example, send persistent, obscene, or threatening digital communications or track a partner’s movements via GPS or app surveillance. They may also restrict access to accounts, services, or finances.

Despite broad recognition of tech abuse across sectors and stakeholders, there is still no consensus on its precise definition and scope. This definitional ambiguity hinders efforts to measure its prevalence and impact, ultimately limiting the effectiveness of prevention and intervention strategies.

Recommendations

Tackling tech abuse requires a whole systems approach and better measurement. Other recommendations include:

  • Enforce safety-by-design principles and mandatory abusability testing for technology products to proactively address potential misuse
  • Deepen understanding of perpetrator behaviour and motivations to inform prevention and intervention strategies
  • Leverage innovative methods, such as machine learning, to better understand and respond to tech abuse
  • Improve coordinated responses from police, frontline domestic abuse services, tech companies, and government/international bodies, backed by sufficient and sustainable funding
  • Future-proof policies and regulations, clarify responsibility, and determine accountability across different stakeholders
  • Stop the normalisation of tech abuse to support more victims/survivors to seek help, including through honest conversations around digital consent

To download: What is Tech Abuse and how can we tackle it?

To cite: Janickyj, M., Koukopoulos, N., Polamarasetty, A., Reed, J., & Tanczer, L. M. (2025). Policy Brief: What is Tech Abuse and how can we tackle it? Gender and Tech Research Lab, University College London.

For further information, please contact Maddy at m.janickyj@ucl.ac.uk

Illustration from Adobe Stock subscription

Upcoming event: Weaving Stories of Peer Sexual Abuse 

This event is in the past.

Insights from a youth co-created animation project

Weaving Stories is a pilot animation project developed by the County Durham arts education company Changing Relations and funded via the VISION research consortium through the Small Projects Fund.

The animation was co-produced with secondary-aged students, survivors of peer sexual abuse, and an artistic team, to amplify young people’s voices on the theme of unwanted sexual behaviour and the culture that enables it. The students and young survivors shaped every aspect of the animation.

An interdisciplinary Steering Group of academic researchers, creative practitioners, and child protection and sexual violence specialists from a North East school and Rape Crisis centre was also involved in the project.

With this animation and an associated school-based learning programme, Changing Relations seeks to influence knowledge, behaviour, and institutional change, using the animation as a stimulus for reflection. Following this pilot project, VISION and Changing Relations have organised a one-hour webinar for UK policymakers and practitioners to:

  • Watch the co-created animation (20 minutes)
  • Hear young people’s perspectives on the key themes and co-production approach
  • Explore the animation’s potential impact on school cultures, disclosure, help-seeking, and victim-blaming attitudes
  • Engage in academic-informed analysis of trauma-informed safeguarding and youth-centred approaches to sexual violence prevention
  • Gain practical insights on using creative participatory approaches to engage young people in conversations about violence and abuse
  • Consider actionable recommendations for policy and practice
  • Contribute your reflections

This webinar will be of interest to a wide range of professionals who work with adolescents and/or in violence prevention. Educators, social workers, academics, and third sector, central and local government policy analysts and researchers may be particularly interested.

There are two dates, providing the option to choose between a more practice-oriented or policy-oriented session:

  • Thursday 8th May 1-2pm for policymakers
  • Wednesday 14th May 3-4pm for practitioners

Speakers and facilitators

  • Lisa Davis, Managing Director, Changing Relations
  • Kate Gorman, Creative Producer and Artistic Director, Changing Relations
  • Kimberly Cullen, Knowledge Exchange Manager, UKPRP VISION research consortium, City St George’s UoL

Webinar registration

To register for free for either the 8th or 14th of May, please visit our page on Ticket Tailor.

The webinar will be on Microsoft Teams and you will receive the link on the day you choose to attend.

For further information, please contact VISION_Management_Team@citystgeorges.ac.uk

United to End Violence Against Women and Girls: An Online Animated Campaign  

Violence against women and girls (VAWG) is a pressing issue in Iran, a Middle Eastern country marked by its patriarchal structure and systematic and pervasive gender discrimination. Educational programmes addressing this issue are scarce, and cultural barriers often hinder open discussion. The United to End Violence Against Women and Girls campaign aims to break this silence through a series of animated videos in Farsi and English and images designed to inform public discourse and to empower victims to seek support.

The United to End Violence Against Women and Girls project was led by VISION researchers Ladan Hashemi and Sally McManus, in collaboration with colleagues from other UK universities, including the University of Bristol, Goldsmiths, the Animation Research Centre at the University for the Creative Arts, and Leeds Beckett University.

They worked with an animation production team in Iran, a social media advisor, and two advisory groups that helped incorporate culturally specific insights: Mehre Shams Afarid, an Iran-based non-governmental organisation (NGO), and IKWRO, a London-based charity providing services to women victims of violence from the Middle East and North Africa (MENA) region.

Although the project initially focused on Iran, engaging with the UK-based NGO revealed an interest in extending its reach. As a result, English subtitles were added to make the animations accessible to a wider audience. This collaboration helped the content resonate with audiences both in Iran and within the global diaspora community, particularly those from the MENA region.

The animations are grounded in evidence from a survey of 453 women in Iran, which explored how various forms of VAWG manifest and women’s perspectives on how to eliminate it. The survey was designed by Fatima Babakhani, CEO of Mehre Shams Afarid.

Key findings from participants’ open-ended responses to the survey showed that, despite structural inequalities and deeply ingrained societal, cultural, and religious norms that perpetuate VAWG, change is possible through education and legal reforms.

As one survey participant noted: “Unfortunately, many still don’t understand what violence truly is. Raising awareness is the solution.”

The first four United to End Violence Against Women and Girls campaign animations focus on coercive control, economic abuse, technology-facilitated abuse, and active bystander interventions, with two more animations in development.

With guidance from an Iranian social media advisor, a digital strategy was developed to maximise the campaign’s impact. Instagram was chosen as the primary distribution platform, as it is the most widely used social media platform in Iran, with over 47 million users. The animations are also shared on YouTube to further extend the campaign’s reach.

To amplify the campaign’s reach, we partnered with influencers and women’s rights activists with followings ranging from thousands to millions. The online campaign officially launched on 25 November, the International Day for the Elimination of Violence Against Women and Girls.

By leveraging evidence-based content and strategic partnerships, we hope to spark meaningful conversations and drive change across Iran and the diaspora communities from the MENA region.

Join us in raising awareness and advocating for change. Please follow and share the campaign links on your social media to help spread the message.

Link to Instagram page

Link to YouTube channel

This project was funded by City St George’s, University of London Higher Education Impact Fund (HEIF) Knowledge Exchange and by the UKPRP VISION research consortium.

For further information, please contact Ladan at ladan.hashemi@city.ac.uk

Cybercrime victimisation and the association with age, health and sociodemographic characteristics

By Ben Havers, PhD Candidate at the Dawes Centre for Future Crime, University College London

The UK has an ageing population; the Office for National Statistics (ONS, 2024) has predicted that the number of people aged 85 and over will increase from 1.6 million (2.5% of the total population) to 2.6 million (3.5%) over the next 15 years. Concerningly, a recent Age UK report (2024) revealed that more than one in three over-65s lack the basic skills to use the internet successfully. This suggests that the number of older adults ill-equipped to deal with online threats is set to grow.

This blog describes a recent study conducted by Ben Havers (University College London) and colleagues, including Professor Sally McManus from VISION, exploring how cybercrime victimisation, repeat victimisation and financial impact are associated with age and other sociodemographic and health-related characteristics.

The authors analysed data from the 2019-2020 Crime Survey for England and Wales, an annual national crime victimisation survey carried out by the ONS. The survey is administered via face-to-face interviews with more than 35,000 adults across England and Wales. Participants are asked whether they have been a victim of crime in the past 12 months, and for other personal information on topics such as housing, work and health.

Some of the key findings of the study were:

  • People aged 75+ were more likely to experience repeat cybercrime victimisation and associated financial loss than younger age groups.
  • Men were more likely to experience victimisation and repeat victimisation than women. A plausible explanation is that men, who have been found to take more risks than women generally (Hudgens & Fatkin, 1985), may also engage in riskier behaviour or activities online, leaving them more vulnerable to malicious actors.
  • People of Black and mixed/multiple ethnicity were more likely to be cybercrime victims than participants of White ethnicity. Research on the drivers behind ethnic disparities in crime victimisation in the UK and abroad is limited. Salisbury and Upson’s (2004) crime survey analysis found that people of Black and minority ethnicity are more likely than White people to fall victim to crime in general. Future research might explore differing patterns and types of internet use, and systemic disadvantages, for example linguistic barriers to safe cyber navigation.
  • Worse cognitive, physical, mental and general health were associated with greater risk, across the ages. This relationship is likely to be bidirectional as poor health might increase the risk of cybercrime (Abdelhamid, 2020) and being a victim of cybercrime may worsen mental health (Rhoads, 2023).

The findings from this study indicate that future developments in online platform and process design, as well as multi-agency collaboration and information sharing, should focus on (a) empowering older adults to detect fraudulent activity before loss is incurred, and (b) removing barriers to reporting so that support can be provided before the individual is victimised a second or third time.

To read or download the article for free: Cybercrime victimisation among older adults: A probability sample survey in England and Wales | PLOS ONE

To cite: Havers, B., Tripathi, K., Burton, A., McManus, S., & Cooper, C. (2024). Cybercrime victimisation among older adults: A probability sample survey in England and Wales. PLOS ONE, 19(12), e0314380. https://doi.org/10.1371/journal.pone.0314380

Or for further information, please contact Ben at benjamin.havers.20@ucl.ac.uk

Illustration from Adobe Photo Stock subscription

Reaching a consensus: Technology-facilitated abuse conceptualisation, definition, terminology, and measurement

The rapid development of digital systems has benefited modern societies but also created opportunities for the proliferation of harms. Specifically, the term ‘technology-facilitated abuse’ (TFA) describes the misuse or repurposing of digital systems to harass, coerce, or abuse. It is a global problem involving both existing and emerging technologies.

TFA is regularly discussed in the context of domestic abuse, where it is perpetrated via a range of systems, including phones, laptops, tablets and smart home/Internet of Things appliances, as well as online accounts that are either shared or accessed without the partner’s consent. In the United Kingdom, 32% of women and children who sought support for domestic abuse in 2022 to 2023 reported experiencing tech abuse.

The research field lacks comprehensive and standardised measurement tools. In 2022, the UN Secretary-General emphasised that the absence of agreed definitions and measures impedes efforts to understand the true scale of TFA. Despite significant work across research, policy, and practice to understand the issue, the field operates within linguistic, conceptual, and disciplinary silos, inhibiting collaboration.

To address this, the present study, led by Dr Nikolaos Koukopoulos (University College London) in collaboration with VISION researchers Dr Madeleine Janickyj and Dr Leonie Tanczer, used the Delphi technique to reach a consensus on TFA conceptualisation, definition, terminology, and measurement among subject experts.

Following a literature review, a global, cross-disciplinary sample of academics, practitioners, and policymakers (n = 316) reflected on TFA across three survey rounds. The results showed both aligned and opposing perspectives. “Technology” and “facilitated” were the preferred terms. Still, there was uncertainty regarding the need for additional terminology to denote the scope of abuse, such as gendered descriptors. Participants had little familiarity with existing TFA measurement tools, with two-thirds unaware of any.

Most experts agreed on conceptualising TFA based on the perpetrator’s behaviour, the victim’s harm and impact, and consent. They also supported an expansive TFA definition, beyond intimate relationships, that can involve groups and communities as perpetrators or targets. However, they were more reluctant to perceive TFA as a distinct abuse form, or one guided by social norms, legal thresholds, or involving child perpetrators.

Recommendations:

  • The fragmentation and contrasting conceptualisations of TFA observed in this research underscore the need for greater cross-disciplinary communication among researchers, practitioners, and policymakers to move closer toward a unified understanding of TFA. Some form of standardisation is particularly crucial, given the rapidly developing ways existing and emerging technologies are weaponised in the digital realm. Concrete, practical steps could help bridge these divides, for example by consolidating published work into a searchable database with suggestions for conceptually similar terminology across sectors and subject areas.
  • Furthermore, the research team is now working on an interactive online map of key TFA stakeholders and research groups, which could facilitate greater collaboration and knowledge-sharing.

To download the paper: Defining and Conceptualizing Technology-Facilitated Abuse (“Tech Abuse”): Findings of a Global Delphi Study – Nikolaos Koukopoulos, Madeleine Janickyj, Leonie Maria Tanczer, 2025

To cite the paper: Koukopoulos, N., Janickyj, M., & Tanczer, L. M. (2025). Defining and Conceptualizing Technology-Facilitated Abuse (“Tech Abuse”): Findings of a Global Delphi Study. Journal of Interpersonal Violence, 0(0). https://doi.org/10.1177/08862605241310465

Illustration from Adobe Photo Stock subscription

VISION/VASC Webinar Series: Into the Light Index

This event is in the past.

We are pleased to announce our next webinar for the VISION and Violence & Society Centre (VASC) Webinar Series on Tuesday, 21 January, 11:00 – 11:50.

Deborah Fry, Director of Data at Childlight – Global Child Safety Institute and Professor of International Child Protection Research at University of Edinburgh, will present on the Into the Light Index, published last year on the prevalence of technology-facilitated child sexual exploitation and abuse. She will also discuss some of the measurement challenges in this field and how they are documenting and exploring those challenges.

Professor Fry undertakes primary research to measure the magnitude, drivers and consequences of violence against children; the barriers and enablers to appropriate prevention and response systems, including in school settings; and the effectiveness of existing interventions.

She leads the data division at Childlight – Global Child Safety Institute, which is funded by the Human Dignity Foundation. Childlight takes a data-driven, evidence-based approach to gathering, translating and visualising the prevalence of child sexual exploitation and abuse across the world, with the aim of turning that evidence into sustainable action that safeguards children.

To register for the event in order to receive the Teams invitation, please contact: VISION_Management_Team@city.ac.uk

The purpose of the VISION/VASC webinar series is to provide a platform for academia, government and the voluntary and community sector working to reduce and prevent violence to present their work and research to a wider audience. This is a multidisciplinary platform and we welcome speakers from a variety of fields, such as health, crime, policing, ethnicity, migration, sociology, social work, primary care and front-line services. If you are interested in presenting at a future webinar in the series, please contact: VISION_Management_Team@city.ac.uk

This webinar series is sponsored by the UK Prevention and Research Partnership consortium, Violence, Health and Society (VISION; MR-V049879) and the Violence and Society Centre at City St George’s, University of London.

Tech-facilitated abuse and the ‘new normal’

by Dr Leonie Tanczer, Associate Professor, University College London, and Co-Investigator, UKPRP VISION

The growth of digital technologies in our lives creates new habits, practices, and expectations. We need better public awareness and debate about the “new normal” we are experiencing in a society where the misuse of digital technologies has become widespread. 

I don’t know about you, but there used to be a time when I was excited and thrilled by technology. I remember how ecstatic I was when I got my first mobile – and, later, my first smartphone. How unbelievably neat it felt “browsing” the web to research a school assignment. And how empowering and beneficial I once perceived platforms such as Twitter.  

That’s sadly no longer how I think and feel. And I must confess, I’ve become quite pessimistic.  

You can blame my dreary outlook on living through my 20s in a world where digital innovations became entrenched in daily life and now constantly demand our attention. Alternatively, you may say that my perspective has changed since I started to study technology-facilitated domestic violence (tech abuse). My interest in tech abuse emerged in 2018, when I set out to examine how smart, Internet-connected devices – such as Amazon’s Ring Doorbell or the Google Home smart speaker – impacted domestic abuse victims and survivors. It was meant to be only a short, six-month research project, but it developed a life of its own. Since then, my research focus and team have steadily grown, and we are now researching tech abuse as part of the VISION project. As the research has grown, so have the scale and awareness of tech abuse.

Tech can exacerbate societal problems  

I never fully bought into the narrative that tech can solve all societal ills. If anything, my research on tech abuse has shown how the misuse of digital technologies can exacerbate societal problems. The boundaries have started to blur around what is and isn’t acceptable online and where one can draw the line around what may or may not be abusive when handling digital tech. 

Tech abuse is the misuse of “everyday” digital systems to alter, amplify, and accelerate coercive and controlling behaviour in the context of intimate partner violence (IPV). Tech abuse is a major concern because it offers perpetrators of domestic abuse new and powerful tools to monitor and harass. And let’s be clear: domestic abuse is an epidemic. It is widespread (approximately 1 in 5 UK adults aged 16 years and over have experienced domestic abuse since the age of 16); harmful (it impacts victims’/survivors’ mental, emotional, physical, social and financial wellbeing); as well as gendered and deadly (Homicide Index data for the year ending March 2019 to the year ending March 2021 show that 72.1% of victims of domestic homicide were female).

To date, our research group has investigated numerous angles related to this expanding abuse form, from usability tests of digital devices and the analyses of legal tools to tech abuse’s interconnection with mental health. We have been outspoken about shortcomings in policy debates and the wider cybersecurity sector and collaborated with and been informed by the efforts of key stakeholders that represent the voice and lived experience of victims and survivors, as well as those working with perpetrators.

What is “normal” and “acceptable”?

The functionalities and abilities many digital services offer (and for which consumers actively pay!) create a slippery slope towards their misuse. For example, I am all for the remote control of my heater from my UCL office, the sharing of geolocation data whilst in an Uber, and the exchange of streaming service passwords with family and friends. I mean, as a white, privileged, tech-savvy woman in a consensual partnership and with supportive colleagues and friends, these features frequently benefit me.

But, what if they don’t? What if I wasn’t the legal owner and account holder of the systems I use? What if I had to think of the inferences corporations and governments will make based on my data profile? And what if it were down to my coercive partner to control the temperature, to know my whereabouts, and to set up/maintain my Netflix or email account?  

At present, many concerns that digital systems cause are addressed through the principle of informed consent, which is technically quite simple: once something happens without the awareness and approval of all parties involved, a line has been breached. But what do we do when ensuring informed consent is impossible, or when it doesn’t go far enough to protect someone from abuse?

More profoundly, I believe we must start to ask ourselves important questions around the “new normal” that is looming and that I don’t think we have begun to unpack: is it OK for my partner to know my email password? Is it OK for my partner to check who I’ve been texting? And is it OK for my partner to ask for nudes via text? Plus, what if we bring children into the mix? Is it OK for parents to overtly install parental control software on devices legitimately purchased and gifted to their kids? And can – and should – a 15-year-old refuse?

We need a public debate

Admittedly, I don’t have definite answers to any of the questions posed above. But they have been on my mind for some time, and I’d love to see them addressed. Relationships – whether with our children, parents, friends, or romantic partners – are not always pure bliss. They can be overshadowed by conflict, and in the worst case they can be toxic and outright destructive and harmful. Our digital systems must be capable of accounting for this. I therefore believe a public debate, or a re-evaluation of what we should accept as ‘normal’, is urgently needed. This may then lead to safeguards being put in place so that everyone – independent of their situation – can make conscious choices about tech’s impact on their lives as well as partnerships.

Photo by Luca Bravo on Unsplash

Technology-facilitated abuse seminar

This event is in the past.

Wednesday 10 May 2023, 1 – 2 pm, hosted by the Oxford Internet Institute

Dr Leonie Tanczer, Associate Professor in International Security and Emerging Technologies at University College London and Co-Investigator of the UKPRP Violence, Health and Society (VISION) consortium, presented on technology-facilitated abuse (“tech abuse”) in the context of intimate partner violence (IPV).

She examined the “boundary questions” that tech abuse creates and provided an overview of the current research landscape whilst discussing the findings of a recent comparative survey conducted with UK and Australian support sector representatives.

For further information on the seminar please see: Technology-facilitated abuse in the context of intimate partner violence Tickets, Wed 10 May 2023 at 13:00 | Eventbrite

Photo by David Carillet / Shutterstock.com