Archives

Cybercrime victimisation and the association with age, health and sociodemographic characteristics

    By Ben Havers, PhD Candidate at the Dawes Centre for Future Crime, University College London

    The UK has an ageing population; the Office for National Statistics (ONS, 2024) has predicted that the number of people aged 85 and over will increase from 1.6 million (2.5% of the total population) to 2.6 million (3.5%) over the next 15 years. Concerningly, a recent Age UK report (2024) revealed that more than one in three people aged 65 and over lack the basic skills to use the internet successfully. This suggests that the number of older adults ill-equipped to deal with online threats is set to grow.

    This blog describes a recent study conducted by Ben Havers (University College London) and colleagues, including Professor Sally McManus from VISION, exploring how cybercrime victimisation, repeat victimisation and financial impact are associated with age and other sociodemographic and health-related characteristics.

    The authors analysed data from the 2019-2020 Crime Survey for England and Wales, an annual national crime victimisation survey carried out by the ONS. The survey is administered via face-to-face interviews with more than 35,000 adults across England and Wales. Participants are asked whether they have been a victim of crime in the past 12 months, and for other personal information on topics such as housing, work and health.

    Some of the key findings of the study were:

    • People aged 75+ were more likely to experience repeat cybercrime victimisation and associated financial loss than younger demographics.
    • Men were more likely to experience victimisation and repeat victimisation than women. A plausible explanation is that men, who have been found to take more risks than women generally (Hudgens & Fatkin, 1985), may also engage in riskier behaviour or activities online, leaving them more vulnerable to malicious actors.
    • People of Black and mixed/multiple ethnicity were more likely to be cybercrime victims than participants of White ethnicity. Research on the drivers behind ethnic disparities in crime victimisation in the UK and abroad is limited. Salisbury and Upson’s (2004) crime survey analysis found that people of Black and minority ethnicity are more likely than White people to fall victim to crime in general. Future research might explore differing patterns and types of internet use, and systemic disadvantages, for example linguistic barriers to safe cyber navigation.
    • Worse cognitive, physical, mental and general health were associated with greater risk, across the ages. This relationship is likely to be bidirectional as poor health might increase the risk of cybercrime (Abdelhamid, 2020) and being a victim of cybercrime may worsen mental health (Rhoads, 2023).

    The findings from this study indicate that future developments in online platform and process design, as well as multi-agency collaboration and information sharing, should focus on (a) empowering older adults to detect fraudulent activity before loss is incurred, and (b) removing barriers to reporting so that support can be provided before the individual is victimised a second or third time.

    To read or download the article for free: Cybercrime victimisation among older adults: A probability sample survey in England and Wales | PLOS ONE

    To cite: Havers, B., Tripathi, K., Burton, A., McManus, S., & Cooper, C. (2024). Cybercrime victimisation among older adults: A probability sample survey in England and Wales. PLOS ONE, 19(12), e0314380. https://doi.org/10.1371/journal.pone.0314380

    Or for further information, please contact Ben at benjamin.havers.20@ucl.ac.uk

    Illustration from Adobe Photo Stock subscription

    Reaching a consensus: Technology-facilitated abuse conceptualisation, definition, terminology, and measurement

      The rapid development of digital systems has benefited modern societies but also created opportunities for the proliferation of harms. Specifically, the term ‘technology-facilitated abuse’ (TFA) describes the misuse or repurposing of digital systems to harass, coerce, or abuse. It is a global problem involving both existing and emerging technologies.

      TFA is regularly discussed in the context of domestic abuse, where it is perpetrated via a range of systems, including phones, laptops, tablets, smart home/Internet of Things appliances, and online accounts that are either shared or accessed without the partner’s consent. In the United Kingdom, 32% of women and children who sought support for domestic abuse in 2022 to 2023 reported experiencing some form of technology-facilitated abuse.

      The research field lacks comprehensive and standardised measurement tools; in 2022, the UN Secretary-General emphasised that the absence of agreed definitions and measures impedes efforts to understand the true scale of TFA. Despite significant work across research, policy, and practice to understand the issue, the field operates within linguistic, conceptual, and disciplinary silos, inhibiting collaboration.

      To address this, the present study led by Dr Nikolaos Koukopoulos (University College London) in collaboration with VISION researchers Dr Madeleine Janickyj and Dr Leonie Tanczer used the Delphi technique to reach a consensus on TFA conceptualization, definition, terminology, and measurement among subject experts.

      Following a literature review, a global, cross-disciplinary sample of academics, practitioners, and policymakers (n = 316) reflected on TFA across three survey rounds. The results showed both aligned and opposing perspectives. “Technology” and “facilitated” were the most preferred terms. Still, there was uncertainty regarding the need for additional terminologies to denote the scope of abuse, such as gendered descriptors. Participants had little familiarity with existing TFA measurement tools, with two-thirds unaware of any.

      Most experts agreed on conceptualising TFA based on the perpetrator’s behaviour, the victim’s harm and impact, and consent. They also supported an expansive TFA definition, beyond intimate relationships, that can involve groups and communities as perpetrators or targets. However, they were more reluctant to perceive TFA as a distinct abuse form, or one guided by social norms, legal thresholds, or involving child perpetrators.

      Recommendations:

      • The fragmentation and contrasting conceptualisations of TFA observed in this research underscore the need for greater cross-disciplinary communication among researchers, practitioners, and policymakers to move closer toward a unified understanding of TFA. Some form of standardisation is particularly crucial, given the rapidly developing ways existing and emerging technologies are weaponised in the digital realm. Concrete, practical steps could help bridge these divides, such as consolidating published work into a searchable database, which could include suggestions for conceptually similar terminology across sectors and subject areas.
      • Furthermore, an interactive online map of key TFA stakeholders and research groups, which the research team is now developing, could facilitate greater collaboration and knowledge-sharing.

      To download the paper: Defining and Conceptualizing Technology-Facilitated Abuse (“Tech Abuse”): Findings of a Global Delphi Study – Nikolaos Koukopoulos, Madeleine Janickyj, Leonie Maria Tanczer, 2025

      To cite the paper: Koukopoulos, N., Janickyj, M., & Tanczer, L. M. (2025). Defining and Conceptualizing Technology-Facilitated Abuse (“Tech Abuse”): Findings of a Global Delphi Study. Journal of Interpersonal Violence, 0(0). https://doi.org/10.1177/08862605241310465

      Illustration from Adobe Photo Stock subscription

      VISION/VASC Webinar Series: Into the Light Index

        We are pleased to announce our next webinar for the VISION and Violence & Society Centre (VASC) Webinar Series on Tuesday, 21 January, 1100 – 1150.

        Deborah Fry, Director of Data at Childlight – Global Child Safety Institute and Professor of International Child Protection Research at University of Edinburgh, will present on the Into the Light Index, published last year on the prevalence of technology-facilitated child sexual exploitation and abuse. She will also discuss some of the measurement challenges in this field and how they are documenting and exploring those challenges.

        Professor Fry undertakes primary research to measure the magnitude, drivers and consequences of violence against children; the barriers and enablers to appropriate prevention and response systems, including in school settings; and the effectiveness of existing interventions.

        She leads the data division at Childlight – Global Child Safety Institute. The Data Institute, funded by the Human Dignity Foundation, aims to take a data-driven, evidence-based approach to understanding the prevalence of child sexual exploitation and abuse across the globe and translating that data into sustainable action that safeguards children. The mission is to establish a world-leading independent institute that gathers, translates and visualises the prevalence of child sexual exploitation and abuse across the world.

        To register for the event in order to receive the Teams invitation, please contact: VISION_Management_Team@city.ac.uk

        The purpose of the VISION/VASC webinar series is to provide a platform for academia, government and the voluntary and community sector that work to reduce and prevent violence to present their work / research to a wider audience. This is a multidisciplinary platform and we welcome speakers from across a variety of fields such as health, crime, policing, ethnicity, migration, sociology, social work, primary care, front line services, etc. If interested in presenting at a future Series webinar, please contact: VISION_Management_Team@city.ac.uk

        This webinar series is sponsored by the UK Prevention and Research Partnership consortium, Violence, Health and Society (VISION; MR-V049879) and the Violence and Society Centre at City St George’s, University of London.

        United to End Violence Against Women and Girls: An Online Animated Campaign  

          Violence against women and girls (VAWG) is a pressing issue in Iran, a Middle Eastern country marked by its patriarchal structure and systematic and pervasive gender discrimination. Educational programmes addressing this issue are scarce, and cultural barriers often hinder open discussion. The United to End Violence Against Women and Girls campaign aims to break this silence through a series of animated videos and images designed to inform public discourse and to empower victims to seek support.

           The United to End Violence Against Women and Girls project was led by VISION researchers Ladan Hashemi and Sally McManus, in collaboration with colleagues from other UK universities including the University of Bristol, Goldsmiths University, Animation Research Centre at the University for the Creative Arts, and Leeds Beckett University. 

          They worked with an animation production team in Iran, a social media advisor, and two advisory groups to incorporate culturally specific insights. The advisory groups were Mehre Shams Afarid, an Iran-based non-governmental organisation (NGO), and IKWRO, a London-based charity providing services to women victims of violence from the Middle Eastern and North African (MENA) region.

          Although the project initially focused on Iran, engaging with the UK-based NGO revealed an interest in extending its reach. As a result, English subtitles were added to make the animations accessible to a wider audience. This collaboration helped the content resonate with audiences both in Iran and within the global diaspora community, particularly those from the MENA region.

          The animations are grounded in evidence from a survey of 453 women in Iran, which explored the manifestation of various forms of VAWG in Iran and women’s perspectives on how to eliminate it. The survey was designed by Fatima Babakhani, CEO of Mehre Shams Afarid.

          Key findings from participants’ open-ended responses to the survey showed that, despite structural inequalities and deeply ingrained societal, cultural, and religious norms that perpetuate VAWG, change is possible through education and legal reforms.

          As one survey participant noted: “Unfortunately, many still don’t understand what violence truly is. Raising awareness is the solution.”

          The first four United to End Violence Against Women and Girls campaign animations focus on coercive control, economic abuse, technology-facilitated abuse, and active bystander interventions, with two more animations in development.

          With guidance from an Iranian social media advisor, a digital strategy was developed to maximise the campaign’s impact. Instagram was chosen as the primary distribution platform, as it is the most widely used social media platform in Iran, with over 47 million users. The animations are also shared on YouTube to further extend the campaign’s reach.

          The campaign partnered with influencers and women’s rights activists with followings ranging from thousands to millions to amplify its reach. The online campaign officially launched on 25 November, the International Day for the Elimination of Violence Against Women and Girls.

          By leveraging evidence-based content and strategic partnerships, we hope to spark meaningful conversations and drive change across Iran and the diaspora communities from the MENA region.

          Join us in raising awareness and advocating for change. Please follow and share the campaign links on your social media to help spread the message.

          Link to Instagram page

          Link to YouTube channel

          This project was funded by City St George’s, University of London Higher Education Impact Fund (HEIF) Knowledge Exchange and by the UKPRP VISION research consortium.

          For further information, please contact Ladan at ladan.hashemi@city.ac.uk

          Tech-facilitated abuse and the ‘new normal’

            by Dr Leonie Tanczer, Associate Professor, University College London, and Co-Investigator, UKPRP VISION

            The growth of digital technologies in our lives creates new habits, practices, and expectations. We need better public awareness and debate about the “new normal” we are experiencing in a society where the misuse of digital technologies has become widespread. 

            I don’t know about you, but there used to be a time when I was excited and thrilled by technology. I remember how ecstatic I was when I got my first mobile – and, later, my first smartphone. How unbelievably neat it felt “browsing” the web to research a school assignment. And how empowering and beneficial I once perceived platforms such as Twitter.  

            That’s sadly no longer how I think and feel. And I must confess, I’ve become quite pessimistic.  

            You can blame my dreary outlook on living through my 20s in a world where digital innovations became entrenched in daily life and now constantly demand our attention. Alternatively, you may say that my perspective has changed since I started to study technology-facilitated domestic violence (tech abuse). My interest in tech abuse emerged in 2018 when I set out to examine how smart, Internet-connected devices – such as Amazon’s Ring Doorbell or the Google Home smart speaker – impacted domestic abuse victims and survivors. It should have been only a short, six-month research project, but it developed a life of its own. Since then, my research focus and team have steadily grown, and we are researching tech abuse as part of the VISION project. As the research has grown, so have the scale and awareness of tech abuse.

            Tech can exacerbate societal problems  

            I never fully bought into the narrative that tech can solve all societal ills. If anything, my research on tech abuse has shown how the misuse of digital technologies can exacerbate societal problems. The boundaries have started to blur around what is and isn’t acceptable online, and around where the line between ordinary and abusive uses of digital tech should be drawn.

            Tech abuse is the misuse of “everyday” digital systems to alter, amplify, and accelerate coercive and controlling behaviour in the context of intimate partner violence (IPV). Tech abuse is a major concern because it offers perpetrators of domestic abuse new and powerful tools to monitor and harass. And let’s be clear: domestic abuse is an epidemic. It is widespread (approximately 1 in 5 UK adults aged 16 years and over had experienced domestic abuse since the age of 16 years); harmful (it impacts victims’/survivors’ mental, emotional, physical, social and financial wellbeing); as well as gendered and deadly (Homicide Index data for the year ending March 2019 to the year ending March 2021 show that 72.1% of victims of domestic homicide were female).

            To date, our research group has investigated numerous angles related to this expanding abuse form, from usability tests of digital devices and the analyses of legal tools to tech abuse’s interconnection with mental health. We have been outspoken about shortcomings in policy debates and the wider cybersecurity sector and collaborated with and been informed by the efforts of key stakeholders that represent the voice and lived experience of victims and survivors, as well as those working with perpetrators.

            What is “normal” and “acceptable”?

            The functionalities and abilities many digital services offer (and for which consumers actively pay!) create a slippery slope towards their misuse. For example, I am all for the remote control of my heater from my UCL office, the sharing of geolocation data whilst in an Uber, and the exchange of streaming service passwords with family and friends. I mean, as a white, privileged, tech-savvy woman in a consensual partnership and with supportive colleagues and friends, these features frequently benefit me.

            But, what if they don’t? What if I wasn’t the legal owner and account holder of the systems I use? What if I had to think of the inferences corporations and governments will make based on my data profile? And what if it were down to my coercive partner to control the temperature, to know my whereabouts, and to set up/maintain my Netflix or email account?  

            At present, many concerns that digital systems cause are addressed along the principle of informed consent, which is technically quite simple: once something happens without the awareness and approval of all parties involved, a line has been breached. But what do we do when ensuring informed consent is impossible, or when it doesn’t go far enough to protect someone from abuse?

            More profoundly, I believe we must start to ask ourselves important questions around the “new normal” that is looming and that I don’t think we have begun to unpack: is it OK for my partner to know my email password? Is it OK for my partner to check who I’ve been texting? And is it OK for my partner to ask for nudes via text? Plus, what if we bring children into the mix? Is it OK for parents to overtly install parental control software on devices legitimately purchased and gifted to their kids? And can – and should – a 15-year-old be able to refuse?

            We need a public debate

            Admittedly, I don’t have definitive answers to any of the above questions. But they have been on my mind for some time, and I’d love to see them addressed. Relationships – whether with our children, parents, friends, or romantic partners – are not always pure bliss. They can be overshadowed by conflict, and in the worst case, they can be toxic and outright destructive and harmful. Our digital systems must be capable of accounting for this. I thus believe a public debate, or a re-evaluation of what we should accept as ‘normal’, is urgently needed. This may then lead to safeguards being put in place so that everyone – independent of their situation – can make conscious choices about tech’s impact on their lives as well as partnerships.

            Photo by Luca Bravo on Unsplash

            Technology-facilitated abuse seminar


              Wednesday 10 May 2023, 1 – 2 pm, hosted by the Oxford Internet Institute

              Dr Leonie Tanczer, Associate Professor in International Security and Emerging Technologies at University College London and Co-Investigator of the UKPRP Violence, Health and Society (VISION) consortium, presented on technology-facilitated abuse (“tech abuse”) in the context of intimate partner violence (IPV).

              She examined the “boundary questions” that tech abuse creates and provided an overview of the current research landscape whilst discussing the findings of a recent comparative survey conducted with UK and Australian support sector representatives.

              For further information on the seminar please see: Technology-facilitated abuse in the context of intimate partner violence Tickets, Wed 10 May 2023 at 13:00 | Eventbrite

              Photo by David Carillet / Shutterstock.com