Tech-facilitated abuse and the ‘new normal’

    by Dr Leonie Tanczer, Associate Professor, University College London, and Co-Investigator, UKPRP VISION

    The growth of digital technologies in our lives creates new habits, practices, and expectations. We need better public awareness and debate about the “new normal” we are experiencing in a society where the misuse of digital technologies has become widespread. 

    I don’t know about you, but there used to be a time when I was excited and thrilled by technology. I remember how ecstatic I was when I got my first mobile – and, later, my first smartphone. How unbelievably neat it felt “browsing” the web to research a school assignment. And how empowering and beneficial I once perceived platforms such as Twitter.  

    That’s sadly no longer how I think and feel. And I must confess, I’ve become quite pessimistic.  

    You can blame my dreary outlook on living through my 20s in a world where digital innovations became entrenched in daily life and now constantly demand our attention. Alternatively, you may say that my perspective has changed since I started to study technology-facilitated domestic violence (tech abuse). My interest in tech abuse emerged in 2018 when I set out to examine how smart, Internet-connected devices – such as Amazon’s Ring Doorbell or the Google Home smart speaker – impacted domestic abuse victims and survivors. It was meant to be only a short, six-month research project, but it took on a life of its own. Since then, my research focus and team have steadily grown, and we are now researching tech abuse as part of the VISION project. As the research has grown, so have the scale and awareness of tech abuse.

    Tech can exacerbate societal problems  

    I never fully bought into the narrative that tech can solve all societal ills. If anything, my research on tech abuse has shown how the misuse of digital technologies can exacerbate societal problems. The boundaries have started to blur around what is and isn’t acceptable online, and around where to draw the line between legitimate and abusive uses of digital tech.

    Tech abuse is the misuse of “everyday” digital systems to alter, amplify, and accelerate coercive and controlling behaviour in the context of intimate partner violence (IPV). Tech abuse is a major concern because it offers perpetrators of domestic abuse new and powerful tools to monitor and harass. And let’s be clear: domestic abuse is an epidemic. It is widespread (approximately 1 in 5 UK adults aged 16 years and over have experienced domestic abuse since the age of 16); harmful (it impacts victims’/survivors’ mental, emotional, physical, social and financial wellbeing); as well as gendered and deadly (Homicide Index data for the year ending March 2019 to the year ending March 2021 show that 72.1% of victims of domestic homicide were female).

    To date, our research group has investigated numerous angles related to this expanding form of abuse, from usability tests of digital devices and analyses of legal tools to tech abuse’s interconnection with mental health. We have been outspoken about shortcomings in policy debates and the wider cybersecurity sector, and we have collaborated with and been informed by the efforts of key stakeholders who represent the voice and lived experience of victims and survivors, as well as those working with perpetrators.

    What is “normal” and “acceptable”?

    The functionalities and capabilities many digital services offer (and for which consumers actively pay!) create a slippery slope towards their misuse. For example, I am all for the remote control of my heater from my UCL office, the sharing of geolocation data whilst in an Uber, and the exchange of streaming service passwords with family and friends. I mean, as a white, privileged, tech-savvy woman in a consensual partnership and with supportive colleagues and friends, these features frequently benefit me.

    But, what if they don’t? What if I wasn’t the legal owner and account holder of the systems I use? What if I had to think of the inferences corporations and governments would make based on my data profile? And what if it were down to my coercive partner to control the temperature, to know my whereabouts, and to set up/maintain my Netflix or email account?

    At present, many concerns that digital systems raise are addressed through the principle of informed consent, which sounds quite simple: once something happens without the awareness and approval of all parties involved, a line has been crossed. But what do we do when ensuring informed consent is impossible, or when it doesn’t go far enough to protect someone from abuse?

    More profoundly, I believe we must start to ask ourselves important questions about the “new normal” that is looming and that I don’t think we have begun to unpack: is it OK for my partner to know my email password? Is it OK for my partner to check who I’ve been texting? And is it OK for my partner to ask for nudes via text? Plus, what if we bring children into the mix? Is it OK for parents to overtly install parental control software on devices legitimately purchased and gifted to their kids? And can – and should – a 15-year-old be able to refuse?

    We need a public debate

    Admittedly, I don’t have definitive answers to any of the questions posed above. But they have been on my mind for some time, and I’d love to see them addressed. Relationships – whether with our children, parents, friends, or romantic partners – are not always pure bliss. They can be overshadowed by conflict and, in the worst case, can be toxic and outright destructive and harmful. Our digital systems must be able to account for this. I therefore believe a public debate – a re-evaluation of what we should accept as ‘normal’ – is urgently needed. This may then lead to safeguards being put in place so that everyone – independent of their situation – can make conscious choices about tech’s impact on their lives as well as their partnerships.

    Photo by Luca Bravo on Unsplash

    Unlocking violence information from clinical text

      Blog by Dr Lifang Li, Research Associate with UKPRP VISION, King’s College London

      Clinical Record Interactive Search (CRIS)

      In 2008, the Clinical Record Interactive Search (CRIS) system was launched. CRIS removes personal identifiers from the health records of the South London and Maudsley NHS Trust, making them available for use in mental health research. The platform operates under a governance framework that prioritises patient anonymity and places patients at the centre of its operations. The use of exceptionally large volumes of records with unprecedented levels of detail has the potential to revolutionise mental health research. 
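
      CRIS’s actual de-identification pipeline is not detailed in this post, but a minimal sketch can give a flavour of the kind of masking such a step performs on free text. The patterns below are hypothetical illustrations, not the rules CRIS uses:

```python
import re

# Hypothetical identifier patterns, purely for illustration. A production
# de-identification pipeline such as CRIS's is far more sophisticated
# (e.g. it can use a patient's own demographic fields to locate names).
PATTERNS = {
    "[NHS_NUMBER]": re.compile(r"\b\d{3}[ -]?\d{3}[ -]?\d{4}\b"),
    "[PHONE]":      re.compile(r"\b0\d{9,10}\b"),
    "[DATE]":       re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "[POSTCODE]":   re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}\b"),
}

def mask_identifiers(text: str) -> str:
    """Replace each identifier match with its placeholder tag."""
    for placeholder, pattern in PATTERNS.items():
        text = pattern.sub(placeholder, text)
    return text

print(mask_identifiers(
    "Seen on 12/03/2008. NHS no. 943 476 5919, contact 02079460000, SE5 8AZ."
))
# -> Seen on [DATE]. NHS no. [NHS_NUMBER], contact [PHONE], [POSTCODE].
```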

      The CRIS Violence application

      The CRIS violence application is computer software that uses Natural Language Processing (NLP) to find clinical text referring to interpersonal violence, identifying the presence of violence, patient status (i.e. as perpetrator, witness or victim of violence) and violence type (domestic, physical and/or sexual). NLP uses pattern matching and statistical techniques to automatically process natural human language. The application was developed by Riley Botelle, Professor Robert Stewart and their colleagues for the identification and classification of experiences of violence in narrative records, as described in their 2022 paper “Can natural language processing models extract and classify instances of interpersonal violence in mental healthcare electronic records: an applied evaluative study”. Recently, after a thorough validation process, the CRIS team has started to run the violence application routinely, alongside many other NLP applications (e.g., to find suicidality, agitation, medications, anxiety) that are available for CRIS. Structured output from these, now including various violence-related variables, is saved back into the CRIS database, from where authorised health researchers can access it.
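
      To give a flavour of the pattern-matching end of such a pipeline, here is a deliberately minimal sketch. The cue lists and rules below are hypothetical illustrations only; the actual CRIS application is far richer, uses statistical models alongside rules, and was validated against clinician-annotated records:

```python
import re
from dataclasses import dataclass

# Toy illustration only: the cue lists below are hypothetical. The real
# CRIS application (Botelle et al., 2022) uses far richer NLP validated
# against clinician-annotated records.
VIOLENCE_CUE = re.compile(r"\b(assault\w*|violen\w*|punched|abus\w*)\b", re.I)
STATUS_CUES = {
    "victim": re.compile(r"\b(victim of|was (assaulted|hit|abused) by)\b", re.I),
    "perpetrator": re.compile(r"\bperpetrat\w*\b", re.I),
    "witness": re.compile(r"\bwitness\w*\b", re.I),
}
TYPE_CUES = {
    "domestic": re.compile(r"\b(domestic|partner|husband|wife)\b", re.I),
    "sexual": re.compile(r"\bsexual\w*\b", re.I),
    "physical": re.compile(r"\b(punched|hit|physical\w*)\b", re.I),
}

@dataclass
class ViolenceMention:
    present: bool  # any violence cue found in the sentence
    status: list   # patient status labels (victim/perpetrator/witness)
    types: list    # violence type labels (domestic/physical/sexual)

def classify(sentence: str) -> ViolenceMention:
    """Flag one sentence for violence presence, patient status and type."""
    return ViolenceMention(
        present=bool(VIOLENCE_CUE.search(sentence)),
        status=[s for s, p in STATUS_CUES.items() if p.search(sentence)],
        types=[t for t, p in TYPE_CUES.items() if p.search(sentence)],
    )

print(classify("Patient reports she was assaulted by her husband."))
# ViolenceMention(present=True, status=['victim'], types=['domestic'])
```

      A rule-based matcher like this is brittle on its own (it would, for instance, mislabel negated mentions such as “denies any violence”), which is one reason statistical techniques are used alongside pattern matching.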

      How does it serve researchers and clinicians?

      By accurately identifying the presence of violence, different types of violence, and patient status, the application enables researchers to examine how experiences of violence correlate with various mental health problems, outcomes and treatment trajectories, how these relate to patients’ characteristics (such as age, gender, and ethnic group), and how they contribute to health inequalities.
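
      As a purely hypothetical sketch of what this enables: once the NLP output sits in the database as structured variables, a researcher can join it to other patient-level data and compare outcomes by recorded violence exposure. The table and column names below are invented for illustration and do not reflect the actual CRIS schema:

```python
import pandas as pd

# Hypothetical, synthetic data standing in for NLP-derived violence
# variables, one row per de-identified patient:
violence = pd.DataFrame({
    "patient_id": [1, 2, 3, 4],
    "violence_present": [True, False, True, False],
    "violence_type": ["domestic", None, "physical", None],
})
# Hypothetical structured clinical outcomes for the same patients:
outcomes = pd.DataFrame({
    "patient_id": [1, 2, 3, 4],
    "age_group": ["16-24", "25-34", "16-24", "35-44"],
    "inpatient_admission": [True, False, True, False],
})

# Join the NLP output to outcomes and compare admission rates by
# recorded violence exposure: a toy version of the correlational
# questions described above.
merged = violence.merge(outcomes, on="patient_id")
print(merged.groupby("violence_present")["inpatient_admission"].mean())
```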

      Future work

      Given that patients may also experience psychological abuse and economic abuse, and that clinicians may record these in the health record, our work as part of the VISION consortium involves updating the current violence application to identify mentions of them, allowing us to extend the violence research possible using CRIS.

      Illustration: Nina Rys / Shutterstock.com