
Causal discovery for studying sexual abuse and psychotic phenomena

 Dr Giusi Moffa

Sexual abuse and bullying are associated with poor mental health in adulthood. Elucidating putative causal relationships between affective and psychotic symptoms may inform the development of therapies. Causal diagrams can help us gain insight, but how?

Given a causal diagram, usually represented as a directed acyclic graph (DAG), and observational data on the variables in the graph, many analytical methods (especially adjustment techniques) allow us to estimate the effect that intervening on one variable is expected to have on another.
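
To make the idea concrete, here is a minimal sketch in Python (not taken from the paper): synthetic data are generated from a known three-variable DAG with a confounder, and adjusting for the confounder recovers the true effect of intervening on the exposure, whereas the naive regression does not.

# Minimal sketch of backdoor adjustment on synthetic data, assuming the DAG
# C -> X, C -> Y, X -> Y, where C confounds the effect of X on Y.
import numpy as np

rng = np.random.default_rng(0)
n = 50_000
c = rng.normal(size=n)                        # confounder
x = 0.8 * c + rng.normal(size=n)              # exposure, partly driven by the confounder
y = 1.5 * x + 2.0 * c + rng.normal(size=n)    # outcome; the true causal effect of x on y is 1.5

naive = np.polyfit(x, y, 1)[0]                # biased by the open backdoor path x <- c -> y

design = np.column_stack([x, c, np.ones(n)])  # adjust for c by including it in the regression
adjusted = np.linalg.lstsq(design, y, rcond=None)[0][0]

print(f"naive estimate:    {naive:.2f}")      # roughly 2.5
print(f"adjusted estimate: {adjusted:.2f}")   # close to the true 1.5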

In real-world problems, we rarely have a complete picture of the underlying structural mechanism regulating the relationships among different variables. Causal discovery is a set of techniques that leverage statistics and machine learning to uncover plausible causal relationships from data, with little or no prior knowledge of those relationships. While learning causal structures from purely observational data relies on strong and often unrealistic assumptions (especially causal sufficiency and faithfulness), a causal discovery exercise may help us identify the most promising scenarios to prioritise when designing interventional studies.
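
As a toy illustration of score-based causal discovery (a deliberately simplified stand-in for the sampling methods mentioned below, with invented variables and coefficients), the sketch here exhaustively scores every DAG over three simulated variables with a Gaussian BIC score and keeps the highest-scoring structure.

# Exhaustive score-based structure search over three variables (toy example only).
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
a = rng.normal(size=n)                    # data simulated from the "true" DAG A -> B -> C
b = 0.9 * a + rng.normal(size=n)
c = 0.7 * b + rng.normal(size=n)
data = {"A": a, "B": b, "C": c}
names = list(data)

def node_bic(child, parents):
    """Gaussian BIC contribution of one node given its parent set."""
    y = data[child]
    X = np.column_stack([data[p] for p in parents] + [np.ones(n)])
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return -0.5 * n * np.log(resid.var()) - 0.5 * X.shape[1] * np.log(n)

def is_acyclic(parent_sets):
    """Check acyclicity by repeatedly peeling off nodes with no remaining parents."""
    remaining = dict(parent_sets)
    while remaining:
        roots = [v for v, ps in remaining.items() if not set(ps) & remaining.keys()]
        if not roots:
            return False
        for v in roots:
            del remaining[v]
    return True

possible_edges = [(p, ch) for p in names for ch in names if p != ch]
best_score, best_dag = -np.inf, None
for k in range(len(possible_edges) + 1):
    for edges in itertools.combinations(possible_edges, k):
        parents = {v: [p for p, ch in edges if ch == v] for v in names}
        if not is_acyclic(parents):
            continue
        score = sum(node_bic(v, ps) for v, ps in parents.items())
        if score > best_score:
            best_score, best_dag = score, parents

print("highest-scoring DAG (node: parents):", best_dag)

Note that DAGs in the same Markov equivalence class fit the data equally well, so even this brute-force search can only recover the structure up to equivalence; this is one reason why prior knowledge and interventional studies remain important.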

In a recent article, now available open access in Psychological Medicine, Dr Giusi Moffa, a statistician affiliated with the University of Basel, Switzerland, and colleagues used state-of-the-art sampling methods for DAG inference on data from the English Adult Psychiatric Morbidity Surveys to investigate sexual abuse and psychotic phenomena.

The analysis sought to model the interplay among 20 variables, including being a victim of bullying or sexual abuse, a range of psychotic symptoms (e.g. paranoia and hallucinations) and affective symptoms (e.g. depression, worry and mood instability), while accounting for the sex of the participant. To respect temporality, we imposed prior constraints on the DAG structure: childhood sexual abuse and bullying referred to events temporally antecedent to the assessment of the psychological variables, and hence they were only allowed incoming edges from sex and from each other.
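
As an aside, the sketch below shows one way such temporal constraints can be encoded as a matrix of permitted edges, the kind of blacklist/whitelist input that structure-learning software typically accepts; the variable list here is reduced and purely illustrative, not the 20 survey variables.

# Encoding temporal prior constraints as a matrix of permitted edges (illustrative).
import numpy as np

variables = ["sex", "sexual_abuse", "bullying", "paranoia", "worry", "mood_instability"]
idx = {v: i for i, v in enumerate(variables)}
k = len(variables)

allowed = np.ones((k, k), dtype=bool)          # allowed[i, j]: is an edge i -> j permitted?
np.fill_diagonal(allowed, False)               # no self-loops

allowed[:, idx["sex"]] = False                 # nothing may point into sex

for v in ("sexual_abuse", "bullying"):         # the abuse variables precede the symptoms,
    allowed[:, idx[v]] = False                 # so they only admit incoming edges
    allowed[idx["sex"], idx[v]] = True         # from sex ...
allowed[idx["sexual_abuse"], idx["bullying"]] = True   # ... and from each other
allowed[idx["bullying"], idx["sexual_abuse"]] = True

for p in variables:
    print(f"{p:16s} may point to:", [ch for ch in variables if allowed[idx[p], idx[ch]]])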

Contrary to expectations, the procedure favoured models placing paranoia early in the cascade of relationships, close to the abuse variables and generally upstream of affective symptoms. A possible implication is that paranoia is a direct consequence of early abuse involving bullying or sexual exploitation. Overall, the results were consistent with sexual abuse and bullying driving a range of affective symptoms via worry. As such, worry may be a salient target for intervention in psychosis.

Check out the paper for a more thorough discussion of the findings (joint work with Jack Kuipers, Elizabeth Kuipers, Paul Bebbington and VISION member Sally McManus).

This is a repost of a blog available on LinkedIn: https://www.linkedin.com/pulse/causal-discovery-studying-sexual-abuse-psychotic-phenomena-moffa

Paper available open access: https://www.cambridge.org/core/journals/psychological-medicine/article/sexual-abuse-and-psychotic-phenomena-a-directed-acyclic-graph-analysis-of-affective-symptoms-using-english-national-psychiatric-survey-data-erratum/CF603075EBBD5D75E60F327CE01C4050

For further information about the approach: giusi.moffa@unibas.ch

VISION member awarded UKDS Impact Fellowship focused on the socioeconomics of violence

Dr Niels Blom

We’re delighted that one of VISION’s core researchers, Dr Niels Blom, has been awarded a prestigious UK Data Service (UKDS) Fellowship.

The award will be used to improve the reach and impact of Niels’ research on violence and abuse and its relationship with job loss, health, and wellbeing. He is using several UKDS datasets, including the UK Household Longitudinal Study and the Crime Survey for England and Wales, to understand the socioeconomic, wellbeing, and health impacts of violence, particularly intimate partner violence.

For more information about Niels, his work, and what he hopes to get out of the Fellowship scheme, see his blog on the UKDS website.  

The UKDS is funded by UKRI and houses the largest collection of economic, social and population data in the UK. Its Data Impact Fellowship scheme is for early career researchers in the academic or the voluntary, community, and social enterprise (VCSE) sector. The focus in 2023 is on research into poverty, deprivation, the cost of living crisis, housing and homelessness, using data in the UK Data Service collection. The purpose of the programme is to support impact activities stemming from data-enhanced work.

For further information on the UK Data Service please see: UK Data Service

To read Niels’ blog please see: UK Data Service Data Impact Fellows 2023: Niels Blom – Data Impact blog

Or contact Niels at niels.blom@city.ac.uk

Photo by Alina Grubnyak on Unsplash

Tech-facilitated abuse and the ‘new normal’

by Dr Leonie Tanczer, Associate Professor, University College London, and Co-Investigator, UKPRP VISION

The growth of digital technologies in our lives creates new habits, practices, and expectations. We need better public awareness and debate about the “new normal” we are experiencing in a society where the misuse of digital technologies has become widespread. 

I don’t know about you, but there used to be a time when I was excited and thrilled by technology. I remember how ecstatic I was when I got my first mobile – and, later, my first smartphone. How unbelievably neat it felt “browsing” the web to research a school assignment. And how empowering and beneficial I once perceived platforms such as Twitter.  

That’s sadly no longer how I think and feel. And I must confess, I’ve become quite pessimistic.  

You can blame my dreary outlook on living through my 20s in a world where digital innovations became entrenched in daily life and now constantly demand our attention. Alternatively, you may say that my perspective has changed since I started to study technology-facilitated domestic violence (tech abuse). My interest in tech abuse emerged in 2018, when I set out to examine how smart, Internet-connected devices – such as Amazon’s Ring Doorbell or the Google Home smart speaker – impacted domestic abuse victims and survivors. It was meant to be only a short, six-month research project, but it developed a life of its own. Since then, my research focus and team have steadily grown, and we now research tech abuse as part of the VISION project. As the research has grown, so have the scale and awareness of tech abuse.

Tech can exacerbate societal problems  

I never fully bought into the narrative that tech can solve all societal ills. If anything, my research on tech abuse has shown how the misuse of digital technologies can exacerbate societal problems. The boundaries have started to blur around what is and isn’t acceptable online, and around where to draw the line between legitimate and abusive uses of digital tech.

Tech abuse is the misuse of “everyday” digital systems to alter, amplify, and accelerate coercive and controlling behaviour in the context of intimate partner violence (IPV). Tech abuse is a major concern because it offers perpetrators of domestic abuse new and powerful tools to monitor and harass. And let’s be clear: domestic abuse is an epidemic. It is widespread (approximately 1 in 5 UK adults aged 16 years and over have experienced domestic abuse since the age of 16); harmful (it impacts victims’/survivors’ mental, emotional, physical, social and financial wellbeing); as well as gendered and deadly (Homicide Index data for the year ending March 2019 to the year ending March 2021 show that 72.1% of victims of domestic homicide were female).

To date, our research group has investigated numerous angles of this expanding form of abuse, from usability tests of digital devices and analyses of legal tools to tech abuse’s interconnection with mental health. We have been outspoken about shortcomings in policy debates and the wider cybersecurity sector, and we have collaborated with and been informed by the efforts of key stakeholders who represent the voice and lived experience of victims and survivors, as well as those working with perpetrators.

What is “normal” and “acceptable”?

The functionalities and abilities many digital services offer (and for which consumers actively pay!) create a slippery slope towards their misuse. For example, I am all for remotely controlling my heater from my UCL office, sharing geolocation data whilst in an Uber, and exchanging streaming service passwords with family and friends. I mean, as a white, privileged, tech-savvy woman in a consensual partnership and with supportive colleagues and friends, these features frequently benefit me.

But, what if they don’t? What if I wasn’t the legal owner and account holder of the systems I use? What if I had to think of the inferences corporations and governments will make based on my data profile? And what if it were down to my coercive partner to control the temperature, to know my whereabouts, and to set up/maintain my Netflix or email account?  

At present, many concerns that digital systems raise are addressed through the principle of informed consent, which is, in principle, quite simple: once something happens without the awareness and approval of all parties involved, a line has been breached. But what do we do when ensuring informed consent is impossible, or when it doesn’t go far enough to protect someone from abuse?

More profoundly, I believe we must start to ask ourselves important questions about the “new normal” that is looming and that I don’t think we have begun to unpack: is it OK for my partner to know my email password? Is it OK for my partner to check who I’ve been texting? And is it OK for my partner to ask for nudes via text? Plus, what if we bring children into the mix? Is it OK for parents to overtly install parental control software on devices legitimately purchased and gifted to their kids? And can – and should – a 15-year-old refuse?

We need a public debate

Admittedly, I don’t have definitive answers to any of the questions posed above. But they have been on my mind for some time, and I’d love to see them addressed. Relationships – whether with our children, parents, friends, or romantic partners – are not always pure bliss. They can be overshadowed by conflict, and in the worst case they can be toxic and outright destructive and harmful. Our digital systems must be capable of accounting for this. I therefore believe a public debate, or a re-evaluation of what we should accept as ‘normal’, is urgently needed. This may then lead to safeguards being put in place so that everyone – regardless of their situation – can make conscious choices about tech’s impact on their lives and relationships.

Photo by Luca Bravo on Unsplash

Unlocking violence information from clinical text

Blog by Dr Lifang Li, Research Associate with UKPRP VISION, King’s College London

Clinical Record Interactive Search (CRIS)

In 2008, the Clinical Record Interactive Search (CRIS) system was launched. CRIS removes personal identifiers from the health records of the South London and Maudsley NHS Trust, making them available for use in mental health research. The platform operates under a governance framework that prioritises patient anonymity and places patients at the centre of its operations. The use of exceptionally large volumes of records with unprecedented levels of detail has the potential to revolutionise mental health research. 

The CRIS Violence application

The CRIS violence application is computer software that uses Natural Language Processing (NLP) to find clinical text referring to interpersonal violence, identifying the presence of violence, patient status (i.e. as perpetrator, witness or victim of violence) and violence type (domestic, physical and/or sexual). NLP uses pattern matching and statistical techniques to automatically process natural human language. The application was developed by Riley Botelle, Professor Robert Stewart and their colleagues for the identification and classification of experiences of violence in narrative records, as described in their 2022 paper “Can natural language processing models extract and classify instances of interpersonal violence in mental healthcare electronic records: an applied evaluative study”. Recently, after a thorough validation process, the CRIS team has started to run the violence application routinely, alongside many other NLP applications (e.g. to find suicidality, agitation, medications, anxiety) that are available for CRIS. Structured output from these, now including various violence-related variables, is saved back into the CRIS database, from where authorised health researchers can access it.
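
For illustration only (this is not the CRIS violence application, and the keyword patterns below are invented), the following sketch conveys the flavour of the rule-based, pattern-matching side of such a pipeline: regular expressions flag possible violence mentions in clinical-style text and assign a crude violence type and patient status.

# Toy rule-based flagging of violence mentions; patterns are hypothetical examples.
import re

VIOLENCE_PATTERNS = {
    "domestic": re.compile(r"\b(domestic (violence|abuse)|partner.{0,20}(hit|assault))", re.I),
    "physical": re.compile(r"\b(assault(ed)?|punch(ed)?|hit by|beaten)\b", re.I),
    "sexual": re.compile(r"\b(sexual (assault|abuse)|raped?)\b", re.I),
}
STATUS_PATTERNS = {
    "victim": re.compile(r"\b(was (assaulted|attacked)|victim of|experienced)\b", re.I),
    "perpetrator": re.compile(r"\b(assaulted (his|her|their)|perpetrated)\b", re.I),
    "witness": re.compile(r"\b(witness(ed)?|saw)\b", re.I),
}

def flag_violence(sentence: str) -> dict:
    """Return the violence types and patient statuses matched in a sentence."""
    types = [t for t, pat in VIOLENCE_PATTERNS.items() if pat.search(sentence)]
    status = [s for s, pat in STATUS_PATTERNS.items() if pat.search(sentence)]
    return {"violence_present": bool(types), "types": types, "status": status}

example = "Patient reports she was assaulted by her partner last year."
print(flag_violence(example))
# -> {'violence_present': True, 'types': ['physical'], 'status': ['victim']}

In practice, the real application relies on statistical NLP models as well as pattern matching (as noted above), which is what allows it to handle the many ways violence is recorded in free text.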

How does it serve the researchers and clinicians?

By accurately identifying the presence of violence, the different types of violence, and patient status, the application enables researchers to examine how experiences of violence are correlated with various mental health problems, outcomes and treatment trajectories, how these relate to patients’ characteristics (such as age, gender, and ethnic group), and how to account for health inequalities.

Future work

Since patients may also experience psychological and economic abuse, and clinicians may record this in the health record, our work as part of the VISION consortium involves updating the current violence application to identify mentions of these forms of abuse, allowing us to extend the violence research that is possible using CRIS.

Illustration: Nina Rys / Shutterstock.com