Caroline Criado-Perez talks exposing data bias ahead of Rewired 2020
- 18 February 2020
“There are two excuses that I get from medical researchers about why they don’t include women in their research. The most common one is that women are just too complicated and too hormonal.” Excuses like this are all too common in the world of Caroline Criado-Perez OBE, and they are costing women their lives.
The renowned journalist, author, activist and feminist has spent years researching and raising awareness of gender data gaps in everything from the economy to medicine and technology, and she’s bringing her expertise to Digital Health Rewired 2020.
Caroline has led successful campaigns to increase the proportion of women in the media, to keep female representation on bank notes and to have a statue of suffragist Millicent Fawcett erected in Parliament Square. It’s safe to say she’s a force to be reckoned with, by anyone’s standards.
In 2019 she published ‘Invisible Women: Exposing Data Bias in a World Designed for Men’, examining how frequently women are omitted from the data that governs our everyday lives, from policy right down to basic decisions like snow clearing.
Snow clearing? It sounds far-fetched, admittedly, but the order in which cities clear snow from transport routes – main roads first – is based on what is considered the average journey to work. What may seem like common sense is actually modelled on the typical male commute, and it doesn’t take into account that women are more likely to walk children to school or take public transport rather than drive.
Where the journey began
It’s data gaps like this, though not always intentional, that are highlighted in Invisible Women.
But Caroline herself admits she wasn’t always a feminist – it wasn’t until she studied feminist literature and economics at university that she realised her own ideals were set around men and the male body being the universal norm. That’s where her journey into data gaps began.
“That primed me to start noticing where the default male appeared – where people were using man to mean human and where that was causing problems,” she tells Digital Health News.
“The straw that broke the camel’s back was discovering the gender data gap in medicine, specifically discovering a paper about how women are so much more likely to be misdiagnosed if they have a heart attack and part of the reason for that is that they don’t necessarily experience the classical symptoms.
“It’s one thing for me as a non-medic not to know, but for medics to be missing it as well was completely shocking.”
Designing things blind
One study from the University of Leeds found women were 50% more likely than men to be misdiagnosed when having a heart attack, largely because research has centred on the symptoms men experience rather than those women do.
It’s research like this that Caroline will be discussing at Rewired – research that, if based on sex-disaggregated data, has the potential to save lives. Instead, the gender data gap is leaving medicines, studies and health information biased.
“It’s fundamental and non-negotiable, as far as I am concerned, because the information we have determines the way we allocate resources… the way we design anything from government policy to transport to technology to medicine, to anything you can think of,” Caroline said.
“We don’t just design things blind, we design them based on whatever information we have.
“Women are not men and there are all sorts of ways that the female body reacts differently to the male body.”
Differences between men and women
But, somewhat ironically, it’s the fact that the female body reacts differently that often deters medical researchers from conducting studies on women – a problem Caroline labels “unethical”.
“There are two excuses that I get from medical researchers about why they don’t include women in their research. The most common one is that women are just too complicated and too hormonal,” Caroline told Digital Health News.
“I’ve been told by a researcher ‘well we can’t include females because the menstrual cycle will interfere with the results’, which I just find astounding. That’s basically saying ‘reality is too complicated for me to engage with’, but reality will exist whether you engage with it or not.
“The menstrual cycle doesn’t cease to exist because you don’t test it, it just means that you won’t know what the effects are before we use whatever it is you are testing.”
But this thinking has become “normalised”, Caroline explained, because data on women is still seen as an optional extra rather than part of the universal picture. It has become routine to include only men in research.
“The other thing that people often say is ‘well we will start off in men and if we find anything interesting we will add women at a later date’, which is also a very good example of how we think of the male body as a neutral universal body and the female body is the complicating factor,” she added.
“The thing that is frustrating about that is there are things that will work for women that won’t work for men. So if you start off just in men, and only proceed in women if you find anything interesting, then there are things you will rule out at an early stage that would have worked for women.
“We need to be starting off our testing at the cell level, of male and female cells, and not just using male cells as if they are the universal template.”
AI bias?
These biases don’t just exist in the world of medicine; they extend to technology as well.
Artificial intelligence (AI), for example, relies on rich data to inform its algorithms, but if that data doesn’t wholly represent the population then dangerous biases can occur – particularly if AI is being used to assess the risk of a disease.
One University of Washington study examined an image dataset in which pictures of cooking were 33% more likely to contain women than men. By the time an algorithm had finished training on it, that disparity had grown to 68% – the algorithm was mistaking men for women purely because they were standing next to a stove.
It’s a concept Caroline admits “terrifies” her, adding: “We know that algorithms don’t just reflect our biases back at us, they emphasise them significantly.
“Diversity of all types is just incredibly important because diversity will win you a diverse range of experience and knowledge that you just won’t have if you have a homogeneous group.
“A homogeneous group will have gaps in their knowledge and that goes for ethnicity as well as gender as well as disability and all these different types of diversity.”
How you can help
Everyone has a part to play in abolishing data gaps, Caroline explained, but the biggest changes need to come from those with the most power.
“I think this has to come from the top, there has to be regulation which has to involve the government,” she said.
“But I do think organisations can set their own standards. One of the things that has been really encouraging to see is certain journals saying they won’t accept papers that don’t have sex-disaggregated data – that is such a fantastic development.
“More generally I think it’s just a case of going out and spreading the word about the default male as I have. A lot of this you don’t even realise is happening; most people aren’t aware they are thinking of men as universal.
“Of course, it’s not a neutral body, it’s just one half of the human body. It’s important to highlight that because once you’ve highlighted it, the battle is half won. The resistance to addressing this issue is very much driven by this unconscious bias, which, once you’ve made it conscious and visible, becomes a clear nonsense.”
Caroline Criado-Perez will be speaking at the Digital Health Rewired Conference and Exhibition, taking place at Olympia London on 4 March 2020.
For the full Rewired 2020 programme and speaker lineup, visit digitalhealthrewired.com