Why collecting data on gender balance is important
13 October 2015
There are at least three things that led to me getting my first faculty job a year ago: hard work, luck, and bias. It almost goes without saying that getting an academic job in today’s climate requires healthy doses of hard work and luck. The existence and persistence of the third factor is more troubling.
Bias probably helped me get a job because I’m a man, and life as a scientist is easier when you’re a man. People write more impressive-sounding letters of recommendation for men. Science faculty tend to rank CVs with men’s names as more competent and hireable than identical CVs with women’s names (NB: this study suggests otherwise, but it has been widely questioned). Men get more startup funding, have higher grant success rates, and get paid more than women. The list goes on and on. Of course these are not all independent processes, but they all point to the same underlying issue: we have a gender bias problem in science. To make matters worse, academia amplifies the bias because future success is largely determined by past success. The depressing upshot is that most science professors are male (that link has a great interactive graphic). Things are improving slowly in most fields, but the evidence suggests that gender bias is still holding us back. There are many simple things that we can all do to help here. But I want to focus on data.
Collecting data on gender balance can be very simple. We’re scientists; collecting data is what we do. And data on gender is everywhere. It’s on your department’s faculty website. It’s in your conference abstract book, your department seminar series website, your granting agency’s report, and so on. So, next time you’re in a conference talk that isn’t as gripping as you’d hoped, why not flick through the abstract book and tally up the number of plenaries given by men and women? Then tweet the data with the hashtag #scigenderdata. Or blog about it, mention it to someone in the bar, write a paper, whatever you like. It doesn’t have to be a big dataset, or a dataset that seems important. And you don’t have to make judgements about the data. You can just report the raw data and leave it for others to make judgements.
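If you’d rather tally in code than on the back of an abstract book, here’s a minimal sketch of the kind of counting involved (Python, with made-up placeholder names and labels; the post itself doesn’t prescribe any particular tool):

```python
# Minimal sketch: tally plenary speakers by gender from a hand-transcribed list.
# The names and labels below are hypothetical placeholders.
from collections import Counter

plenaries = [
    ("A. Example", "man"),
    ("B. Example", "woman"),
    ("C. Example", "man"),
    ("D. Example", "man"),
]

counts = Counter(gender for _, gender in plenaries)
total = sum(counts.values())

for gender, n in sorted(counts.items()):
    print(f"{gender}: {n}/{total} ({100 * n / total:.0f}%)")
```

That really is all the analysis needed: the raw counts are the data point worth sharing.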
Gathering this kind of data does at least two useful things: it can open people’s eyes to the problem, and it can help effect change. For example, at a recent regional conference, a few of us independently collected data on the gender balance of the speakers. We discussed it on Twitter, then wrote a paper about it. That experience changed how I (and hopefully others) advise students, and changed how that conference is run (they now select abstracts blind to the gender of the authors). Since then, I’ve twice been able to significantly improve the gender balance at conferences simply by showing the organisers how badly their program was shaping up. Neither set of organisers had intended to have plenaries delivered almost exclusively by men, but both had ended up that way. Both were open to fixing the problems, and both did (in one case I did a lot of work to help out). In all of these cases, I think Twitter has helped. Once the raw data is out there, it’s hard for people to make excuses. But this post isn’t about calling people out (though go ahead if you like!); it’s just about getting the data out.
So, see what data you can collect, and see where it takes you. And report the data no matter what it says: it would be great to hear good news as well as bad news.
Here are a few suggestions for categories of data that are often simple to collect. In most cases, we need only tally up the representation of different genders in a given category. Feel free to make additional suggestions in the comments. And if you collect some of this data, tweet it with the hashtag #scigenderdata, and let’s see what we find out.
- Conference plenary/invited speakers (check out #YAMMM for some shocking examples)
- Conference regular speakers
- Conference poster presenters
- Departmental seminar series speakers
- Faculty in your department
- Professional and technical staff in your department
- Grant awardees for any scheme/year
As long as there’s a problem with gender in science, collecting the data is one small way to help out.