I’ve just been in the SciPy diversity lunch overflow session. Basically, all the people who registered too late to get a ticket to the official SciPy diversity lunch got together in a room and had a good chat about why we care about diversity in coding and science, and what we can possibly do about it. Sitting there, I was acutely aware of both my diversity and my intense privilege. As I’ve experienced at numerous computer science conferences, I walk into a room in my flowery skirt and I can feel the eyes on me. Sometimes I’m the only woman in the room; sometimes there’s a handful. Either way, I’m a novelty. But I’m used to this, and I can use our shared cultural familiarity to my advantage. I’m different, but I’m really not that different. I’m also in a room of predominantly white, educated, relatively affluent people.
I have heard some very senior scientists claim to promote diversity while insisting they are blind to bias. Statistically speaking, you are probably biased, likely across multiple attributes. I know I am. I found it both confronting and a relief to realise that being bias-free is not an option, so I can only work towards overcoming my biases. You can test your implicit bias across a number of categories.
So now that I accept that I have a problem, what can I do about it? These suggestions for increasing diversity in hiring are directly pilfered from our lunch conversation at SciPy. Some are mine, most are not.
- Advertise jobs openly and transparently. We all know that most jobs are not advertised, and are allocated based on networks. Our networks are going to be mostly people like us, and therefore less diverse.
- Seek out diverse networks to share job ads.
- Remove gender/race identifiers from CVs before they are considered.
- Put wording in the job description that makes it clear you’re looking for diversity. For example you could explicitly be open to people who are transitioning fields or mention that you can support visa applications.
- Make job titles more flexible and/or less intimidating. For example, some people might be more likely to apply for an “analyst” position than an “engineer” position, even when the required skills are the same.
- If applications pass through a recruitment organisation, an HR department, or some algorithm, consider that this may be introducing an additional layer of bias.
There were lots of other great ideas (thanks to Miquela Stein for sharing your notes), but this is what I took away from the gathering. Some of these issues may be more or less relevant or apparent in academia, but there’s still a lot we can learn from each other.