
Judy Wajcman - Gender bias in data and technology
Judy Wajcman, Anthony Giddens Professor of Sociology at the London School of Economics, talks to us about tackling bias in data and technology.
About Judy Wajcman
"I am the Anthony Giddens Professor of Sociology at the London School of Economics and a Fellow at the Alan Turing Institute.
My work has broadly been within the area of the sociology of technology, work and employment, and I’ve been particularly interested in how gender relations are embedded in these fields."
Programmed to think that tech is a male preserve
I’ve been working on gender and technology for many, many decades. I was originally interested in why the workforce is very gendered, why some kinds of work are seen as appropriate for women and some kinds of work are seen as appropriate for men. When I started doing this work, it was very clear that women were trained much more to be nurses and teachers, and men were trained to do technical work, engineering work, industrial work. This notion of masculinity being associated with scientific, technical, industrial work had a long history. This was transposed in the computer era so that computer programming, quite early on, became a male preserve.
By doing some historical work, I realised that this hadn’t always been the case for the early computers – those huge machines that took up a whole room, the ones you see in photographs from just after the Second World War. There were lots of women programmers then, but they had somehow been hidden from history, because the contemporary notion was that physics, maths and computing were very much a male preserve.
I remember reading Sherry Turkle’s early work on MIT hackers, as she called them – a culture of engineers who were guys who worked all night and ordered pizza at 3am and who were very hooked on their machines. In more recent years, I’ve been interested in how it is that if you look at conferences of roboticists, people who work in artificial intelligence, and now machine learning, we find that these areas are very male-dominated.
Key Points
• In the era of early computers, just after World War Two, lots of programmers were women, but they were hidden from history.
• The data that is going into a lot of the latest technologies is embedded with all kinds of bias – such as the algorithms used in recruitment.
• Facial recognition technologies identify white male faces much more easily than dark-skinned female faces, because they’re trained predominantly on white faces. If the training data is biased, then the outcomes will be biased.
• The notion that all social problems have a technological solution is the central idea of the Silicon Valley companies. It’s very important that we resist that kind of technological determinism.
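The point about biased training data producing biased outcomes can be sketched numerically. The simulation below is purely hypothetical: the score distributions, group labels, and the 900-vs-100 imbalance are made-up assumptions chosen to illustrate the mechanism, not measurements from any real facial recognition system. A single decision threshold is fitted to data dominated by group A; at test time, the under-represented group B suffers a much higher false-rejection rate even though nothing in the classifier refers to group membership.

```python
import random
import statistics

random.seed(42)

def scores(n, mean):
    """Draw n hypothetical match scores (higher = more likely 'same person')."""
    return [random.gauss(mean, 1.0) for _ in range(n)]

# Hypothetical training set, dominated by group A (mimicking a face
# dataset collected mostly from one demographic group).
train_genuine = scores(900, 3.0)    # group A, genuine "same person" pairs
train_impostor = scores(900, 0.0)   # group A, impostor pairs

# Fit one threshold midway between the training means, i.e. tuned
# almost entirely to group A's score distributions.
threshold = (statistics.mean(train_genuine) + statistics.mean(train_impostor)) / 2

def false_reject_rate(genuine_scores, thr):
    # Fraction of genuine matches wrongly rejected (score below threshold).
    return sum(s < thr for s in genuine_scores) / len(genuine_scores)

# At test time, group B's genuine-match scores sit lower (e.g. the
# feature extractor works less well on faces it rarely saw in training).
test_a = scores(5000, 3.0)   # group A genuine pairs
test_b = scores(5000, 2.0)   # group B genuine pairs

frr_a = false_reject_rate(test_a, threshold)
frr_b = false_reject_rate(test_b, threshold)
print(f"threshold fitted on group-A data: {threshold:.2f}")
print(f"false-reject rate, group A: {frr_a:.1%}")
print(f"false-reject rate, group B: {frr_b:.1%}")
```

Because the threshold is optimised for group A's distributions, group B's error rate comes out several times higher – the "biased data in, biased outcomes out" pattern the key points describe, without any explicit reference to group membership in the model itself.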