On October 10th, I attended a seminar at the FDA titled “Gender Bias in Artificial Intelligence and Other Biomedical Innovations: How to Fix Knowledge.” The speaker, Londa Schiebinger, traveled from Stanford University in California to present, and the room was full of attendees, with many more watching remotely.
Dr. Schiebinger’s seminar was compelling and gave me a lot to think about. She opened by distinguishing sex from gender: sex is determined by the physical attributes of the body, whereas “gender” encompasses a slew of societal norms and stereotypes.

From there, Schiebinger discussed the underrepresentation of women in research and its effects. For instance, she explained that because women have limited participation in studies testing the safety of drugs, there can be dangerous consequences. Drugs can affect men and women differently, and when they are tested only on men, it is unknown whether they will affect women the same way. In many cases, they don’t. To illustrate, Schiebinger shared an eye-opening fact: between 1997 and 2000, ten drugs were withdrawn from the U.S. market because of life-threatening health effects, and eight of those showed greater severity in women.

Dr. Schiebinger then turned to digital voice assistants like Apple’s Siri and Amazon’s Alexa and how they reinforce gender bias. According to Schiebinger, female voice assistants are programmed to be more passive and compliant than male voice assistants. Though there have been complaints about this, developers have been slow to make changes.
As a young professional entering the workforce, I want to be conscious of gender bias and the many consequences that come with it. The more I know about it, the better equipped I will be to help end it, especially as someone entering the male-dominated field of engineering.