The first three industrial revolutions that transformed our modern society — mechanization through steam power, mass production with the advent of electricity, and automation through digital technology — profoundly changed our world. It has been suggested [1] that we are now at the cusp of a fourth industrial revolution (4IR), driven by rapid advances in artificial intelligence and machine learning.
AI is now being deployed in complex activities that have until now been regarded as the exclusive domain of the human mind. As has been noted, “the models and systems we create and train are a reflection of ourselves” [2], so it is critical that, as we embark on the 4IR, fair and impartial machine systems are built with a keen eye on inclusivity, for a positive impact on how society interacts and develops. To achieve this fair, impartial, and inclusive balance, women and men should be equally represented in the scientific design and technological development of AI and machine learning. Yet even as AI permeates all aspects of our lives and reshapes our society, the representation of women researchers in machine learning remains very small [3, 5].
Data on machine learning researchers is generally very limited, because most technology companies keep such information guarded, citing proprietary concerns. The most current available data was collated [4a, 4b] on global AI talent at machine learning conferences that bring together researchers from around the world. It found that only 12 percent of machine learning researchers are women [4b].
This lack of gender diversity has prompted convenings such as Women in Machine Learning (WiML) [6], which focus on advancing the careers of, and supporting, women in machine learning. These convenings run concurrently with sought-after global machine learning conferences such as the Conference on Neural Information Processing Systems (NeurIPS, formerly NIPS), focusing attention on this critical gap shaping the 4IR.
Not only is the representation of women in machine learning limited; there is also cause for concern about how to retain women and advance their careers on equal terms with their male counterparts in the industry. Research shows that women in science, engineering, and technology are 45% more likely than their male peers to leave the industry entirely within their first year [7], for reasons ranging from feeling unwelcome to outright hostility.
Some [8] have suggested that occurrences such as the manifesto-style memo circulated by a Google software engineer in 2017 underpin such hostility toward women in the industry. The memo suggested that women are less suited to certain roles in technology and leadership because they have a lower tolerance for stress, and that the distribution of preferences and abilities of men and women differs in part due to biological causes, which may explain the unequal representation of women in technology and leadership. It concluded that women’s “stronger interest in people” and “neuroticism” might make them less naturally suited to being coders at Google. Google fired the engineer [8].
This perception is not new. John Gray’s wildly popular book, “Men Are from Mars, Women Are from Venus,” postulates that, metaphorically speaking, men and women live on different planets [9a, 9b]. Eliot (2010), a neuroscientist and author of “Pink Brain, Blue Brain,” has discredited many of the vague perceptions about inherent differences between women and men. Based on scientific research, Eliot asserts that personality differences are not hardwired at birth; rather, gender conformity is what creates differences in adults’ brains [10]. So how can the tide of systematic underrepresentation of women in machine learning research be turned in the 4IR, and how can women be retained and advanced once they are in?
First, give women the same access to information about opportunities in machine learning that men have. It has been found that women are less likely than men to be shown Google ads for higher-paying jobs [11]. How would a woman know to apply for a job she never saw advertised [2]? The underlying reasons have been attributed to the complexity of how search engines show ads to internet users, or to advertisers preferring to show ads to men; either way, these are algorithmic flaws. Such flaws underscore why more women are needed in the design and development of machine learning, to combat exactly these biases.
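One hypothesized mechanism behind the finding in [11] is that a revenue-maximizing ad auction can under-deliver a gender-neutral job ad to women when other advertisers bid more for female audiences. The toy sketch below (a hypothetical simulation, not the actual Google ad system; the bid values are invented for illustration) shows how this skew can arise without any advertiser explicitly excluding women:

```python
# Hypothetical sketch: a revenue-maximizing ad auction can under-deliver
# a gender-neutral job ad if competing advertisers outbid it for one group.

def winning_ad(bids):
    """Return the name of the ad with the highest bid for this impression."""
    return max(bids, key=bids.get)

# Assumption: the high-paying job ad bids the same amount for everyone.
JOB_AD_BID = 1.0

def competing_bid(gender):
    """Assumption: retail advertisers bid more for female impressions."""
    return 1.5 if gender == "F" else 0.5

def serve(impressions):
    """Count how often the job ad wins the auction, by audience group."""
    shown = {"F": 0, "M": 0}
    for gender in impressions:
        bids = {"job_ad": JOB_AD_BID, "retail_ad": competing_bid(gender)}
        if winning_ad(bids) == "job_ad":
            shown[gender] += 1
    return shown

audience = ["F"] * 100 + ["M"] * 100
print(serve(audience))  # the job ad reaches far fewer women than men
```

Under these assumed bids, the job ad loses every auction for female impressions despite targeting everyone equally, which is one reason such skew is hard to detect without auditing outcomes rather than targeting settings.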
Second, provide avenues of visibility for women in machine learning research. Women in technology are among the most invisible [12a]. In a 2015 Wired interview, then White House CTO Megan Smith noted that there are about 16 million programmers worldwide, of whom up to 3 million are women; her question was, why don’t we ever see, or know about, them [12a]? Women Techmakers at Google seeks to provide such visibility, community, and resources for women in technology [13]. This and similar platforms could be used to provide targeted, even greater visibility, fairness, access, and support for women in machine learning research, highlighting their impact in the 4IR both within and outside the technology industry.
Finally, accountability from the technology community, particularly the male counterparts in machine learning research, and from governments and public institutions, with a commitment to fairness and due process [2], will help ensure that all of society benefits from fair, balanced, and representative AI and machine learning systems in the 4IR.
References
[1] Wall Street Journal. (n.d.). The robots are coming for garment workers. That’s good for the U.S., bad for poor countries. Retrieved February 28, 2020 from https://www.wsj.com/articles/the-robots-are-coming-for-garment-workers-thats-good-for-the-u-s-bad-for-poor-countries-1518797631
[2] Crawford, K. (2016). Artificial intelligence’s white guy problem. New York Times. Retrieved February 28, 2020 from https://www.nytimes.com/2016/06/26/opinion/sunday/artificial-intelligences-white-guy-problem.html
[3] Stanford University. (2016). One Hundred Year Study on AI: 2015–2016. Retrieved February 24, 2020 from https://ai100.stanford.edu/2016-report
[4a] Wired. (n.d.). Retrieved February 28, 2020 from https://www.wired.com/story/artificial-intelligence-researchers-gender-imbalance/
[4b] Element AI. (n.d.). Retrieved February 28, 2020 from https://www.elementai.com/?cutm_campaign=6644368285&cutm_medium=cpc&cutm_source=google&gclid=EAIaIQobChMI-qm5jOz15wIVBK7ICh2xzQwIEAAYASAAEgJJTfD_BwE
[5] Chin, C. (2018). AI is the future, but where are the women? Wired. Retrieved February 28, 2020 from https://www.wired.com/story/artificial-intelligence-researchers-gender-imbalance/
[6] Women in Machine Learning. (n.d.). Retrieved February 28, 2020 from https://wimlworkshop.org
[7] Hewlett, S. A. (2014). What’s holding women back in science and technology industries. Harvard Business Review. Retrieved February 28, 2020 from https://hbr.org/2014/03/whats-holding-women-back-in-science-and-technology-industries
[8] Murphy, M. (2017). That fired Google engineer was totally wrong that ‘de-emphasizing empathy’ leads to better thinking. Forbes.com. Retrieved February 28, 2020 from https://www.forbes.com/sites/markmurphy/2017/08/11/that-fired-google-engineer-was-totally-wrong-that-de-emphasizing-empathy-leads-to-better-thinking/#63090996e57d
[9a] Jain, N. (2012). Men are from Mars, so are women. Forbes.com. Retrieved February 28, 2020 from https://www.forbes.com/sites/singularity/2012/08/13/men-are-from-mars-so-are-women/#4b4711f130f4
[9b] Gray, J. (1992). Men are from Mars, women are from Venus: A practical guide for improving communication and getting what you want in your relationships. New York, NY: HarperCollins.
[10] Eliot, L. (2010). The myth of pink and blue brains. Educational Leadership: A Journal of the Department of Supervision and Curriculum Development, N.E.A., 68, 32–36.
[11] Spice, B. (2015). Questioning the fairness of targeting ads online. Carnegie Mellon University. Retrieved February 28, 2020 from https://www.cmu.edu/news/stories/archives/2015/july/online-ads-research.html
[12a] Wired. (2015). White House CTO Megan Smith on the value of tech diversity [Video]. Retrieved February 28, 2020 from https://www.wired.com/video/watch/white-house-cto-megan-smith-on-the-value-of-tech-diversity
[12b] Bolukbasi, T., Chang, K.-W., Zou, J., Saligrama, V., & Kalai, A. (2016). Man is to computer programmer as woman is to homemaker? Debiasing word embeddings. 30th Conference on Neural Information Processing Systems (NIPS 2016), Barcelona, Spain. Retrieved February 26, 2020 from https://papers.nips.cc/paper/6228-man-is-to-computer-programmer-as-woman-is-to-homemaker-debiasing-word-embeddings.pdf
[13] Women Techmakers. (n.d.). Retrieved February 29, 2020 from https://www.womentechmakers.com
Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of the 1st Conference on Fairness, Accountability and Transparency, PMLR 81, 77–91.
Photo Credit: Analytics India Magazine. https://analyticsindiamag.com/biased-ai-princeton-study-indicates-race-gender-bias-crept-machine-learning-algorithms/

