Autocorrect mistakes can be hilarious. There are sites dedicated solely to laughing at other people's texting mishaps. It's a humorous reminder of the old phrase, "Do you want something done right, or do you want it done fast?"
Google's "Smart Compose" feature in Gmail uses AI to auto-complete a user's sentences for quicker emailing. The time-saving tool ran into problems when the AI began assuming gender was associated with certain words.
In January, one of Google's research scientists discovered the problem when he typed "I am meeting the investor next week." The auto-fill feature suggested "Do you want to meet him?" as a follow-up question, according to Reuters.
Nothing about the word "investor" suggests it's a male-only occupation.
Paul Lambert, a Gmail product manager, told Reuters that gender is "a big, big thing" to get wrong.
The company set out to solve the problem, but none of the proposed solutions fixed the AI's gender bias. Instead, Google decided to limit the feature's coverage: the pronoun ban affects less than one percent of the cases where Smart Compose proposes a suggestion.
The decision to limit the feature likely stems from Google's past experience with questionable predictive tools. The company banned expletives and racial slurs from its predictive technologies after the search engine surfaced anti-Semitic rhetoric in 2015 and racist images in 2016.
Of course, AI learns by following patterns. If the information available to the program is disproportionately mentioning men and associating them with certain roles, the AI will emulate this.
An extreme example is Microsoft's chatbot, Tay. The company launched the AI on Twitter in 2016 and deleted it after barely 24 hours. Tay assimilated the internet's darkest qualities and started spouting racist, misogynistic, and anti-Semitic rhetoric.
Assuming a person's gender can lead to trouble. Even when a word carries no gender correlation, Google could mistakenly assign a gender to a non-binary or trans individual based on a traditionally male or female name.
As technology advances, developers and tech companies are learning that they have to raise their awareness around issues of unconscious bias.
- Gmail's Smart Compose AI began associating gender with certain words, perpetuating stereotypes and biases.
- If the information available to the AI is disproportionately mentioning men and associating them with certain roles, the algorithm simply emulates it.