(Credit: Allan Swart/iStockphoto)

Autocorrect mistakes can be hilarious. There are sites dedicated solely to laughing at other people's texting mishaps. It's a humorous reminder of the old phrase, "Do you want something done right, or do you want it done fast?"

Google's "Smart Compose" feature in Gmail uses AI to auto-complete a user's sentences for quicker emailing. The time-saving tool ran into problems when the AI began assuming gender was associated with certain words.

In January, one of Google's research scientists discovered the problem when he typed "I am meeting the investor next week." The auto-fill feature suggested "Do you want to meet him?" as a follow-up question, according to Reuters.


Nothing about the word "investor" suggests it's a male-only occupation.

Gmail product manager Paul Lambert told Reuters that gender is "a big, big thing" to get wrong.

The company set out to solve the problem, but none of the proposed fixes reliably removed the AI's gender bias. Instead, Google decided to limit the feature, banning gendered pronouns from suggestions altogether. The ban affects fewer than one percent of the cases in which Smart Compose proposes a completion.

The decision to limit the feature may stem from Google's past experience with questionable predictive tools. The company banned expletives and racial slurs from its predictive technologies after its search engine surfaced anti-Semitic suggestions in 2015 and racist images in 2016.

Of course, AI learns by following patterns. If the information available to the program disproportionately mentions men and associates them with certain roles, the AI will emulate this.
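To make that concrete, here is a minimal sketch in Python of how skewed training data turns into skewed suggestions. The toy corpus and the co-occurrence counter are hypothetical stand-ins, not Google's actual model or data:

```python
from collections import Counter, defaultdict

# A deliberately skewed toy corpus (hypothetical, not Google's data):
# sentences mentioning "investor" mostly use "him".
corpus = [
    "meet the investor and ask him",
    "call the investor and tell him",
    "email the investor and thank him",
    "meet the investor and ask her",
    "meet the nurse and ask her",
    "call the nurse and tell her",
    "email the nurse and thank her",
    "call the nurse and tell him",
]

PRONOUNS = {"him", "her"}

# Count how often each pronoun appears in the same sentence as each word.
cooccurrence = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for word in words:
        if word in PRONOUNS:
            continue
        for pronoun in PRONOUNS & set(words):
            cooccurrence[word][pronoun] += 1

def suggest_pronoun(occupation: str) -> str:
    """Return the pronoun most often seen alongside `occupation`."""
    return cooccurrence[occupation].most_common(1)[0][0]

print(suggest_pronoun("investor"))  # -> 'him' (3 of 4 examples)
print(suggest_pronoun("nurse"))     # -> 'her' (3 of 4 examples)
```

The counter has no concept of gender; it simply reports whichever pronoun it saw most often next to each word, which is exactly how a disproportionate corpus becomes a biased suggestion.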

An extreme example is Microsoft's chatbot, Tay. The company launched the AI on Twitter in 2016 and deleted it after barely 24 hours. Tay assimilated the internet's darkest qualities and started spouting racist, misogynistic, and anti-Semitic rhetoric.

Assuming a person's gender can lead to trouble. Even when a word carries no gender association, Google could mistakenly assign a gender to a non-binary or trans individual based on a traditionally male or female name.

As technology advances, developers and tech companies are learning that they need to be more alert to unconscious bias in the systems they build.


Takeaways

  1. Gmail's Smart Compose AI began assuming gender was associated with certain words, perpetuating stereotypes and biases.
  2. If the data available to an AI disproportionately mentions men and associates them with certain roles, the algorithm simply emulates that pattern.


Shelby is an Associate Writer for CNET's Download.com. She served as Editor in Chief for the Louisville Cardinal newspaper at the University of Louisville. She interned as Creative Non-Fiction Editor for Miracle Monocle literary magazine. Her work appears in Glass Mountain Magazine, Bookends Review, Soundings East, and on Louisville.com. Her cat, Puck, is the best cat ever.