Gmail’s Smart Compose will no longer be able to suggest gender-based pronouns like “him” and “her” in emails.
Google reportedly made the change out of concern that Smart Compose would guess a person's gender incorrectly. The search giant introduced the limitation after a research scientist at the company discovered the issue back in January.
The researcher was composing an email about meeting an investor when Smart Compose suggested a follow-up question of “Do you want to meet him?” The question misgendered the investor.
Paul Lambert, a Gmail product manager, told Reuters that his team tried several ways to fix the problem, but none of the solutions proved reliable.
Ultimately, the team decided the best solution was to remove those types of suggestions entirely. Google says the change affects fewer than one percent of Smart Compose predictions.
Lambert told Reuters that it pays to be cautious with cases like this as gender is a “big, big thing” to get wrong.
While it might not seem like a big deal to some, the issue highlights an underlying problem with machine learning: models often reflect and reinforce the societal biases present in their training data.
Smart Compose, like many machine learning and AI systems, learns by studying past data. Both it and its sibling, Smart Reply, train on old emails.

In the researcher's case, Smart Compose had learned from that data that investors were more often male than female, and wrongly predicted that this investor was male as well.
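To see how that kind of skew arises, here's a minimal sketch (in Python, and in no way Google's actual code): a tiny next-word predictor trained on a handful of made-up email sentences. Because the toy corpus mentions male investors more often, the model's top suggestion after "meet" is "him", for the same reason Smart Compose made that prediction at vastly larger scale.

```python
from collections import Counter, defaultdict

# Toy illustration only: a bigram model trained on a small, deliberately
# skewed "email corpus". All sentences here are invented for the example.
corpus = [
    "the investor said he would call so do you want to meet him",
    "our investor confirmed he is in town do you want to meet him",
    "the investor liked the pitch so shall we meet him tomorrow",
    "the investor replied she is interested do you want to meet her",
]

# Count how often each word follows each other word in the training data.
follow_counts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        follow_counts[prev][nxt] += 1

def suggest_next(word: str) -> str:
    """Return the most frequent continuation seen in training."""
    return follow_counts[word].most_common(1)[0][0]

# The model simply reproduces the corpus skew: 3 of the 4 training
# sentences follow "meet" with "him", so that's the suggestion.
print(suggest_next("meet"))  # -> "him"
```

The model isn't malicious; it has no concept of gender at all. It just ranks whatever continuation appeared most often in its training data, which is exactly how historical imbalances in a corpus become confident-looking predictions.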
This isn’t the first time an issue like this has cropped up. Recently, Amazon had to shut down an internal machine learning recruiting tool because it was biased against female candidates.
Ultimately, these cases show that machine learning tools can carry significant bias flaws inherited from the data they learn from.