Google is driving diversity and inclusion initiatives with its latest artificial intelligence tools.
On Thursday, the company announced that its Google Translate feature now includes feminine and masculine options for words in several languages.
In a company blog post, Google Translate product manager James Kuczmarski wrote:
Over the course of this year, there’s been an effort across Google to promote fairness and reduce bias in machine learning. Our latest development in this effort addresses gender bias by providing feminine and masculine translations for some gender-neutral words on the Google Translate website.
Google Translate learns from hundreds of millions of already-translated examples from the web. Historically, it has provided only one translation for a query, even if the translation could have either a feminine or masculine form. So when the model produced one translation, it inadvertently replicated gender biases that already existed. For example: it would skew masculine for words like “strong” or “doctor,” and feminine for other words, like “nurse” or “beautiful.”
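To make the idea concrete, here is a toy sketch of the behavior described above: returning both gendered forms for an ambiguous source phrase instead of silently picking one. This is purely illustrative; the dictionary, function name, and fallback logic are hypothetical, not Google's actual implementation. The Turkish example "o bir doktor" comes from Google's own announcement, since Turkish "o" is a gender-neutral third-person pronoun.

```python
# Illustrative sketch only: a toy lookup showing the idea of returning both
# gendered forms instead of a single, potentially biased default.
# The dictionary and function are hypothetical, not Google's implementation.

# Turkish "o" is gender-neutral, so "o bir doktor" can mean either
# "she is a doctor" or "he is a doctor".
GENDERED_TRANSLATIONS = {
    ("tr", "en", "o bir doktor"): {
        "feminine": "she is a doctor",
        "masculine": "he is a doctor",
    },
}

def translate(source_lang, target_lang, text):
    """Return both gendered translations when the source is ambiguous,
    rather than silently picking one (the old, bias-replicating behavior)."""
    key = (source_lang, target_lang, text.lower())
    if key in GENDERED_TRANSLATIONS:
        return GENDERED_TRANSLATIONS[key]  # both labeled forms
    return {"default": text}  # placeholder fallback for this sketch

print(translate("tr", "en", "O bir doktor"))
```

The point of the sketch is the interface change: where a single string once came back, the caller now receives every valid gendered form, explicitly labeled, and the choice is surfaced to the user rather than made invisibly by the model.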
The translations currently appear for single-word queries from English into several languages, including Spanish, Italian, Portuguese, French and Turkish. The feature also works for translations from Turkish to English.
Google’s move is one of many aimed at making its technology more diverse and inclusive, and it helps users better relate to the company’s AI tools.
The announcement also brings Google positive press, particularly as more organizations feature diverse images and messages in their marketing campaigns.
Google’s been on a mission to promote fairness in AI and machine learning, as demonstrated recently by its initiative to remove gendered pronouns from Gmail’s Smart Compose feature. The company says it’s also thinking about how to address non-binary gender in translations as well as gender bias in other Google products like search auto-complete.
Gmail’s Smart Compose can save you valuable time when you’re firing off a quick message, but don’t expect it to refer to people as “him” or “her”; Google is playing it safe on that front. Product leaders told Reuters that Google removed gender pronouns from Smart Compose’s phrase suggestions after realizing the AI-guided feature could be biased. When a scientist wrote about meeting an investor in January, for example, Gmail offered the follow-up “do you want to meet him,” not considering the possibility that the investor could be a woman.
In September, Google also launched the Inclusive Images Competition, which asked people to upload photos of their surroundings to Open Images, an image classification resource that skews heavily toward North American and European images because of how its data was collected. Google hopes that adding more diverse images will help AI tools better recognize photos, such as wedding ceremonies, regardless of where in the world they were taken.
Google Translate tools will continue to become more inclusive as well. Kuczmarski wrote:
In the future, we plan to extend gender-specific translations to more languages, launch on other Translate surfaces like our iOS and Android apps, and address gender bias in features like query auto-complete. And we’re already thinking about how to address non-binary gender in translations, though it’s not part of this initial launch.
What do you think of the announcement, PR Daily readers?