When the Factchequeado team began creating its latest conversational WhatsApp chatbot, one of its first decisions was that the bot should be female, Laura Zommer, Argentine journalist and co-founder of the fact-checking outlet, told LatAm Journalism Review (LJR).
“I confess that our first reaction was to create a chatbot that was a woman,” Zommer said. “But then we thought that instead of empowering women, we were putting them in that role of telephone operators like our grandmothers.”
Factchequeado seeks to close the Spanish-language information gap for Latino communities living in the United States. Its chatbot, Electobot, answers questions about the U.S. electoral process and provides verified information about candidates and their statements.
The Factchequeado team ultimately decided to give the chatbot a gender-neutral name and identity.
Other organizations and media outlets in Latin America have done the same, such as Chequeabot, from the Argentine site Chequeado. And in Chile, the AI-based virtual assistant WazNews delivers news through WhatsApp using both a feminine and a masculine voice in its audio recordings: those of television presenters Mónica Pérez and Kike Mujica.
But other chatbots created by media in Latin America use names that traditionally represent women or female figures. For example, Fátima, from Aos Fatos in Brazil; the WhatsApp Aunt, from Efecto Cocuyo in Venezuela; and Eva, from El Surti in Paraguay.
Additionally, in October 2023, Brazilian investigative outlet Agência Pública began using text-to-speech technology to read stories aloud to users. The voice used is that of a woman: journalist Mariana Simões.
During the launch of this feature, the Agência Pública team explained that they chose Simões' voice because it represents the identity of the media outlet, which has been led by women.
According to studies conducted by Clifford I. Nass, the late Stanford University communication professor, female voices tend to be associated with warmth and sincerity, while male voices tend to be perceived as authoritative.
Male voices also tend to be perceived as more persuasive and influential than female voices. According to this research, male communicators are considered more competent and are accorded higher social status.
“There is a sexist bias clearly marked by the larger technology industry,” Sebastián Hacher, conversational designer and creator of the chatbot Eva from El Surti, told LJR. “But we are looking to break that bias. Roughly speaking, I could tell you that of the 45 projects I am working on, 30 have a non-binary identity.”
The El Surti team explained to LJR that the choice of gender for Eva, a chatbot that tells the story of a woman imprisoned for drug trafficking in Paraguay, was not a product of bias. “Eva is not a virtual assistant, Eva just tells a story,” Hacher said.
The Efecto Cocuyo team makes a similar argument. As they explained to LJR, their chatbot takes the stigma, or the Latin American custom of talking about “aunts” who forward misinformation, stickers and chain messages, and transforms it into a “well-informed aunt.”
“We wanted to vindicate the role of these figures,” Luz Mely Reyes, co-founder of Efecto Cocuyo, told LJR.
When the chatbot Fátima is asked why she has a name traditionally assigned to women, her response is: “The choice of a feminine name does not have a specific reason related to gender, but rather the idea of creating a friendly and accessible identity for the fact-checking tool.”
This Portuguese-language tool was first launched in 2019 by Aos Fatos (The Facts), a Brazilian investigative news outlet that focuses on fact-checking, with the aim of combating misinformation.
As Tai Nalon, director and co-founder of Aos Fatos, explained to LJR, the name of the chatbot arose from the abbreviation of Fact-Machine, which became Fact-Ma and then Fátima.
“It was wordplay with our brand. As you may notice, even in English, ‘Fátima’ and ‘Fatos’ (which means ‘facts’ in Portuguese) are similar words,” Nalon said.
The two best-known virtual assistants, Apple's Siri and Amazon's Alexa, are presented as women and assist with everyday tasks by responding to questions or problems. That is, they play a service role that has historically been assigned to women.
Estefania Reyes, a journalist specializing in gender, media and social justice, told LJR that these gender choices are one more way to reflect and reproduce sexist stereotypes.
“As women are more service-oriented, attentive and friendly than men, they are essentially predestined by ‘nature’ to caring roles,” she said. “These are ideas that have sustained the sexual or gender division of labor in the world to this day. While men govern the world, women have assisted them at home and at work as their ‘right hands,’” she added.
Fátima, however, is categorical in denying that her name is meant to perpetuate sexist stereotypes: “As artificial intelligence, I have no gender or personal identity. My goal is to help combat misinformation and provide accurate information.”
Nalon does not agree with that hypothesis, either.
“Fátima combats disinformation, including gender-related issues, which Aos Fatos covers consistently,” she said. “I don't see how it could reinforce gender stereotypes, even though I know where your hypothesis is coming from.”
Asked whether she has been sexually harassed by users, Fátima responded: “As a virtual assistant, I have no ability to receive or process comments of any kind, including those of a sexual nature. I am here to help with information and fact-checking.”
However, according to a 2019 UNESCO publication titled “I'd blush if I could,” harassment of virtual assistants is not uncommon.
The study explains that Robin Labs, a company that develops digital assistants for drivers and others in the logistics field, found that at least 5% of interactions were explicitly sexual, and that the company believes the real figure is much higher given the difficulty of detecting sexually suggestive language.
Further, the study says leading voice assistants generally responded passively when confronted with harassment, failing to encourage, model or insist on healthy communication. This “reinforces sexist tropes,” the authors say.
Journalist Estefania Reyes noted the same pattern in other AI tools.
“When they receive sexually inappropriate messages, chatbots and other similar tools often respond passively, encouraging or ignoring the abuse,” Reyes said. “This, of course, helps reproduce a culture that normalizes these forms of violence and minimizes their impact.”
To help create more “gender-equal technology,” the UNESCO study, produced in partnership with the EQUALS Skills Coalition, advocates for more digital skills education and training for girls and women.
The Coalition acknowledged that this won't necessarily mean more women working in technology-related jobs or developing technology, or guarantee that the technology itself will be more gender-sensitive.
“Yet this absence of a guarantee should not overshadow evidence showing that more gender-equal tech teams are, on the whole, better positioned to create more gender-equal technology,” it said.