Media outlets and content creators are turning to "algospeak," the alteration of words or the use of euphemisms, to maintain visibility and evade algorithmic restrictions.
When the news portal En Blanco y Negro, based in the city of Chihuahua in northern Mexico, set out to increase its social media audience, it ran up against a problem: major digital platforms frequently restricted the visibility of topics such as crime and insecurity.
The media outlet's team, led by its then-news director Roberto Álvarez, noticed that posts containing words such as "executed," "drugs" or "narco" received less traffic, or were even flagged for alleged violations of content guidelines.
It was then that they began to adopt a strategy that influencers and content creators use to outsmart social media algorithms: inventing or altering words.

When the digital news outlet En Blanco y Negro uses altered words in headlines, traffic starts to climb. (Photo: En Blanco y Negro on Facebook)
"'Ejecutar' is the word we use most in Mexico to say that someone was shot to death," Álvarez told LatAm Journalism Review (LJR). "What could we do to avoid putting ourselves at risk? We started using the word 'desvivir,' which doesn't exist; it's not in the dictionary."
The English equivalent of "desvivir" is "unalived."
The team also began changing the spelling of words to an equivalent formed by letters, numbers and symbols. Thus, "ejecutar" became "3jecut4r," "drogas" became "dr0g4s" and "narco" became "n@rcø."
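The substitutions described above amount to a simple character-mapping scheme. The sketch below is purely illustrative (the outlets apply these changes by hand, and the mapping and function name here are assumptions based on the examples given, not any tool the newsroom uses):

```python
# Illustrative "algospeak" spelling alteration: swap letters in
# sensitive words for look-alike digits, as in "dr0g4s" for "drogas".
# The mapping below is a hypothetical example for demonstration only.
SUBSTITUTIONS = {"e": "3", "a": "4", "o": "0"}

def to_algospeak(word: str) -> str:
    """Replace every mapped letter with its look-alike character."""
    return "".join(SUBSTITUTIONS.get(ch, ch) for ch in word.lower())

print(to_algospeak("drogas"))  # dr0g4s
```

Note that this blanket mapping replaces every occurrence of a letter, whereas the journalists alter words selectively (for example, "3jecut4r" keeps one "e" intact), precisely so that human readers can still recognize the word while automated filters do not.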
"When we start monitoring traffic in real time and see that it's very low, we immediately change the keywords, or come up with new ones, using slang more specific to this region of Chihuahua; almost instantly, that same post sees its traffic start to climb," Álvarez said.
The case from En Blanco y Negro illustrates a social media practice that is becoming increasingly widespread among media outlets: altering language to evade algorithmic content moderation. Known as "algospeak," this strategy seeks to maintain content visibility and avoid sanctions, although it can also alienate readers and compromise the clarity of the message.
"Algospeak," derived from the terms "algorithm" and "speak," consists of modifying, substituting or disguising words considered sensitive through euphemisms, spelling alterations or symbols.
Its use in the news is not an isolated phenomenon. Research by the Autonomous University of Chihuahua (UACH, for its Spanish initials), published in September 2025 in the academic journal Doxa, documented how at least three digital media outlets in that city are adapting their language on Facebook to evade algorithmic moderation.
Based on an analysis of 312 headlines, the study, titled "When the Dead Become 'Unalived': Journalistic Algospeak as a Response to Digital Censorship," found that journalists systematically resort to orthographic alterations, euphemisms and, to a lesser extent, symbols or metaphors in order to publish news content, primarily concerning violence, deaths and security, without facing penalties from digital platforms.
"This is a novel presentation of a challenge that journalists have always faced: evading censorship," Mario Alberto Valdez, a journalist, UACH professor and one of the study's authors, told LJR. "Now, we are viewing it through a technological lens."
The research found that more than 90 percent of publications resorted to the orthographic alteration of sensitive words, followed by the use of lexical euphemisms, that is, words or expressions used to substitute others considered offensive. The majority of headlines with altered words were concentrated in publications regarding deaths, security and drug trafficking, with a lesser presence in content concerning drugs, sexuality and human abuse.
The fact that crime-related topics are the ones most heavily penalized by social media platforms poses a challenge for media outlets in states like Chihuahua, where this type of news is among the most consumed by the audience, Valdez said.
However, topics related to violence are not the only ones restricted by algorithms. Profiles that address gender issues must also resort to "algospeak" to ensure their content reaches their audience, according to separate research published in March 2026 by the Mexican media outlet La Cadera de Eva and the organization Article 19.
Sofía Márquez, a content creator and founder of We R Women On Fire, a Tijuana-based platform focused on feminism and gender-based violence, said that in many of her Instagram posts she has to alter certain terms.
"The penalties range from completely removing content and revoking the use of certain tools (on one occasion, they blocked my ability to go live for six months) to the risk of having your account deleted entirely," Márquez told LJR.
In her posts, which cover cases of violence against women and feminist movements, terms such as "v1olador," "s3xual" and "f3minicid4" are used, both in text and in images.
"It is striking how informative, reflective and conscious content is consistently censored, while other content that replicates violence is disseminated with greater force within the patriarchal algorithm we see on social media," Márquez said.
Between 2023 and 2025, when the migration crisis in Mexico saw several peaks, Meta's platforms frequently penalized En Blanco y Negro's content related to migration, Álvarez said.

We R Women On Fire, a platform covering feminist and gender issues, alters certain terms in its content to avoid penalties from social media platforms. (Photo: We R Women On Fire on Instagram and Canva)
"If you used the word 'immigrant' or 'undocumented' on Facebook, they could even hit you with a one-day ban on your account," Álvarez said. "The notification would say, 'You may be violating our policies regarding violence or discrimination.' And we would think, 'Of course not! I'm reporting the news!'"
It was then that the newsroom found an alternative in euphemisms. To avoid saying "immigrants," they opted to use "people in a situation of mobility."
At times, the euphemisms they chose bordered on sensationalism, Álvarez said. On one occasion, the outlet reported on the sexual abuse of minors in a marginalized region of Chihuahua.
They decided to use a euphemism: "Predator unleashes on 500 children in Punta Oriente." Although some readers expressed outrage, Álvarez said, the media outlet chose to proceed in this manner to ensure the story would not be rendered invisible.
"Curiously enough, by being more sensationalist, Facebook actually did drive traffic to us," Álvarez said. "It has to be that way, because we aren't going to do any authority the favor of not reporting."
Meta says on its website that its systems reduce the exposure of content that violates its Community Standards, "in order to minimize possible harm to our community." This includes content related to graphic violence, hate speech, suicide, self-harm and fraud.
The company said that it permits this sensitive content if it is "of journalistic interest," but first subjects it to a human rights-based analysis.
However, if a media outlet or journalist disagrees with a decision made by the platform, there is not much that can be done, Álvarez said.
"To get a problem resolved, you can submit an appeal, but everything is highly automated," he said. "It's almost impossible to get in touch with anyone."
While "algospeak" helps disseminate information of public interest, it is also true that journalistic clarity may be compromised, according to research by the UACH.
"Young people immediately understand the rationale behind these strategies because they live it day in and day out," Valdez said. "But we encountered profiles of older people who did not understand this way of handling information."
Márquez said that, at times, followers of We R Women On Fire perceive the alteration of words as a lack of seriousness or as a way of softening the content.
"I always like to share the facts as they are, without hiding or sugarcoating anything, and when I censor words or images, there are people in the audience who question why I do it," she said. "They assume it is because I don't want to show the full reality."
When readers of En Blanco y Negro expressed their discontent regarding the alteration of words and the use of euphemisms, the media outlet explained its reasoning in an article written by Álvarez in August 2025, clarifying that "algospeak" was a necessary strategy.
The article includes a glossary of 30 terms that the media outlet routinely modifies in areas including organized crime, deaths and sexual offenses.
"In a nutshell," he said, "it's not ignorance; it's digital survival."

En Blanco y Negro shared an image explaining why it uses "algospeak." (Photo: Screenshot of enblancoynegro.com.mx and Canva)
This article was translated with AI assistance and reviewed by Teresa Mioli