Hate speech and social media go hand in hand; that is nothing new. But on X (formerly Twitter), the ground seems more fertile than ever for this type of content.
According to a study by four US universities (UCLA, USC, UC Merced and the University of Oregon), the number of posts containing hate speech has doubled since Elon Musk bought the Silicon Valley giant.
Another survey, carried out by the ADL in December of last year, found that racist posts jumped from 1,282 to 3,876 per day, while homophobic posts rose from 2,506 to 3,964 daily.
These numbers, however, have begun to hurt the company itself. The reason is simple: advertisers do not want their products linked to offensive posts circulating freely on the platform.
Action and reaction
More recently, three major companies, Apple, Disney and Coca-Cola, announced that they would pull their advertising from Elon Musk’s social network. The decision represents a loss of US$75 million (around R$370 million) per year for X.
In response, the billionaire owner of Tesla sued Media Matters, an NGO that has been documenting the placement of major brands’ advertisements alongside neo-Nazi, white-supremacist, far-right and misinformation posts on the platform.
“A central element is the pursuit of profit. These platforms are for-profit, so obviously the way you try to affect their behavior is through the so-called bottom line,” explained Filipe Campante, a Brazilian professor at Johns Hopkins University, to Brasil de Fato.
In a post on X, Elon Musk wrote: “Media Matters is evil.”
The risk for the 2024 US elections
Less than ten months before the 2024 presidential election, another issue has been worrying experts: the rise of misinformation on X.
In July, the ADL revealed that posts with conspiracy theories linked to QAnon, a movement involved in the invasion of the US Capitol in January 2021, had increased by 91%. The group believes, among other things, that Democrats drain the blood of children in underground bases to produce a kind of elixir of youth.
Theories like this are absurd, but they have already had real consequences. In December 2016, an armed North Carolina man stormed a Washington, D.C. pizzeria to free children he believed were being enslaved there. The case became known as Pizzagate.
The effect of this flood of misinformation, however, could be even more serious and widespread. The question is not just what people come to believe, but also what they stop believing.
“With this enormous amount of content, it is very difficult to filter what is reliable and what is not, what you should believe or not… And in this situation, bad content often ends up crowding out good content,” explains Campante.
The Brazilian professor continues: “Without knowing what is good or not, you kind of assume that everything is bad. And if producing good content is more expensive than producing bad content, you will end up prioritizing bad content.”
Setbacks in the fight against disinformation
In 2020, the then-Twitter created a dedicated team to deal with misinformation during the election. Not even Donald Trump escaped having posts flagged with a label warning that they contained false claims. Under Elon Musk’s command, the line between lies and freedom of expression has become blurrier.
In Brazil, the issue is being addressed through the courts, with accounts deleted and spreaders of fake news banned from the networks. The same tactic, however, does not work in the US, where the situation is more complex given the current reading of the First Amendment to the Constitution, which deals with freedom of expression. In both countries, the problem appears far from being resolved.
“It’s like drying ice [a futile effort]: you take down one profile, and another one appears. The problem is inherently complex because it is technological. It’s not like radio and TV, where you have a broadcast concession and the regulator can say that if you don’t do X, Y, Z, it will come and do this and that. Technologically, it is much more difficult to control or moderate content, for better or for worse,” concludes Campante.
Editing: Rodrigo Durão Coelho