The Era of the Algorithm
The internet age has made democracies exploitable. As an act of societal self-defence, it is necessary to strengthen the critical thinking of the young generation.
Read the German version here.
Facts can’t always convince people to change their opinion – especially today. In the past eight years of running the investigative organisation Bellingcat, we’ve investigated several topics that have attracted communities made up of individuals who reject, out of hand, what they consider the mainstream orthodoxy. Take the Syrian regime’s use of sarin and chlorine gas as chemical weapons between 2013 and 2018: these communities claim it is all part of a Western conspiracy to create a justification for military involvement in Syria. On Russia and Ukraine, some communities reject Russia’s responsibility for the downing of Malaysia Airlines Flight 17 in July 2014, seizing instead on shaky evidence of Ukraine’s involvement in the shootdown. More broadly, we’ve seen the rise of conspiracy theory communities like QAnon, and even the rapid expansion of Flat Earth communities.
Rejection of evidence
The response, particularly since the 2016 election of Donald Trump, has been to focus on the attempts of outside actors, above all the Russian Federation and its proxies, to influence Western society through online disinformation campaigns – as if these communities were formed by the actions of outsiders. Counter-disinformation efforts by organisations like the Atlantic Council’s Digital Forensic Research Lab and the European Union focus on debunking, identifying bot networks run by foreign states, and hunting for other connections to state actors.
While foreign state actors do play a role in disinformation, focusing on them means missing what is in many cases its true root, as well as the fundamental issues that drive its creation and spread. Failure to address those issues will only lead to the same patterns repeating time and time again, undermining free and democratic societies while emboldening both those who deploy disinformation and the counterfactual communities who propagate it for their own ends.
Through my work, I’ve encountered people who deny what to me seems the most apparent reality. Bellingcat’s work is based on open source evidence – evidence that is publicly available and analysed using transparent, step-by-step, re-creatable methodologies. Despite this, there are plenty of communities online who dismiss our work as being handed down to us by Western intelligence services to further the agendas of Western states against countries like Syria and Russia. This has driven me to try to understand what motivates these people, and why their default position is to reject such evidence out of hand.
So what is the true root of disinformation? Why do these counterfactual communities form in the first place? Often they begin as communities that have been brought together around certain issues, which themselves are not necessarily conspiratorial or counterfactual in nature. For example, a number of the individuals who made up the community that denied chemical weapon attacks in Syria started as part of the Stop the War movement that arose around the 2003 invasion of Iraq.
What they have in common is a fundamental distrust of certain sources of authority, usually over some perceived betrayal – regardless of how justified that perception is in any specific case. For the community that arose around the denial of chemical weapon use in Syria, the touchstone is frequently the 2003 invasion of Iraq: the perception that the media was complicit in selling an invasion based on the alleged threat of weapons of mass destruction not only drives that distrust, but allows them to draw parallels between Syria’s alleged use of chemical weapons and Iraq’s alleged stockpiling of WMDs.
Bubble Society
It should be stressed that these feelings of betrayal may not be misplaced, and they are often held by people who never get drawn into conspiracy theories. The crucial difference between those who are drawn in and those who are not is whether those betrayals, and that loss of faith in institutions, begin to define the individual’s entire worldview. For conspiracy theorists, that worldview reduces complex issues to simple binary choices of good vs. evil and right vs. wrong.
The internet era created more opportunities for people with similar ideas to come together, and the social media era made those ideas more accessible. That quickly became the era of the algorithm, in which individuals are proactively served the content that search engines and social media companies believe they want to see: content that builds on what they have already seen, reinforcing their beliefs and fuelling alternative media ecosystems.
If you have doubts about chemical weapon attacks in Syria or the fate of Malaysia Airlines Flight MH17, questions about Q, or concerns about vaccines, there are Facebook groups, YouTube personalities, social media superstars, and alternative news websites for you – all of them delivered by social media algorithms designed to serve exactly the sort of content they think you want to see, which in this case means more conspiracy theories. These form alternative media ecosystems that draw people into ideas outside the mainstream, and proudly so. It is the structure of our internet that has made our democracies vulnerable to half-truths and ill-intentioned lies.
Empower the youth
But what should we do about disinformation? I think we need a multi-faceted approach, with organic engagement in online communities at its core. Russia’s invasion of Ukraine in February 2022 offers a good example of what is possible.
I should start by saying that the situation in Ukraine is unusual, because since 2014 there has been an engaged online community made up of individuals from a wide variety of backgrounds, including journalists, activists, human rights workers, disinformation researchers, and ordinary people following events in Ukraine. With MH17, Ukraine became one of the first conflicts in which open source investigation (investigation based on publicly available information, particularly from social media) not only played a key role in identifying those responsible for shooting down MH17, but also drew the community into the investigative process and helped identify further Russian involvement in the 2014 conflict.
This meant there was already a community engaged with the situation in Ukraine that understood disinformation and open source investigation. In the weeks before the invasion, parts of that community, along with people newly engaging with information from Ukraine, began to document the Russian build-up along the border, helped along by Russian TikTok users filming the movement of troops and equipment into position. While the Russian government claimed the build-up was related to training exercises, it was clear from the types of units and vehicles being transported to the border that this was not the kind of build-up you would typically see around an exercise.
As the day of the invasion approached, allegations of Ukrainian violations and crimes started to be shared on pro-Russian channels. The evidence accompanying these allegations was quickly examined and picked apart by the social media community that had formed around the build-up to the invasion, often being debunked less than an hour after it was first published.
It might seem as though these efforts to counter disinformation arose organically and rapidly, but the groundwork for these communities was laid back in 2014, in particular after the downing of MH17. In my opinion, if we want a more sustainable and widespread approach to countering disinformation, we need to think in terms of building communities and empowering individuals to make a difference. I believe we need to show young people how they can make a positive contribution to the issues they care about as part of online communities, teaching them critical thinking and investigative skills, and showing them how to build communities designed to have an impact.
What we face with counterfactual communities should be understood as online radicalisation. While we usually think of radicalisation in connection with the far right and Islamists, I’ve observed the same process at work across a broad range of communities, from Flat Earthers to Covid conspiracists. We only have to look at the events of January 6th on the steps of the Capitol building to see the many forms online radicalisation can take: everything from the far right to true believers in a world run by Satanic paedophile cults was represented in the mob that attacked the Capitol.
Deradicalising those with such extreme worldviews is exceptionally difficult, and often requires the involvement of those closest to them rather than interventions by the media, government, or fact-checkers, all of whom the most radicalised will reject out of hand. These are people who seek empowerment and community, and it is friends and family who can offer them such an environment. External preaching, as regularly tried in the media today, will not do the trick.
At Bellingcat, we’ll soon be working with a UK-based organisation, The Student View, to teach 16- to 18-year-olds how to investigate issues affecting their own lives. We hope to combine traditional investigative methods with open source methods, show them how to have an impact, and empower them to make changes in their own lives. By doing that, we hope to prevent them from becoming a new generation of angry, disenfranchised internet users who confuse online abuse with empowerment. It’s not an easy task – it requires a lot of effort, and Bellingcat is still a small organisation. I hope that all of us can be part of the effort to make the online world a better, more productive place. Because the opportunity is there – if only we can all work towards it together.