
For You or Against You? TikTok’s echo chamber dilemma

In the 5th century BCE, Athens experienced a radical transformation in governance. The city’s elite gathered in the Athenian Assembly to discuss and, most importantly, vote on civic matters. This has come to be known as democracy and has evolved into today’s dominant political system.

Yet, inadvertently, the Greeks also discovered the echo chamber. An echo chamber is an environment in which participants encounter ideas that amplify or reinforce their pre-existing beliefs through communication and repetition inside a closed system, insulated from rebuttal. Because only the elite males of the city participated in this democracy, a group likely to share similar opinions and concerns, there was very little deviation from preconceived perspectives, and the ideas they already held in common only became further entrenched. Even if outsiders wanted to oppose the decisions that emerged from this “original” echo chamber, they faced a collective of individuals whose views had just been affirmed and validated, and who were therefore unlikely to be persuaded.

Modern-day echo chambers often manifest within the media. When right-wing viewers tune into outlets such as Fox News, the blatant bias in how stories are presented affirms their positions and radicalises them further as they consume commentary on the brilliance of Trump and the incompetence of the Democrats. The premise that people actively confine themselves to single-narrative media may seem absurd, or like a slippery slope argument, but many do exactly that, particularly those who have not had the privilege of access to high-quality education, or whose personalities feed on affirmation.

This has undoubtedly become an issue, particularly with the rise of digital media and a political environment rampant with misinformation and fake news. Populists such as Donald Trump, Lula da Silva, and Viktor Orbán have leveraged these echo chambers to build dangerously devoted supporter bases. When media consumption from supposedly ‘reputable sources’ validates and intensifies belief, supporters often fail to see any flaws in their chosen leader or personality, no matter what policies they propose or how many felonies they have been convicted of. Just look at the insurrection incited by Trump’s false election denial claims, which culminated in the storming of the Capitol on January 6th, 2021.

But this is more relevant now than ever…

The rise of social media, most notably TikTok, has intensified the personalisation of algorithms: the media people consume is no longer the product of active choices, such as clicking on a video, perusing interesting headlines or scouring web pages, but of a complex personalised algorithm that reflects their online behaviour. Therein lies the birth of the ‘For You Page’ (FYP). There are obvious benefits, with platforms such as TikTok able to create a more dynamic and enjoyable viewing experience. But taking away users’ autonomy over what they watch is incredibly dangerous. An algorithm designed to cater to taste is one at risk of intensifying exposure to dangerous and radical content, streams of misinformation, or oppressive ideological stances. With these algorithms comes the very real potential for social media consumption to slowly become more radical and more harmful.

Previously, several factors could mitigate the potency of echo chambers. When people access traditional media, most are aware that particular organisations carry a particular political stance or bias. This awareness lets them regulate how much of the material they internalise, taking the narrative with a grain of salt. The political stance of individual content creators, however, is not always obvious. Someone who watches a creator such as Daterightstuff may not realise that he is conservative, so his right-wing opinions can shape a viewer’s political stance more effectively than the same information presented in a known right-wing newspaper. Moreover, because social media users absorb information while scrolling through videos, the source, even when it is a political party’s own TikTok account, often goes unnoticed.

Furthermore, one of the most concerning aspects of TikTok’s echo chambers is their massive exposure to, and impact on, younger audiences. Unlike the traditional media landscape, where news consumption was primarily an activity for older, more mature individuals, these echo chambers often reach teens and young adults. When a 12-year-old boy hears Andrew Tate explain, in rhetorically compelling terms, why men are superior to women, he is unlikely to be able to evaluate the content critically and may become indoctrinated. He will like, comment, and share until more Andrew Tate videos and other content from extreme right-wing commentators fill his algorithm and reinforce those beliefs. A dangerous prospect for the most impressionable members of society.

The dynamic nature of TikTok’s algorithm makes it particularly insidious. In traditional media, where viewers consciously choose what to read or watch, it is harder for a slippery slope towards radicalism to form. TikTok has no such constraint: its algorithm gradually shifts the content it serves to match viewing and engagement patterns, ensuring users stay hooked on similar styles of content.

For example, a user might watch a few videos about healthy eating. The algorithm then starts suggesting more content on dieting, which might escalate to extreme dieting advice within a short space of time. This slow and incremental exposure means users can become more radicalised over time without a clear, conscious shift in their content consumption habits. The algorithm constantly adjusts to keep users engaged, often pushing more extreme content to maintain interest and interaction, leading individuals down a rabbit hole of increasingly extreme viewpoints.
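To make that feedback loop concrete, here is a deliberately simplified sketch in Python. It is not TikTok’s actual system; the 0–10 “extremeness” scale, the assumption that slightly-more-provocative content holds attention best, and all the numbers are illustrative assumptions of my own, chosen only to show how engagement-weighted recommendation can drift a user’s feed over time.

```python
import random

# Toy catalogue: each item sits on a 0-10 "extremeness" scale (0 = mild, 10 = extreme).
CATALOGUE = list(range(11))

def recommend(preference):
    """Pick an item close to the platform's current estimate of the user's taste."""
    weights = [1 / (1 + abs(item - preference)) for item in CATALOGUE]
    return random.choices(CATALOGUE, weights=weights)[0]

def engagement(item, preference):
    """Assumption: content slightly more extreme than usual holds attention best."""
    return max(0.0, 1 - abs(item - (preference + 1)) / 5)

preference = 2.0  # the user starts out watching mild "healthy eating" videos
for _ in range(50):
    item = recommend(preference)
    # The recommender nudges its model of the user towards whatever they engaged with.
    preference += 0.3 * engagement(item, preference) * (item - preference)

print(f"Inferred preference after 50 videos: {preference:.1f}")
```

Run repeatedly, the inferred preference creeps upward: no single step looks dramatic, yet the loop of “recommend what held attention, then recommend slightly past it” is enough to move a feed from everyday content towards the extreme end of the scale.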

Finally, social media’s interactive nature allows for direct engagement, which reinforces the echo chamber effect. Users can like, comment on, and share videos, creating a sense of community and validating their beliefs. That interaction invites further positive reinforcement from like-minded individuals, perpetuating the cycle. It also provides a ready avenue for organising in person: events such as anti-vax rallies are now frequently built through the mass-sharing of posts carrying their details.

How do we respond to modern echo chambers?

Awareness of what we consume, and a willingness to critically engage with all the content presented to us, will help us avoid falling victim to these echo chambers. When exposed to new material, fact-check it, seek out countervailing opinions, and interrogate your existing beliefs so that your opinions and behaviours are as informed and well-rounded as possible. This can be as simple as looking things up online or discussing them with a mate. It should extend to others, too: when you hear your 12-year-old cousin glorifying Andrew Tate, talk through the content with them and help them engage with it critically.

But the onus shouldn’t rest solely on us. The absence of political attention to this issue is alarming, with politicians choosing to focus on a non-existent anti-China issue rather than addressing the immediate danger these platforms pose. Better regulation could allow these platforms to become a genuinely positive space for political discussion that is substantiated and accurate, rather than one riddled with affirmation bias.

Until then, the obligation falls on the everyday person. So next time you find yourself mindlessly doomscrolling, try to do so consciously, and keep an ear out for any echoes.
