In an article about doom scrolling, a man sits on a ledge and stares at his phone screen.

When Your For-You-Page Becomes Too Specific

Big Brother may not be watching you, but your FYP certainly is.
January 11, 2023
9 mins read

The TikTok “For You” page (FYP) is the default landing page users see when they open the application. It is where users spend most of their time on the app, and consequently where most of the content they consume appears. The moment the app opens, a video starts playing, and users can scroll indefinitely to find new things to watch. These videos are tailored to each user’s interests by weighing a handful of signals: how long a user watches a video and whether they watch it to completion, and whether they like, share or comment on it. This data builds a picture of the kinds of videos a user enjoys, so more of them are recommended. Most social media apps aim to keep users engaged for as long as possible by creating highly personalized feeds, and because TikTok’s suggestions are so accurate, the algorithm can feel almost intrusive. This is one of the chief methods social media apps such as TikTok use to keep people perpetually on the app.
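The logic described above can be illustrated with a toy scoring sketch. Everything here is hypothetical — the signal names, the weights and the ranking function are illustrative only; TikTok’s actual recommendation model is proprietary and vastly more complex:

```python
# Toy illustration of engagement-based ranking.
# Signals (watch fraction, like, share, comment) come from the article;
# the weights and function names below are hypothetical assumptions.

def engagement_score(watch_fraction, liked, shared, commented):
    """Score one past interaction: how strongly did the user engage?"""
    return (
        1.0 * watch_fraction   # portion of the video watched (0.0 to 1.0)
        + 0.5 * liked
        + 1.5 * shared         # assume shares are the strongest signal
        + 1.0 * commented
    )

def rank_candidates(candidates, interest_scores):
    """Order candidate videos by the user's inferred interest in their topic."""
    return sorted(
        candidates,
        key=lambda video: interest_scores.get(video["topic"], 0.0),
        reverse=True,
    )

# A user who watches dance clips to the end and shares them builds up a
# high "dance" interest score, so dance videos surface first in the feed.
interests = {
    "dance": engagement_score(0.95, liked=True, shared=True, commented=False),
    "news": engagement_score(0.2, liked=False, shared=False, commented=False),
}
feed = rank_candidates(
    [{"id": 1, "topic": "news"}, {"id": 2, "topic": "dance"}],
    interests,
)
```

The point of the sketch is the feedback loop: every interaction raises a topic’s score, and a higher score pushes more of that topic into the feed, which invites more interaction.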

These accurate predictions feed into “infinite scrolling,” a design technique popular in many social media apps today, including TikTok, Instagram and Facebook. Infinite scrolling loads more content as you scroll, creating a seemingly endless supply of new material. Originally intended to let users browse posts and articles without switching between pages, it has become one of the most addictive aspects of social media use. Constantly scrolling through the FYP provides immediate gratification via dopamine release; the rush pushes users to seek more of it, resulting in a bottomless pit of scrolling. This cycle has been likened to addictive behaviors such as gambling, and it is widely argued that social media platforms employ the same techniques as gambling companies to create psychological dependencies in their users. In this regard, TikTok has been described as a form of digital crack cocaine. Aza Raskin, the inventor of infinite scrolling, shares similar sentiments and has expressed concern about how harmful the pattern is: “It’s as if they’re taking behavioral cocaine and just sprinkling it all over your interface and that’s the thing that keeps you like coming back and back and back.”
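Mechanically, infinite scrolling is just pagination with the seams hidden: when the user nears the bottom of what is already loaded, the client silently fetches the next batch, so the feed never visibly ends. A minimal sketch of that pattern (all names here are illustrative, not any platform’s real API):

```python
# Minimal sketch of the infinite-scroll loading pattern:
# instead of page buttons, the next batch is fetched automatically
# whenever the user approaches the end of the loaded content.

def fetch_batch(cursor, size=5):
    """Stand-in for a network call returning the next videos and a new cursor."""
    videos = [f"video_{i}" for i in range(cursor, cursor + size)]
    return videos, cursor + size

class InfiniteFeed:
    def __init__(self):
        self.loaded = []   # videos available to the user so far
        self.cursor = 0    # position in the (effectively endless) backlog

    def on_scroll(self, position):
        # When the user is within 2 items of the bottom, load more,
        # so there is always something new to keep scrolling toward.
        if position >= len(self.loaded) - 2:
            batch, self.cursor = fetch_batch(self.cursor)
            self.loaded.extend(batch)

feed = InfiniteFeed()
feed.on_scroll(0)   # opening the feed triggers the initial load
feed.on_scroll(3)   # nearing the bottom silently triggers another fetch
```

Because the fetch fires before the user ever reaches the end, there is no natural stopping point — which is exactly the design property the paragraph above describes.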

Most users pick up their phones and open TikTok for temporary relief: to relax after a long day, to cheer themselves up when they’re feeling down or to take a break from studying. The infinite scrolling feature, combined with the power of TikTok’s algorithm to create a highly personalized and accurate For You page, makes it easy to spend long hours on the app. There’s always something to grab your attention and prompt another swipe. We tell ourselves we’ll stop after a couple of videos, but if the next one isn’t enjoyable, we keep scrolling, and in no time hours have gone by. This cycle takes a real psychological toll and can leave users feeling overwhelmed and mentally exhausted. Infinite scrolling is also closely related to “doomscrolling,” a relatively new term for mindlessly scrolling through negative news online, particularly on social media sites like Twitter. Dr. George Nichols, a mental health expert, explained in an interview that doomscrolling creates the impression that there is one piece of information that, once found, will make everything better. In reality, it rarely works that way, and endless scrolling can cause sensory overload and exacerbate anxiety and depression.

Watching similar videos for hours on end can also create a false sense of relatability, as the user begins to internalize the content even if they first watched out of curiosity and did not believe the information. For example, if a person spends a lot of time interacting with videos of people self-diagnosing everyday behaviors as symptoms of conditions such as attention-deficit/hyperactivity disorder (ADHD), more of these videos begin to appear on their FYP, a sign that the algorithm has identified a side of TikTok to keep them on. The result is that they begin to attribute their mundane behaviors to having the condition, even when in reality they do not. Similarly, if a user comments on videos that give (often unfounded) relationship advice or explain how to determine whether a partner is cheating, TikTok begins to suggest more videos like them. Some of these subliminal messages gradually seep into one’s subconscious, leading users to overthink and overanalyze everything their partner does, breeding distrust and insecurity in a relationship that may have been perfectly fine.

This gradually results in the creation of a filter bubble. According to Eli Pariser, author of “The Filter Bubble: What the Internet Is Hiding from You,” a “filter bubble” is a personal ecosystem of information catered by algorithms. Filter bubbles amplify the user’s current interests, opinions and beliefs, ensuring that the user is never bored and always enjoys and connects with the content they consume. This can have positive effects, such as exposure to videos that help users unpack trauma and attempt to heal, or videos explaining how to build better interpersonal relationships and improve communication skills. But there are also plenty of videos that are extremely harmful: they spread misinformation and even amplify hateful ideologies. For example, if a person’s filter bubble consists of videos that constantly teach them how to manipulate people or be toxic in their relationships, chances are they will eventually internalize these messages and begin practicing these habits in real life, creating a negative cycle.

Filter bubbles create echo chambers and reinforcement spirals. According to reinforcement theory, people tend to seek out and remember information that provides cognitive support for their pre-existing attitudes and beliefs, which effectively isolates and insulates users from opposing viewpoints. Over time, echo chambers entrench cognitive biases and impair critical thinking. Furthermore, because almost every video the user encounters is similar and reinforces their current beliefs, it is nearly impossible to recognize that one is being pulled into an abyss. Even when a user is aware that this is happening, the constant bombardment makes it difficult to step back into reality.

In today’s fast-paced and technologically dependent world, quitting social media apps cold turkey is nearly impossible. Not only do they entertain us, but they also help us make new, meaningful connections, network and maintain existing relationships with our loved ones. However, we bear enormous responsibility for regulating and monitoring the information we consume, as well as the time we spend on these apps. Although it may be frustrating and difficult at first, recognizing the problem and setting boundaries will go a long way toward building healthier relationships with social media in general.

Azeezah Ibraheem, Near East University

Writer Profile

Azeezah Ibraheem

Near East University
Medicine

Hi! My name’s Azeezah and I love sunsets, books and the ocean.
