Algorithms decide what we see on social media
We swipe with our finger and, as if by magic, videos that are perfect for us appear. But how does TikTok know that we like funny cats or cooking tutorials? And why does Instagram keep showing us posts on a topic that we only glanced at?
The answer lies in artificial intelligence. The algorithms that regulate social media feeds are not simple sequences of code: they are complex systems that learn from our behavior in order to offer us personalized content. This is convenient, of course, but it is also a potential tool for manipulation that deserves to be thoroughly understood.
How recommendation systems work
Every time we like something, spend more time on a post, or watch a video until the end, we are giving valuable information to a recommendation system. This system stores our behavior, compares it with that of millions of other users, and tries to understand what we might want to see next.
This is how the famous “personalized bubble” is created: a continuous flow of tailored content, optimized to keep us glued to the screen. TikTok's algorithm, for example, considers over 1,000 different signals for each user: from the time spent watching videos to the speed of scrolling, from social interactions to the times of day the app is used.
Instagram uses a similar but more layered approach, combining data from the main feed with data from Stories and Reels. The result is a content ecosystem that seems to know us better than we know ourselves.
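To make the mechanism concrete, here is a minimal sketch in Python of how engagement signals could be turned into a per-topic interest score and then used to order a feed. The signal names, weights, and data are invented for illustration; real platforms rely on far more signals and on learned models, not hand-tuned weights like these.

```python
# A minimal, illustrative sketch of engagement-based scoring.
# Signal names and weights are made up; they only show the idea.

from collections import defaultdict

# Illustrative weights: finishing a video says more than a quick like.
SIGNAL_WEIGHTS = {
    "watch_ratio": 3.0,   # fraction of the video actually watched
    "liked": 1.0,
    "shared": 2.0,
    "rewatched": 2.5,
}

def update_interest_profile(profile, event):
    """Accumulate a per-topic interest score from one viewing event."""
    score = sum(SIGNAL_WEIGHTS[name] * value for name, value in event["signals"].items())
    profile[event["topic"]] += score
    return profile

def rank_candidates(profile, candidates):
    """Order candidate posts by the viewer's accumulated topic interest."""
    return sorted(candidates, key=lambda post: profile[post["topic"]], reverse=True)

profile = defaultdict(float)
update_interest_profile(profile, {"topic": "cats", "signals": {"watch_ratio": 0.9, "liked": 1}})
update_interest_profile(profile, {"topic": "news", "signals": {"watch_ratio": 0.2}})

feed = rank_candidates(profile, [{"id": 1, "topic": "news"}, {"id": 2, "topic": "cats"}])
print([post["id"] for post in feed])  # the cat video comes first
```

Every additional swipe updates the profile, which is why a feed can drift toward one topic after only a handful of sessions.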
The artificial intelligence behind personalization
The technology that makes all this possible is based on increasingly sophisticated machine learning algorithms. These systems use deep neural networks capable of identifying complex patterns in our digital behavior.
TikTok, in particular, has built the recommendation system behind its “For You” page by combining collaborative filtering (suggestions based on similar users) and content-based filtering (analysis of content characteristics). Artificial intelligence analyzes not only what we watch, but also how we watch it: micro-movements of the finger, pauses, even the angle of the phone.
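As a rough illustration of how those two approaches can be blended, the sketch below scores videos for one user by mixing a collaborative signal (what users with similar histories watched) with a content-based signal (tag overlap with what the user already watched). The matrices, tags, and the 50/50 blend are made up for the example; this shows the general idea, not any platform's actual model.

```python
# A toy hybrid recommender mixing collaborative and content-based scores.
# All data and the alpha blend are invented for illustration.

import numpy as np

# Rows = users, columns = videos; 1 means the user watched the video to the end.
interactions = np.array([
    [1, 1, 0, 0],   # user 0
    [1, 1, 1, 0],   # user 1 (similar history to user 0)
    [0, 0, 1, 1],   # user 2
])

# Content features per video (one-hot tags: cats, cooking, politics).
video_tags = np.array([
    [1, 0, 0],
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
])

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

def hybrid_scores(user, alpha=0.5):
    # Collaborative part: what do users with similar histories watch?
    sims = np.array([cosine(interactions[user], interactions[u]) for u in range(len(interactions))])
    collab = sims @ interactions / (sims.sum() + 1e-9)

    # Content part: how close is each video to what this user already liked?
    user_profile = interactions[user] @ video_tags
    content = np.array([cosine(user_profile, tags) for tags in video_tags])

    return alpha * collab + (1 - alpha) * content

scores = hybrid_scores(user=0)
print(np.argsort(-scores))  # videos ordered from most to least recommended
```

Tuning alpha toward 1 leans on the crowd, toward 0 on content similarity; a real system would also filter out videos the user has already seen and do all of this at a vastly larger scale.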
Instagram has integrated AI even more pervasively. In addition to feed content, the algorithm also influences search results, suggested content in Explore, and even the order of Stories. As we explored in our article “AI and Social Media: Algorithms That Guide Us”, these systems are redefining the way we consume information online.
Concrete examples of algorithmic manipulation
The problem is not so much the efficiency of these systems as their undesirable side effects. We risk being exposed only to opinions similar to our own, reinforcing our beliefs without ever encountering a counterpoint. In the worst cases, we may be led toward extreme, conspiratorial, or manipulative content, not because the algorithm “wants” to do so, but because it has learned that this type of content keeps us engaged longer.
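A toy example makes that incentive visible: when the only objective is predicted engagement, the most divisive post wins the top slot. The posts and numbers below are invented; the point is simply that nothing in a purely engagement-driven score rewards accuracy or well-being.

```python
# Illustrative only: ranking posts by predicted engagement alone.

posts = [
    {"title": "Calm explainer",       "predicted_watch_time": 22, "predicted_comments": 1},
    {"title": "Outrage-bait rant",    "predicted_watch_time": 48, "predicted_comments": 30},
    {"title": "Friend's holiday pic", "predicted_watch_time": 8,  "predicted_comments": 2},
]

def engagement_score(post):
    # A purely engagement-driven objective: no term for accuracy or well-being.
    return post["predicted_watch_time"] + 2 * post["predicted_comments"]

ranked = sorted(posts, key=engagement_score, reverse=True)
for post in ranked:
    print(post["title"], engagement_score(post))  # the rant comes out on top
```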
A report by the Center for Humane Technology highlighted how social media platforms can inadvertently encourage the spread of polarizing content. TikTok, for example, has been criticized for how quickly its systems can push radical videos or conspiracy theories if a user shows even the slightest initial interest.
Facebook (now Meta) has admitted that its algorithms tend to favor content that generates “engagement,” even when this means amplifying anger or outrage. In 2021, Frances Haugen's testimony before the US Congress revealed that the company had been aware of these effects since 2018.
Privacy and data collection: the price of personalization
All this happens thanks to a sophisticated combination of machine learning, predictive analysis, and massive collection of personal data. And this is where the issue of privacy comes into play. The data we give to social media, even just by interacting with content, is processed to reconstruct our tastes, vulnerabilities, and emotional tendencies.
TikTok collects over 380 different types of data on its users, according to an analysis by cybersecurity researcher Felix Krause. Instagram is no different: through its in-app browser, it can track every click, every cursor movement, even text that has been typed but not sent.
The stated goal is to keep us active and present for as long as possible. But at what cost? As we explored in our article on focus and attention in the digital age, this hyperconnectivity is having a profound effect on our ability to concentrate.
Algorithmic biases and digital discrimination
It is not just profiling that is cause for concern. So-called algorithmic biases also play a crucial role. Algorithms are not neutral: they learn from human data, which is often imperfect. If a certain type of content has been rewarded in the past, it will continue to be rewarded, reinforcing existing trends and penalizing diversity of viewpoints.
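The sketch below simulates this feedback loop with two invented content styles: exposure is allocated in proportion to past engagement, and exposure in turn generates new engagement, so a small initial advantage keeps widening even though neither style is intrinsically better. All numbers are arbitrary; only the dynamic matters.

```python
# A small simulation of the rich-get-richer loop described above.
# Invented data; it illustrates the dynamic, not any real platform.

history = {"dominant style": 1.05, "minority style": 1.00}  # past engagement

for week in range(30):
    total = sum(history.values())
    for style in history:
        exposure_share = history[style] / total  # exposure follows past success
        history[style] *= 1 + exposure_share     # more exposure -> more engagement

total = sum(history.values())
for style, engagement in history.items():
    print(style, round(engagement / total, 2))   # the initial 51/49 split has drifted apart
```

Nothing in the loop asks whether the “dominant style” deserves the extra reach; the accumulated history alone decides, which is exactly how past imbalances become self-perpetuating.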
This is how dynamics are created that favor certain groups and marginalize others. Instagram, for example, has been accused of penalizing content created by people of color or belonging to minorities. TikTok has admitted to using policies that limited the visibility of creators with disabilities or who were “unconventional” in order to prevent bullying, but in doing so, it effectively created a form of discriminatory censorship.
A study by the MIT Technology Review has shown that TikTok's algorithm tends to show different content to users of different ethnicities, even when they have similar interests, perpetuating social divisions through personalization.
Key points to remember
- Our digital behaviors fuel increasingly sophisticated algorithms that create personalized content bubbles
- Personalization can lead to radicalization when it favors content that generates strong emotional reactions
- The data collected goes far beyond likes and shares, including micro-behaviors and psychological patterns
- Algorithmic biases reflect and amplify human prejudices, creating systemic discrimination in the digital world
Frequently asked questions
How can I reduce the influence of algorithms on my feeds? Actively diversify your interactions, follow accounts with different opinions, and regularly use the “Not interested” feature when appropriate.
Do social media really know so much about me? Yes, they collect hundreds of different data points, often cross-referencing information from multiple sources to create detailed profiles of your interests and behaviors.
Are there alternatives to traditional social media? More transparent platforms such as Mastodon or BeReal are emerging, using simpler algorithms or chronological feeds, but they still have limited adoption.
How can I tell if I am in an information bubble? Check whether you rarely see opinions that contradict your beliefs or whether your social media feeds are very homogeneous in terms of the topics and viewpoints presented.
Towards greater digital awareness
Artificial intelligence in social media has two sides. On the one hand, it allows us to discover new content, connect with people who share similar interests, and enjoy more seamless digital experiences. On the other hand, it can become a distorting lens, showing us only part of reality: the part that keeps us scrolling.
Tackling this complexity requires awareness: digital literacy that helps people recognize the underlying mechanisms and ask why certain content appears in front of us and other content does not. As we discussed in our in-depth analysis on fake news and information warfare, critical thinking skills are becoming increasingly essential.
Only in this way can we move from being passive users to critical digital citizens. The future of social media does not depend solely on technology, but on how we decide to use and regulate it. And on how much we want to understand what lies behind every scroll.