The European Union has issued a stern ultimatum to TikTok, demanding the popular video-sharing platform overhaul its "addictive design" features or face substantial financial penalties. This decisive action stems from a preliminary finding by the European Commission that TikTok has contravened the bloc’s stringent online safety regulations. The investigation, launched in February 2024, scrutinized the app’s operational mechanisms, particularly those impacting user well-being, with a significant focus on younger demographics.
In its preliminary assessment, the Commission found that TikTok had failed to adequately evaluate the potential detrimental effects of features such as autoplay on users' mental health. The findings suggest the platform did not take sufficient measures to mitigate the risks inherent in these design choices, particularly for children and adolescents. This failure is treated as a direct breach of the Digital Services Act (DSA), a landmark piece of legislation designed to hold online platforms accountable for the content they host and the impact of their services.
A spokesperson for TikTok swiftly contested the Commission's findings, labeling them a "categorically false and entirely meritless depiction of our platform" and signaling the company's intent to challenge the assessment. Despite this rebuttal, the platform has been formally invited to respond to the EU's preliminary conclusions. The stakes are considerable: should the Commission confirm its findings, TikTok could face fines of up to 6% of its total global annual turnover, a sum that could run into the billions of euros.

Henna Virkkunen, the EU’s tech commissioner, directly addressed the implications for TikTok, stating unequivocally that if the company wishes to avoid these substantial fines, it must fundamentally "change the design of their service in Europe." The Commission has outlined a series of concrete recommendations for TikTok to consider. These include the implementation of mandatory "screen time breaks," particularly during nighttime usage, and a significant recalibration of its content algorithms. The algorithms, which are central to the platform’s success in feeding users highly personalized content streams, are under particular scrutiny for their role in fostering continuous engagement.
Furthermore, the Commission has proposed that TikTok disable its "infinite scroll" feature. This mechanism allows users to seamlessly cycle through an endless stream of videos, a design element widely recognized for its potential to contribute to compulsive usage patterns. Virkkunen emphasized the core principle behind the DSA: "The Digital Services Act makes platforms responsible for the effects they can have on their users. In Europe, we enforce our legislation to protect our children and our citizens online." This statement highlights a broader regulatory trend towards holding digital platforms accountable for their societal impact.
Leading academics and experts in the field have weighed in on the Commission’s pronouncements. Professor Sonia Livingstone of the London School of Economics acknowledged that while TikTok has introduced some measures aimed at enhancing user safety, these efforts are deemed insufficient to meet the EU’s rigorous standards. "Young people are calling for such changes," Professor Livingstone noted, adding, "They are frustrated that the platform does not prioritise their wellbeing over profit." This sentiment reflects a growing concern among users, particularly younger ones, about the exploitative nature of platform design driven by profit motives.
Matt Navarra, a social media expert, offered a nuanced perspective, suggesting that while the term "addictive" can be overused in public discourse, the Commission's findings appear to be grounded in "true behavioural science." He characterized the EU's stance as a "seismic shift" in how regulators approach social media platforms. "This seems to be the first time a major regulator has said that the design is the problem," Navarra observed. "It's no longer about just toxic content, it's about toxic design." This distinction between moderating content and regulating the design of the platform itself marks a significant evolution in regulatory thinking.

The EU’s action against TikTok is not an isolated incident but rather part of a broader, intensifying regulatory push against major technology companies. The Commission has a history of scrutinizing the operations of big tech firms and has not shied away from imposing fines. In December 2024, a separate investigation was initiated into TikTok concerning alleged foreign interference in the Romanian presidential elections, signaling a multi-faceted approach to regulating the platform.
More recently, in January of this year, the EU launched an inquiry into Elon Musk’s X (formerly Twitter) over concerns that its AI tool, Grok, was used to generate sexually explicit images of real individuals. This investigation highlights the EU’s vigilance regarding the ethical implications of emerging AI technologies deployed by social media platforms. Further demonstrating the Commission’s willingness to act decisively, in December 2025, X was fined €120 million (£105 million) for its "blue tick" verification badges. The EU argued that these badges "deceive users" as the company was not "meaningfully verifying" the identity of account holders, thereby compromising user trust and authenticity.
Paolo Pescatore, a social media analyst, views the latest development as a crucial "reality check" for TikTok and a potent "warning shot" for the entire social media industry. He argues that market dynamics are shifting from an exclusive focus on "maximise engagement" to an imperative to "engineer responsibility," and that "regulators now have the tools to enforce it." The era of unchecked platform growth driven solely by engagement metrics may be drawing to a close, replaced by a more user-centric regulatory framework. The implications of this ruling extend far beyond TikTok, setting a precedent for how digital platforms will be held accountable for their design choices and their impact on society.