Tech firms will have 48 hours to remove abusive images under new law

In a landmark move to protect individuals from online harm, the UK government is set to introduce stringent new legislation that will compel technology platforms to remove non-consensual intimate images (NCII) within a strict 48-hour timeframe. The proposed law, currently being debated as an amendment to the broader Crime and Policing Bill in the House of Lords, marks a significant shift in accountability, placing the onus squarely on tech companies to act swiftly and decisively. The government has declared its intention to treat the abuse of intimate images with the same gravity as child sexual abuse material (CSAM) and terrorist content, underscoring the severe impact such violations can have on victims. Failure to comply with the new regulations could lead to substantial penalties, including fines of up to 10% of a company’s global annual turnover, or even the complete blockage of its services within the United Kingdom.

Prime Minister Sir Keir Starmer articulated the government’s resolve during an interview with BBC Breakfast, describing the legislation as a crucial component of an "ongoing battle" to champion the rights of victims against powerful platform providers. He emphasized that the new law aims to alleviate the immense burden placed on individuals who currently have to navigate a labyrinthine process of reporting the same abusive content across multiple platforms, often facing lengthy delays. "The days of tech firms having a free pass are over," stated Technology Secretary Liz Kendall, reinforcing the government’s commitment to ensuring that "no woman should have to chase platform after platform, waiting days for an image to come down."

The proposed legislation introduces a streamlined reporting mechanism for victims. Under the new rules, individuals will need to flag an intimate image only once, triggering a mandatory removal process by the platform. Crucially, tech companies will also be required to implement robust measures to prevent the re-uploading of such content once it has been identified and removed. This proactive approach is designed to create a more secure online environment and to prevent the persistent circulation of harmful material. Furthermore, the bill aims to equip internet service providers with guidance to block access to websites that persistently host illegal content, targeting rogue entities that currently operate beyond the scope of existing legislation such as the Online Safety Act.

Intimate Image Abuse (IIA), also known as image-based sexual abuse (IBSA), disproportionately affects women, girls, and LGBT individuals, highlighting a deeply entrenched societal issue that is being exacerbated by technological advancements. A comprehensive government report released in July 2025 delved into the evolving landscape of digital violence, noting a disturbing trend of young men and boys being targeted for financial sexual extortion, commonly referred to as "sextortion." This form of abuse involves perpetrators demanding money from victims in exchange for not sharing intimate images online. The report underscored the critical need for survivor-centric tools and strategies, especially in the age of generative artificial intelligence, which presents new challenges in the creation and dissemination of malicious content.

The urgency of such legislation is underscored by alarming statistics. A parliamentary report published in May 2025 revealed a significant increase in reported incidents of intimate image abuse, with a 20.9% rise recorded in 2024 alone. This surge may reflect both an increase in actual abuse and a greater willingness among victims to come forward, driven by increased awareness and improved reporting mechanisms. The government’s proactive stance is a direct response to these escalating concerns.

Sir Keir Starmer elaborated on the enforcement mechanisms, explaining that the law would be upheld through a combination of oversight bodies responsible for online content regulation and criminal proceedings. He indicated that while fines would be a primary deterrent, he did not foresee prison sentences for tech company executives at this stage. However, the potential for substantial financial penalties and service blockades serves as a powerful incentive for companies to prioritize user safety and adhere to the new legal framework.

The announcement of this new law follows a highly publicized standoff between the government and the social media platform X (formerly Twitter) in January. The controversy erupted when the platform’s AI tool, Grok, was used to generate sexually explicit images of real women, including public figures. This incident highlighted the vulnerability of individuals to AI-generated deepfakes and the urgent need for platforms to take responsibility for the content they host and the tools they provide. While X eventually removed the function, the episode served as a stark reminder of the evolving threats in the digital realm and the imperative for robust regulatory responses. The government’s swift action in proposing this legislation demonstrates a commitment to learning from such incidents and proactively addressing the challenges posed by new technologies.

The proposed 48-hour removal window is not without precedent. Tech companies are already subject to similar obligations for removing terrorist content, a mechanism that has proven effective. The government views the application of this established framework to NCII as a logical and necessary extension of their responsibilities. This approach acknowledges the profound psychological and social damage that can be inflicted by the non-consensual sharing of intimate images, which can lead to reputational damage, social ostracization, and severe mental health issues for victims. The legislation seeks to create a proactive culture within tech companies, where the prevention and rapid removal of harmful content are embedded into their operational DNA.

The broader implications of this law extend beyond immediate removal. By mandating that platforms actively block the re-uploading of flagged images, the legislation aims to disrupt the cycle of abuse. This is particularly important in cases where perpetrators may attempt to re-circulate images across different platforms or on new accounts. The requirement for a unified reporting system for victims is also a significant advancement, reducing the emotional toll and practical difficulties associated with repeatedly reporting the same violation to various entities.

Civil society organizations have broadly welcomed the proposed legislation. Janaya Walker, interim director of the End Violence Against Women Coalition, stated that the move "rightly places the responsibility on tech companies to act." This sentiment is echoed by many who believe that the burden of combating online abuse should not fall solely on the shoulders of victims. The law represents a recognition that the digital spaces where these harms occur are largely controlled and profited from by tech corporations, and thus, they must bear a significant portion of the responsibility for ensuring their safety.

The government’s commitment to tackling online abuse is a complex and multifaceted endeavor, involving not only legislative action but also ongoing engagement with technology providers, law enforcement agencies, and victim support groups. The amendment to the Crime and Policing Bill is a significant step in this direction, signaling a clear intent to create a more accountable and safer online environment for all citizens. As the bill progresses through Parliament, its precise details and enforcement mechanisms will continue to be scrutinized, but the overarching goal remains clear: to empower victims and hold tech firms accountable for the harms facilitated by their platforms.
