These changes have drawn high-profile criticism from politicians, and a petition to repeal the Act has gathered over 400,000 signatures. The Act itself passed in 2023; however, the rollout of the legislation has been staggered, hence the renewed public interest as new rules come into force.
The Act places an obligation on social media companies and platform providers to protect children and adults from harm by making them responsible for their users' safety while on their platforms. Failure to comply with these rules can result in large fines, company executives being jailed, or even sites being banned in the UK.
But what are the specific changes the Act is introducing? How are they intended to protect children and young people from harm? And, most importantly, what is the evidence that these changes will actually protect mental health?
Removing illegal content
A rule which came into force in December 2024 requires all companies to take action against illegal content being shared on their platforms, or illegal transactions and activity taking place through their platforms and services.
The types of illegal content the Act outlines include images of sexual abuse, the selling of illegal drugs or weapons, the sharing of state-sponsored disinformation covered by the Foreign Interference Offence, and exploitative or coercive behaviours.
A survey conducted by Savanta for the Mental Health Foundation earlier this year found that 68% of young people (aged 16-21) reported having seen harmful or disturbing content online. This does not only normalise harmful behaviours for impressionable children; viewing violent or abusive content can also cause psychological distress and even trauma, particularly in young people.
‘Media-induced trauma’ is well documented, particularly following high-profile traumatic events. In 2021, researchers at Boston University found that following a school shooting, people who viewed extensive coverage or upsetting content, including graphic videos of the shooting itself, were more likely to have symptoms of PTSD and other psychological disorders.
The new law makes it the responsibility of social media companies, tech firms and internet providers to carry out risk assessments of illegal content being shared on their platforms and to take appropriate steps to remove it on an ongoing basis.
Age verification
Placing the onus on companies to remove illegal content is not the only example of the new rules. The Act also requires platforms to protect children from viewing content which might not be illegal, but which could otherwise be harmful to developing minds. For example, pornography websites now have a duty to verify the ages of users before they can access explicit content.
Pornography, particularly extreme or hardcore pornography, can be damaging to children's mental health. It can lead to addiction, problems with emotional intimacy and forming relationships, and skewed understandings of sex and gender roles in relationships. Exposure to pornography at a young age makes children more likely to develop symptoms of anxiety, and more vulnerable to sexual exploitation, as sharing explicit images with young children is a common grooming tactic of predators. The average age at which children are first exposed to online pornography is 12 years old, with 15% of children being under the age of 10.
As of July 2025, websites in the UK hosting pornography must verify users' ages before allowing them to access explicit content. It is these age verification rules which have caused some backlash. Critics of the new rules argue that, as users now have to verify their age by presenting ID or by using other tools such as facial recognition software, there are risks of data breaches or of users' privacy being threatened.
Social media use
In addition to pornography sites, there is a specific onus on social media sites to risk-assess the content on their platforms and, where it may be harmful to children, to implement appropriate age limits. Most children over the age of 10 have a presence on social media, with the most popular platforms being YouTube and WhatsApp.