
UK’s major porn providers agree to age checks from next month

Children in the UK will gain increased protection from online pornography next month, Ofcom has announced, as major providers agree to bring in robust methods to check users’ age for the first time.

By 25 July 2025, all sites and apps that allow pornography – whether they are dedicated adult sites or social media, search or gaming services – must use highly effective age checks to ensure children are not normally able to encounter it. Online firms that publish their own pornography are already required to protect children from it, and thousands of sites have already introduced robust age checks in response.

Major porn providers operating in the UK have confirmed to Ofcom that they will introduce effective checks by next month’s deadline to comply with the new rules.[1] They include PornHub, the most-visited pornographic service in the UK.[2] Other services that are happy to be named at this stage include BoyfriendTV, Cam4, FrolicMe, inxxx, Jerkmate, LiveHDCams, MyDirtyHobby, RedTube, Streamate, Stripchat, Tube8, and YouPorn. This represents a broad range of the pornography services accessed in the UK.

Monitoring compliance with these new duties is a priority for Ofcom. If any company fails to comply with its new duties, Ofcom can impose fines and – in very serious cases – apply for a court order to prevent the site or app from being available in the UK. As part of our work enforcing the Online Safety Act, we have already launched investigations into four porn providers and won’t hesitate to take further action from July. 

Under-18s exposed to adult content

Robust age checks are a cornerstone of the Online Safety Act and will help protect children from harmful content. Under the Act, age assurance methods – which include age verification, age estimation or a combination of both – must be ‘highly effective’ at correctly determining whether a user is under 18. Highly effective age checks include credit card checks, open banking or facial age estimation.  
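To make the distinction between these methods concrete, here is a minimal, purely illustrative sketch of how a service might combine such signals into a single age-gate decision. Nothing here is mandated by Ofcom or the Act: the signal names, the challenge-age buffer and the default-deny behaviour are all assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative only: Ofcom does not mandate any particular implementation.
# The challenge-age buffer below mirrors the retail "Challenge 25" idea and
# is an assumption, not a regulatory figure.
CHALLENGE_AGE = 25

@dataclass
class AgeSignals:
    credit_card_verified: bool = False     # UK credit cards are only issued to over-18s
    open_banking_confirmed: bool = False   # bank-held date of birth confirms the user is 18+
    estimated_age: Optional[float] = None  # facial age estimation result, in years

def passes_age_gate(signals: AgeSignals) -> bool:
    """Return True if the user may proceed to age-restricted content."""
    # Hard verification signals are sufficient on their own.
    if signals.credit_card_verified or signals.open_banking_confirmed:
        return True
    # Estimation is probabilistic, so apply a buffer: users estimated
    # below the challenge age would be routed to a stronger check.
    if signals.estimated_age is not None and signals.estimated_age >= CHALLENGE_AGE:
        return True
    return False  # default-deny: unverified users are treated as under 18

# Example: an estimate well above the buffer clears the gate...
assert passes_age_gate(AgeSignals(estimated_age=31.4))
# ...but a 20-year-old estimate alone does not; that user would fall
# back to a stronger check such as a credit card or open banking.
assert not passes_age_gate(AgeSignals(estimated_age=20.0))
```

The buffer on facial estimation reflects the probabilistic nature of the method: rather than waving through anyone estimated at just over 18, a cautious design asks borderline users for a harder verification signal.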

New research from Ofcom reveals the extent to which children are accessing porn online, and underlines the need for new measures to protect them. We found that 8% of children aged 8–14 in the UK visited an online porn site or app in a month, including around 3% of 8–9-year-olds, the youngest children in the study.[3]

Ofcom’s research tracked the use of websites and apps by 8–14-year-olds across smartphones, tablets and computers over a month. Boys aged 13–14 (19%) were most likely to visit a porn service, significantly more than girls of the same age (11%). With older teenagers also likely to be accessing pornography, the total number of under-18s exposed to adult content will be higher still.

Oliver Griffiths, Ofcom’s Group Director of Online Safety, said: “Society has long protected youngsters from products that aren’t suitable for them, from alcohol to smoking or gambling. But for too long children have been only a click away from harmful pornography online.

“Now, change is happening. These age checks will bring pornography into line with how we treat adult services in the real world, without compromising access and privacy for over-18s.” 

Securing adult users’ rights 

As well as preventing children from accessing harmful content, platforms must also ensure that the new measures do not exclude adults from accessing legal content or compromise their privacy. All age assurance methods are subject to the requirements of UK privacy laws, including on the processing of personal data. These are overseen and enforced by the Information Commissioner’s Office (ICO).

Under the Online Safety Act, online pornography services must keep written records explaining how they protect users from a breach of these laws. 

Our approach to highly effective age assurance allows space for innovation and new methods in a fast-moving technology market. Next year, Ofcom will publish a report looking into the use of age assurance and its effectiveness. 

Protecting children online  

From 25 July, all social media, search and gaming sites and apps must prevent children in the UK from encountering harmful content, including material relating to suicide, self-harm and eating disorders. Tech firms will have to apply the safety measures set out in our Children’s Codes, or take alternative action to meet their duties, to mitigate the risks that their services pose to children.

The riskiest services will have to use highly effective age assurance to identify which users are children. Any provider that operates a recommender system and poses a medium or high risk of harmful content must configure its algorithms to filter harmful content out of children’s feeds; a rough sketch of what that could look like follows below.
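As a sketch of what that configuration might look like in practice (again, illustrative only: the Act does not prescribe an API, and the labels, field names and `is_child` flag below are assumptions), a recommender could drop harm-labelled candidates before ranking whenever age assurance flags the viewer as a child.

```python
from dataclasses import dataclass

# Illustrative only: "is_child" would come from highly effective age
# assurance, and "harm_labels" from a content classification pipeline;
# both are assumed here for the sake of the example.
HARMFUL_LABELS = {"suicide", "self_harm", "eating_disorder"}

@dataclass
class FeedItem:
    item_id: str
    harm_labels: frozenset = frozenset()

def select_candidates(candidates: list[FeedItem], is_child: bool) -> list[FeedItem]:
    """Remove harm-labelled items before ranking when the viewer is a child."""
    if is_child:
        candidates = [c for c in candidates if not (c.harm_labels & HARMFUL_LABELS)]
    return candidates  # scoring and ranking would follow downstream

# Example: the self-harm-labelled item is filtered from a child's candidate pool.
pool = [FeedItem("a"), FeedItem("b", frozenset({"self_harm"}))]
assert [c.item_id for c in select_candidates(pool, is_child=True)] == ["a"]
```

The point of filtering at the candidate-selection stage, rather than after ranking, is that harmful items never compete for a slot in a child’s feed at all.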

There is more information on how the new rules will work at Ofcom.org.uk/agechecks.

Jason Davies

I am one of the editors here at www.systemtek.co.uk. I am a UK-based technology professional with an interest in computer security and telecoms.
