- Online Safety Bill will force pornography websites to prevent underage access, including by using age verification technologies
- New measure goes further than the bill’s existing protections by bringing all websites offering pornography online into scope
Children will be better protected from online pornography under new measures to bring all websites that display it into scope of the government’s pioneering new internet safety laws.
On Safer Internet Day, Digital Minister Chris Philp is announcing the Online Safety Bill will be significantly strengthened with a new legal duty requiring all sites that publish pornography to put robust checks in place to ensure their users are 18 years old or over.
This could include adults using secure age verification technology to verify that they possess a credit card and are over 18, or having a third-party service confirm their age against government data.
If sites fail to act, the independent regulator Ofcom will be able to fine them up to 10 per cent of their annual worldwide turnover or block them from being accessed in the UK. Bosses of these websites could also be held criminally liable if they fail to cooperate with Ofcom.
A large amount of pornography is available online with little or no protections to ensure that those accessing it are old enough to do so. There are widespread concerns this is impacting the way young people understand healthy relationships, sex and consent. Half of parents worry that online pornography is giving their kids an unrealistic view of sex and more than half of mums fear it gives their kids a poor portrayal of women.
Age verification controls are one of the technologies websites may use to prove to Ofcom that they can fulfil their duty of care and prevent children accessing pornography.
Digital Minister Chris Philp said:
It is too easy for children to access pornography online. Parents deserve peace of mind that their children are protected online from seeing things no child should see.
We are now strengthening the Online Safety Bill so it applies to all porn sites to ensure we achieve our aim of making the internet a safer place for children.
Many sites where children are likely to be exposed to pornography are already in scope of the draft Online Safety Bill, including the most popular pornography sites as well as social media, video-sharing platforms and search engines. But as drafted, only commercial porn sites that allow user-generated content – such as videos uploaded by users – are in scope of the bill.
The new standalone provision ministers are adding to the proposed legislation will require providers who publish or place pornographic content on their services to prevent children from accessing that content. This will capture commercial providers of pornography as well as sites that allow user-generated content. Any company that runs such a pornography site accessible to people in the UK will be subject to the same strict enforcement measures as other in-scope services.
The Online Safety Bill will deliver more comprehensive protections for children online than the Digital Economy Act by going further and protecting children from a broader range of harmful content on a wider range of services. The Digital Economy Act did not cover social media companies, where a considerable quantity of pornographic material is accessible, and which research suggests children use to access pornography.
The government is working closely with Ofcom to ensure that online services’ new duties come into force as soon as possible following the short implementation period that will be necessary after the bill’s passage.
The onus will be on the companies themselves to decide how to comply with their new legal duty. Ofcom may recommend the use of age verification technologies, a growing range of which is available to companies, that minimise the handling of users’ data. The bill does not mandate the use of specific solutions, as it is vital that it remains flexible to allow for innovation and the development and use of more effective technology in the future.
Age verification technologies do not require a full identity check. Users may need to verify their age using identity documents, but the measures companies put in place should not process or store data that is irrelevant to the purpose of checking age. Solutions that are currently available include checking a user’s age against details held by their mobile provider, verifying age via a credit card check, and other database checks, including against government-held data such as passport data.
Any age verification technologies used must be secure, effective and privacy-preserving. All companies that use or build this technology will be required to adhere to the UK’s strong data protection regulations or face enforcement action from the Information Commissioner’s Office.
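The announcement leaves implementation to companies, but the data-minimisation principle described above can be illustrated with a short sketch. This is a hypothetical illustration, not anything specified in the bill or offered by a real provider: the AgeVerifier service, the AgeAttestation record and all other names are assumptions, standing in for whichever third-party check (mobile operator, credit card or passport database) a site might integrate. The point it shows is that the site only ever receives and stores an over-18 flag, never the underlying identity data.

```python
# Illustrative sketch only: a data-minimising age check in which the website
# never sees identity documents, only a yes/no attestation from a hypothetical
# third-party verifier. "AgeVerifier" and "AgeAttestation" are assumed names,
# not a real service or API.
from dataclasses import dataclass


@dataclass(frozen=True)
class AgeAttestation:
    """The only data the site retains: an over-18 flag and an opaque reference."""
    over_18: bool
    attestation_id: str  # opaque audit reference, contains no personal data


class AgeVerifier:
    """Stand-in for a third-party service that checks age against a source the
    user chooses (mobile operator record, credit card, passport data)."""

    def verify(self, user_token: str) -> AgeAttestation:
        # In a real deployment this would call the provider's API over TLS.
        # The provider performs the check and discards the underlying identity
        # data; only the boolean result is returned to the site.
        result = self._check_with_provider(user_token)
        return AgeAttestation(over_18=result, attestation_id=f"att-{user_token[:8]}")

    def _check_with_provider(self, user_token: str) -> bool:
        raise NotImplementedError("placeholder for a provider integration")


def grant_access(attestation: AgeAttestation) -> bool:
    # The site stores no date of birth, document scan or name: the access
    # decision depends only on the over-18 flag.
    return attestation.over_18
```

Under this kind of pattern, a data breach at the website would expose no dates of birth or document scans, which is the property the data protection requirements above are aimed at.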
Online age verification is increasingly common practice in other online sectors, including online gambling and age-restricted sales. In addition, the government is working with industry to develop robust standards for companies to follow when using age assurance tech, which it expects Ofcom to use to oversee the online safety regime.
ENDS
Key research and statistics:
Research by the British Board of Film Classification ‘Young People, Pornography and Age Verification’ in 2020 found:
51% of children aged 11 to 13 have seen pornography, and this figure is likely conservative.
Many children – some as young as 7 years old – stumble upon pornography online, with 61% of 11-13 year olds describing their viewing as mostly unintentional.
Research published by City University, based on a survey of a representative sample of over 1,000 16 and 17 year olds, found:
That more of them (63%) had seen pornography on social media platforms than on pornographic websites (47%).
That while pornography was more likely to be seen on social media platforms, it was more frequently viewed on pornographic websites.
A 2021 poll of more than 2,100 UK adults, commissioned by the Christian charity CARE, found:
81% of UK adults agree with the statement: ‘The government should implement age verification to protect children from all online pornography’.
79.5% of UK adults agree with the statement: ‘There should be an age limit of 18 years for access to online pornography’.
The 2019 research report We Need to Talk About Pornography from the charity Internet Matters found that:
Nearly half (48%) of parents worry that online pornography gives their kids “improper sex education” and an unrealistic view of normal sex.
Over a third (34%) of parents are concerned their child will become desensitised to brutal or violent content.
53% of mums fear that exposure to pornography will give their kids a poor portrayal of women as subjects of abuse.
Notes to editors:
Since the publication of the draft Bill in May 2021 and following the final report of the Joint Committee in December, the government has listened carefully to the feedback on children’s access to online pornography, in particular stakeholder concerns about pornography on online services not in scope of the bill.
To avoid regulatory duplication, video-on-demand services which fall under Part 4A of the Communications Act will be exempt from the scope of the new provision. These providers are already required under section 368E of the Communications Act to take proportionate measures to ensure children are not normally able to access pornographic content.
The new duty will not capture user-to-user content or search results presented on a search service, as the draft Online Safety Bill already regulates these. Providers of regulated user-to-user services which also carry published (i.e. non-user-generated) pornographic content would be subject to both the existing provisions in the draft Bill and the new proposed duty.