The future of online safety regulation?

The Online Safety Bill, which is still being amended in Parliament, seeks to introduce safety duties on “providers” of various types of internet services.  These duties are intended to protect the users of such services from specified online harms. 

The Bill first regulates “user-to-user” services.  Broadly speaking, these are services where a user can upload or generate content that can be encountered by another user - such as Facebook and TikTok - subject to a range of exemptions for specified situations and organisations.  The Bill also regulates “search services” such as Google and Bing.  Other internet services are regulated if they publish or display certain pornographic content, subject to some exemptions. 

In part, the Bill is a response to concerns regarding the effect of illegal content, content that is harmful to children and certain other categories of content.  Some of these concerns were outlined by the coroner who conducted the inquest into the tragic death of 14-year-old Molly Russell.

Safety duties

In general terms, the safety duties in the Bill will require providers of regulated user-to-user services to take proportionate steps to:

  • prevent users from encountering specified illegal content (content that amounts to specified “priority offences”);
  • mitigate and manage the risks of the service being used to commit or facilitate priority offences;
  • mitigate and manage the risks of harm to individuals arising from specified illegal content;
  • minimise the length of time for which any “priority” illegal content is present;
  • swiftly take down illegal content when the provider becomes aware of it.

There are additional safety duties on services which are likely to be accessed by children (under 18), focused on preventing access and minimising harm to children from specified content that is “harmful to children”.  The definitions of content that is harmful to children will be set out in as yet unpublished regulations.

The safety duties on search service providers are similar to, but less extensive than, those on providers of user-to-user services.

Other duties

The safety duties are supported by a range of other duties, including, for example, duties to conduct risk assessments, to report child sexual exploitation and abuse content to the National Crime Agency, to have regard to freedom of expression and privacy, and to operate content reporting and complaints procedures.

Additional duties are also imposed on particular high reach / high risk categories of service.  The exact threshold criteria for what constitutes a high reach / high risk service will be published in due course.  Duties placed on providers of high reach user-to-user services will include requirements to remove content in breach of the provider’s own terms and conditions.  Where proportionate, such providers will also be obliged to offer tools allowing users to filter out content from unverified users, or content that is abusive or incites hatred against people with certain specified protected characteristics.

The Bill also requires all non-exempt “internet services” to ensure that children are not normally able to encounter certain pornographic content, for example through age verification.

Exemptions

The Government has estimated that whilst around 25,000 organisations will be regulated, approximately 160,000 will be exempt.

In very broad outline, the exemptions cover matters including:

  • Certain types of communications (email, SMS/MMS messaging and voice-only one-to-one telephony);
  • Specified “low risk” functionality such as certain reviews and comments on content, likes and emojis;
  • “Internal business services” which are internal tools and resources available only to a closed group of people that includes officers of the provider, those who work for it and other persons authorised by the foregoing for the purposes of the provider’s “business” (widely defined to include educational institutions, whether for profit or not).  The Bill indicates that students, pupils, consultants, contractors and auditors are potential examples of persons who might be “authorised”;
  • Services provided by public bodies in the exercise of their public functions;
  • Providers of services with specified education/childcare safeguarding responsibilities including specified further education, school and academy functions, but not higher education.

There are further provisions that may be relevant where only part of a service is exempt, or where a service also publishes certain pornographic content.

Scope, enforcement and implementation

Broadly speaking, the Bill applies to services that are not exempt but which have “links with the UK” either because they have a significant number of UK users, or UK users form one of the target markets for the service.  The Bill also applies where the service is capable of being used in the UK by individuals and there are reasonable grounds to believe that there is a material risk of significant harm to such individuals from specified content on the service.

Ofcom has a range of enforcement powers under the Bill, including the power to issue fines of up to 10% of annual turnover or £18m, whichever is greater.  The Bill also includes a range of other sanctions, including some criminal offences for entities and for individual directors, managers and officers.

The current Parliamentary timetable envisages that the Bill will receive Royal Assent by 20 July.  The exact timetable for the Bill coming into effect will depend on Ofcom consulting on and publishing various codes of practice and guidance, amongst other matters.

It remains to be seen how effective the legislation will be; the UK is one of a number of jurisdictions introducing legislation in this broad area.  There are particular concerns among technology and cybersecurity professionals that aspects of the UK legislation as currently drafted may require private encrypted messaging services such as WhatsApp to scan users’ devices for child abuse content.  This in turn would require those providers to create a weakness in their encryption - a weakness that could then, unintentionally, be exploited by hackers.  It remains to be seen whether the Bill will be further amended on this point.

Whilst higher education institutions may be able to benefit from the “internal business services” exemption, given the complexity of the legislation and the potential sanctions for non-compliance it is likely to be worth undertaking some analysis to consider what “services” are provided and the applicability of the exemptions.  It is also a good time to consider what practical and procedural safeguards are in place where legitimate academic research or study relates to potentially illegal content, to minimise any risks of the institution or individuals infringing the criminal law.

