A new requirement to implement highly effective age checks has caused a particular furore in recent days, with a petition to repeal the Online Safety Act (OSA) attracting almost 400,000 signatures. Nigel Farage has promised to repeal the OSA if Reform win power, telling the Financial Times that “it begins to look as though state suppression of genuine free speech may be upon us already.” Our article aims to cut through the noise and equip you with a more practical perspective on the OSA.
The 355 pages of the OSA contain numerous new rules and regulations that aim to make the UK the safest place in the world to be online. This article explores the elements of those regulations that apply to video game companies and explains what you need to do to comply.
The regulator
Enter Ofcom, the UK’s regulator for communications, and its new powers to regulate the safety of online services. In the years to come, Ofcom will look into how gaming platforms protect users, particularly children, from harmful content.
The regulated
Your online gaming product is likely to fall within the scope of the OSA if it has functionality allowing users to encounter content shared, uploaded or generated by other users. For example:
- Text or voice chat, which is likely to cover many FPS and battle royale games with team chat functions, as well as MMORPGs with large, server-wide text channels
- Games that focus on players creating and sharing their own content, such as Roblox, Minecraft, and other sandbox games
- Games with inbuilt livestreaming, forums or marketplaces which facilitate user-to-user interactions.
It must also be likely that your game will be used by people in the United Kingdom (but this test is relatively easy to pass).
There are some exemptions which will apply for games that only allow limited interaction between players – such as where the communication is limited to commenting or responding to content you (the developer/publisher) have created, or interacting with other people’s comments or with your content by using ‘like’ buttons, emoji or ratings. However, similar interactions with content created by other players will not be exempt. A game would also be exempt if the only possible communication between players was 1-to-1 live audio communication, but this is unlikely to apply to many games.
The regulations
The key things to be aware of are your new duties to prevent gamers from encountering various kinds of illegal content generated by other users of the game.
In some cases, taking action when illegal content comes to your attention will be sufficient, but in others, proactive monitoring will be required. This is the case for so-called ‘priority illegal content’, which covers things like terrorism, child sexual exploitation and abuse, human trafficking, assisting or encouraging suicide, intimate image abuse and animal cruelty.
Proactive monitoring is also required for content that the Online Safety Act says is particularly harmful for children. This includes:
- Pornographic content
- Content encouraging, promoting or providing instructions for suicide, self-harm or eating disorders
- Abuse and incitement to hatred based on race, religion, sex, sexual orientation, disability or gender reassignment
- Bullying content
- Content promoting challenges or stunts likely to result in serious injury, or encouraging users to inject, ingest or inhale physically harmful substances.
Ofcom requires that regulated gaming companies carry out risk assessments to determine the likelihood of adult and child gamers encountering the kinds of illegal content outlined above. These risk assessments must be “suitable and sufficient” – what that means depends on things like (1) the size of your user base, (2) how likely it is that your users will encounter illegal content, and (3) how quickly and widely that content can spread. This is not a one-off exercise; risk assessments should be reviewed periodically.
There are also “safety duties”, which require you to take “proportionate measures” relating to the design and the operation of your game to prevent gamers from encountering the kinds of content mentioned above. The safety duties will require you to do things like:
- design your game’s functionality to steer users away from priority illegal content
- update your game’s terms of use
- moderate content that users post when playing your game
- offer gamers functionality to control what kind of content they encounter (eg by reporting the kind of content mentioned above). An illustrative sketch of what a simple reporting flow might look like is set out below.
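To make that last point more concrete, the sketch below shows one way a studio might structure an in-game reporting flow. It is illustrative only: the category names, interfaces and escalation logic are our own assumptions rather than anything prescribed by the OSA or Ofcom.

```typescript
// Purely illustrative sketch of an in-game content reporting flow.
// Category names and interfaces are hypothetical, not terms defined
// by the OSA or prescribed by Ofcom.

type ReportCategory =
  | "terrorism"
  | "child-sexual-abuse"
  | "self-harm-or-suicide"
  | "hate-or-abuse"
  | "bullying"
  | "other";

interface PlayerReport {
  reporterId: string;        // player submitting the report
  reportedContentId: string; // chat message, user-made level, stream clip, etc.
  category: ReportCategory;
  comment?: string;          // optional free-text context
  submittedAt: Date;
}

interface ModerationQueue {
  enqueue(report: PlayerReport): Promise<void>;
}

// Routes a report into a moderation queue, flagging categories that map to
// 'priority illegal content' so they are reviewed first. The routing and
// escalation rules here are design choices, not legal requirements.
async function submitReport(queue: ModerationQueue, report: PlayerReport): Promise<void> {
  const priorityCategories: ReportCategory[] = [
    "terrorism",
    "child-sexual-abuse",
    "self-harm-or-suicide",
  ];
  if (priorityCategories.includes(report.category)) {
    // A real system might also hide the content pending review and
    // escalate straight to a human moderator.
    console.warn(`Priority report ${report.reportedContentId} escalated for urgent review`);
  }
  await queue.enqueue(report);
}
```

The underlying point is simply that reporting should be easy to reach in-game and that reports of the most serious content should be dealt with first; how you achieve that technically is up to you.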
The above is a politically charged subject. Some feel that the duties to keep users safe online have gone too far.
Consequences of breaching the regulations
Ofcom has the power to issue fines of up to £18m or 10% of qualifying worldwide revenue (whichever is greater) in response to the most serious breaches of the Online Safety Act.
Complaints procedures
For some time, many multiplayer games have had community standards, acceptable use policies, and reporting mechanisms, although the accessibility and effectiveness of these has varied. Take your average multiplayer online game and see how easy it is to report a fellow player for a variety of reasons. Now, think about how easy it is to complain to the game's publisher.
Ofcom has new enforcement, investigation and information-gathering powers to look into how effective the complaints systems of all online services are. Gaming publishers are required to ensure that their game’s moderation, reporting and complaints procedures are effective.
We can help to advise you on whether your existing procedures are compliant.
Yes or no: Are you over the age of 13?
The OSA highlights that it is the duty of gaming providers to look at how their services function and operate as part of these risk assessments, and as part of preventing users from encountering illegal, harmful and priority offence content.
Many gaming providers produce cross-generational games. A PEGI classification on the front of a game (for example, PEGI 12 or PEGI 18) does not seem robust enough in the eyes of Ofcom and legislators. This view is understandable given that the most common measures – age-gating (entering your date of birth or ticking a box stating you are over a specific age), using third-party verification providers like Yoti, and parental controls – all have their own limitations.
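To illustrate why simple age-gating is regarded as weak, here is a minimal sketch of a self-declaration check. The function name and threshold are our own assumptions; nothing here reflects Ofcom's criteria for "highly effective" age assurance.

```typescript
// Purely illustrative: a naive self-declaration age gate of the kind
// described above. Names and thresholds are hypothetical.

function isAtLeast(age: number, dateOfBirth: Date, now: Date = new Date()): boolean {
  const cutoff = new Date(now);
  cutoff.setFullYear(cutoff.getFullYear() - age);
  return dateOfBirth.getTime() <= cutoff.getTime();
}

// The obvious limitation: the service only sees whatever date the player
// chooses to enter, so a child can pass simply by typing an earlier year.
const claimedDateOfBirth = new Date("1990-01-01");
console.log(isAtLeast(18, claimedDateOfBirth)); // true, regardless of the player's real age
```

This is the gap that the stronger age assurance techniques discussed below are intended to close.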
As part of its role as the online safety regulator, Ofcom set a deadline earlier this year for services to complete a child access assessment. All providers were required to look at their age assurance measures (verification or estimation) to assess whether a child (someone under the age of 18) would be able to access their service. On 25 July 2025, the children’s safety duties under the OSA came into force.
Ofcom advises that reusable ID services, facial age estimation and photo-ID matching are highly effective and preferable to payment methods (which do not require a user to be 18) or relying only on user terms that set a minimum player age.
If their games can be accessed by children, gaming providers will need to explore how they can ensure that players are the age they say they are, considering data protection and ethical issues closely. We can help advise you on whether your existing process is compliant with the new regulations.
What are other gaming companies doing?
Some publicly available examples of safety and moderation features include:
- Epic Games: Epic Games DSA Transparency Report (17 Feb 2024 – 16 Feb 2025)
- Roblox: Safety & Civility at Roblox – Roblox Support
- Nintendo: Nintendo Online Safety
The above give an indication of the steps large gaming companies are taking to adapt. Amongst other things, Epic Games says it has:
- Created ‘Cabined Accounts’, where players under the age of 13 or their country’s age of digital consent can still play Fortnite, Rocket League, and Fall Guys, but minus access to certain features such as voice and text chat
- Implemented more than a dozen granular controls and settings to let parents and guardians manage access to social features such as voice and text chat. An illustrative sketch of what such settings might look like is set out below.
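By way of illustration only, the sketch below shows the sort of shape a studio might give to guardian-managed social settings. It is loosely inspired by the kinds of controls Epic describes, but the field names and defaults are our own assumptions, not Epic's actual systems.

```typescript
// Illustrative only: a possible shape for parent/guardian-managed social
// settings. Field names and defaults are hypothetical assumptions.

interface SocialPermissions {
  voiceChat: "off" | "friends-only" | "everyone";
  textChat: "off" | "friends-only" | "everyone";
  userGeneratedContent: boolean;     // can the child see content made by other players?
  purchaseApprovalRequired: boolean; // must a guardian approve spending?
}

// Most restrictive defaults for an account flagged as belonging to a minor,
// until a verified parent or guardian chooses to loosen them.
const minorDefaults: SocialPermissions = {
  voiceChat: "off",
  textChat: "off",
  userGeneratedContent: false,
  purchaseApprovalRequired: true,
};

function applyGuardianChoice(
  current: SocialPermissions,
  changes: Partial<SocialPermissions>
): SocialPermissions {
  // In a real system this would only run after the guardian's identity
  // and relationship to the account had been verified.
  return { ...current, ...changes };
}

// Example: a guardian allows text chat with friends only.
const loosened = applyGuardianChoice(minorDefaults, { textChat: "friends-only" });
console.log(loosened);
```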
Smaller companies may not be required to go to the same lengths as larger companies to comply with the OSA, but that emphatically does not mean it would be safe for them to ignore the new law entirely.
What’s next?
The key provisions in the Online Safety Act have been coming into force over the course of this year. Most are now in place, and the regulator Ofcom is beginning to flex its muscles and take action against companies that haven't taken steps to comply with the Online Safety Act. This isn't an issue that is confined to the UK – Europe’s Digital Services Act contains similar provisions which will affect gaming companies.
Please get in touch with one of Mills & Reeve’s media & entertainment lawyers if you'd like to discuss any of the issues raised in this article.
Our content explained
Every piece of content we create is correct on the date it’s published but please don’t rely on it as legal advice. If you’d like to speak to us about your own legal requirements, please contact one of our expert lawyers.