The UK Online Safety Bill (now the Online Safety Act 2023) is one of the most expansive pieces of internet regulation to date. While much of the media coverage has focused on its application to social media giants and adult content websites, the reality is that its scope is astonishingly broad. Any platform that permits user interaction — even an innocent hobby forum or chat room — is caught in its net.
The most crucial thing to understand is that the Act does not exempt a platform for being small, or for hosting content that is overwhelmingly harmless. Its focus is not on the intention of the platform but on the potential for harm and the existence of user-generated content. This includes:

- Forums and message boards, however niche the topic
- Comment sections and review features
- Chat rooms and group messaging
- Community wikis and image boards
- Multiplayer games with in-game chat
In short: if users can talk, Ofcom is watching.
Under the new regime, platforms have a duty of care to users. These duties include:

- Assessing the risk of users encountering illegal content on the service
- Taking proportionate steps to prevent, and swiftly remove, illegal content
- Assessing whether children are likely to access the service and, if so, protecting them from harmful content
- Providing clear, accessible ways for users to report content and complain
- Setting out, and consistently enforcing, your own terms of service
Importantly, these duties scale with platform size and risk but are not eliminated for smaller sites.
One of the most controversial parts of the legislation is the requirement for age verification and assurance. If your platform is accessible to children and there is a risk of them encountering harmful content, then you are expected to:

- Assess whether children are likely to access your service at all
- Use age verification or age estimation ("age assurance") to establish which users are children, where the risk justifies it
- Prevent children from encountering the most harmful kinds of content, or exclude them from the service entirely
This means that a completely innocent site about drone racing or retro computing might be expected to verify that its users are not children, or at least take steps to shield them from adult interaction, depending on what content emerges.
Even if your site never intended to attract children, you may still have to prove that it doesn’t — or that you’ve taken precautions to protect them.
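To make this concrete, here is a minimal sketch, in Python, of what gating a forum's interactive features behind an age-assurance check might look like for a small operator. Everything here is hypothetical scaffolding invented for illustration: the `Session` class, the `AssuranceStatus` values, and the `requires_age_assurance` decorator are not prescribed by the Act, and a real deployment would rely on a third-party age-assurance provider rather than a locally stored flag.

```python
from dataclasses import dataclass
from enum import Enum, auto
from functools import wraps


class AssuranceStatus(Enum):
    """Hypothetical age-assurance states for a user session."""
    UNKNOWN = auto()        # no check performed yet
    ASSURED_ADULT = auto()  # passed an age check (e.g. via a third party)
    CHILD = auto()          # identified as a child user


@dataclass
class Session:
    user_id: str
    status: AssuranceStatus = AssuranceStatus.UNKNOWN


class AgeGateError(Exception):
    """Raised when a session may not access an age-gated feature."""


def requires_age_assurance(handler):
    """Block access to user-to-user features until age assurance completes.

    Child and unknown sessions are refused here; a real system would
    redirect the unknown case into an age-assurance flow instead.
    """
    @wraps(handler)
    def gated(session: Session, *args, **kwargs):
        if session.status is not AssuranceStatus.ASSURED_ADULT:
            raise AgeGateError(
                f"user {session.user_id} has not completed age assurance"
            )
        return handler(session, *args, **kwargs)
    return gated


@requires_age_assurance
def post_to_forum(session: Session, message: str) -> str:
    return f"posted: {message}"


if __name__ == "__main__":
    adult = Session("alice", AssuranceStatus.ASSURED_ADULT)
    print(post_to_forum(adult, "hello"))   # allowed

    unknown = Session("bob")
    try:
        post_to_forum(unknown, "hi")       # refused until assured
    except AgeGateError as exc:
        print(f"blocked: {exc}")
```

The design point is less the mechanism than the posture: access to interactive features becomes an explicit decision the operator can evidence, rather than the default.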
The Act doesn’t demand pre-moderation (i.e., scanning every post before it's published), but it does expect:

- Swift removal of illegal content once you become aware of it
- Working systems for users to report content and receive a response
- Consistent enforcement of your own terms of service
- Moderation processes proportionate to your platform's size and risk
This means that a casual, hands-off approach to forum moderation is no longer viable. If content breaches the law or your own terms and remains available, you may be in breach of the Act.
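What "systems and processes" might mean in practice for a one-admin forum is easier to see in code. The sketch below models a minimal report queue: every user report is recorded, reviewed, and closed with an auditable outcome. All the names (`Report`, `ReportQueue`, the outcome strings) are invented for illustration; the Act does not mandate any particular data model, only that reports are handled and that action is demonstrable.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional


@dataclass
class Report:
    """A single user report against a piece of content."""
    post_id: str
    reporter_id: str
    reason: str
    received_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )
    outcome: Optional[str] = None       # e.g. "removed", "no action"
    resolved_at: Optional[datetime] = None


class ReportQueue:
    """Minimal review queue: reports go in, audited outcomes come out."""

    def __init__(self) -> None:
        self._open: list[Report] = []
        self._closed: list[Report] = []   # retained as a compliance record

    def submit(self, post_id: str, reporter_id: str, reason: str) -> Report:
        report = Report(post_id, reporter_id, reason)
        self._open.append(report)
        return report

    def resolve(self, report: Report, outcome: str) -> None:
        """Record a moderation decision; the closed list is the audit trail."""
        report.outcome = outcome
        report.resolved_at = datetime.now(timezone.utc)
        self._open.remove(report)
        self._closed.append(report)

    def backlog(self) -> int:
        return len(self._open)


if __name__ == "__main__":
    queue = ReportQueue()
    r = queue.submit("post-42", "user-7", "appears to be illegal content")
    queue.resolve(r, "removed")           # acted on once aware
    print(queue.backlog(), r.outcome)     # 0 removed
```

Even something this simple changes your legal position: "we saw it, we decided, we logged it" is a very different story to tell Ofcom than an inbox nobody reads.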
Ofcom has been given wide-ranging powers to enforce the Act, including:

- Fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater
- Powers to demand information from services and audit their compliance
- "Business disruption" measures, such as requiring payment or advertising providers to withdraw services, or requiring ISPs to block access from the UK
- Criminal liability for senior managers in certain cases, such as failing to comply with information notices
Platforms accessible to children are expected to shield them from content that may be legal but harmful, such as:

- Pornography
- Content promoting suicide, self-harm, or eating disorders
- Bullying and abusive content
- Content depicting or encouraging serious violence
- Content encouraging dangerous stunts or challenges
Even if such content is legal in the UK, if it is deemed harmful to children, it must be mitigated or removed on platforms accessible to them.
Contrary to popular belief, there is no blanket exemption for small platforms or hobby projects. While Ofcom may apply expectations proportionally, all sites with user-generated content fall within scope. This includes:

- Volunteer-run forums and message boards
- Club and society websites with chat or comment features
- Fan communities and special-interest groups
- One-person hobby projects run at a loss
The Act is concerned with function, not fame.
The only real exemptions are for services that do not offer public user-generated content, such as:

- Email, SMS, and MMS services
- One-to-one live aural communications (i.e., voice-only calls)
- Internal business tools, such as workplace intranets
- "Limited functionality" services where the only user content is comments or reviews on the provider's own content
That’s it. Pretty much everything else is in scope.
This is where the Act becomes particularly contentious. The scope of the legislation means:

- A volunteer admin shoulders the same categories of legal duty as a multinational platform
- There is no minimum threshold of users, revenue, or staff below which the duties disappear
- An unincorporated hobbyist is personally the "provider" of the service, and personally accountable
- Many small communities may simply switch off interactive features rather than carry the compliance burden
For a hobbyist running a site out of their garage, this may feel utterly disproportionate — and yet, under the Act, you are accountable.
The Online Safety Act 2023 aims to make the internet safer — especially for children — but its **expansive, catch-all approach** puts an enormous burden on even the most benign online spaces. It's not just the Facebooks and TikToks of the world in the firing line. It's the vintage computing forum, the motorbike maintenance chat group, the amateur radio message board.
What was once considered a safe and quiet corner of the internet may now require:

- A written risk assessment covering illegal content and risks to children
- Age assurance or other access controls
- An active moderation and takedown process
- Reporting and complaints mechanisms
- Records that can demonstrate all of the above to Ofcom on request
Understanding the scope of the Act is crucial for anyone running or using interactive online platforms in the UK. It’s no longer enough to say "we’re too small" or "we're just a hobby" — because legally, that defence no longer exists.
If you operate or manage any platform that includes user interaction, now is the time to review your responsibilities. Ignorance of the Act will not protect you from its enforcement.