When the Online Safety Bill was first introduced to the UK parliament in 2022, it was touted as an urgent, vital piece of legislation. It was supposed to prevent children, in particular, from seeing distressing or illegal content, such as terrorist propaganda, pornography or graphic violence. It was not supposed to shut down online forums for hamster enthusiasts. But tragically, that is exactly what has happened.
Under the rules of the Online Safety Act, all online services and social media – from small online communities to giants like Facebook, X and TikTok – must complete lengthy risk assessments. This involves laying out how likely it is that a user will come across illegal content on their platform and how they intend to deal with it. Sites that fail to comply could face fines of up to £18 million or 10 per cent of their annual turnover.
Small, community-led sites have been announcing to users that they will restrict access, introduce sweeping rules or even go offline entirely – all because of the regulations imposed by the new law. Smaller forums, many of which have been around since the early days of the web in the 1990s and early 2000s, have been particularly badly hit. Victims include a group for locals of a small town in Oxfordshire and a cycling forum. One site, a link-sharing forum hosted in Finland, has blocked access for UK visitors, blaming Britain’s ‘Great Firewall’.
A particularly galling example is the Hamster Forum. Describing itself as ‘the home of all things hamstery’, it is the last place you’d expect to feel ‘unsafe’ online. Still, it too has posted a farewell notice to its users, announcing that it is shutting down. ‘While this forum has always been perfectly safe’, the administrator wrote, ‘we were unable to meet’ the compliance requirements of the Online Safety Act. The administrator of the Charlbury in the Cotswolds forum similarly wrote that the law was ‘a huge issue for small sites, both in terms of the hoops that site admins have to jump through, and potential liability’. As a result of the new rules, the forum was being forced to moderate its content more strictly.
Ofcom has waved away the costs of these risk assessments as ‘likely to be negligible or in the small thousands at most’. But ‘small thousands’ is still a monumental amount for people who run these sites largely or entirely for free. Besides, it is an unfathomable waste to spend any amount of money on assessing whether people might come across illegal or harmful content on sites dedicated to sharing hamster-rearing tips or asking when the next bin collection is.
Ofcom, which is responsible for enforcing this law, has promised that it intends to prioritise going after the big platforms, not ‘low-risk services trying to comply in good faith’. It has also said that it ‘will only take action where it is proportionate and appropriate’. But why would administrators take the risk? Ofcom is not exactly known for being even-handed in its application of laws anyway.
It’s not just the hamster and cycling forums that could disappear from the British internet, either. Tech behemoths like Google, Uber and Elon Musk’s X have previously warned that many more businesses could quit Britain rather than bend to the Online Safety Act’s ludicrous requirements. Politicians might think they are ‘heralding a new era of internet safety’, but all they are doing is making the country inhospitable to tech – especially innovative, new tech. Sites like Facebook, which was founded by Mark Zuckerberg and his friends from a college dorm room, would never have got off the ground under the restrictions imposed by the Online Safety Act.
There’s more to come, too. By July, sites hosting porn or other ‘harmful’ content must require UK visitors to verify their age before accessing them. This could involve measures such as uploading photo ID or allowing software to estimate your age from a scan of your face.
Will any of this make the internet safer, anyway? Probably not. For starters, VPNs exist, allowing users in the UK to bypass geographic blocks and access restricted content. Kids in particular, whom this act is supposed to protect from seeing anything nasty, are famed for their ability to get around parental controls and blocks in order to access content they’re not supposed to see. Besides, the worst illegal content isn’t being shared on sites like Facebook or X – and certainly not on the Hamster Forum. It is being shared on the so-called dark web, which is not something that anyone can simply stumble across by accident.
The Online Safety Act is pointlessly punishing online communities that have done nothing wrong. This mess of a law certainly won’t make the internet safer. But it will make it a lot less free – and a lot less hamstery.
Lauren Smith is a staff writer at spiked.