After years in the pipeline, the Online Safety Act finally became law in 2023. Designed to protect online users, particularly children, from dangerous or illegal content, it introduces a number of obligations that websites and other online services need to comply with. In this post, we look at the key provisions of the Act and discuss its implications for website owners.
Making the internet a safer place
Anyone who watches the news will be familiar with the many ways online content can cause harm: encouraging suicide, promoting extremist ideologies, inciting violence, sharing child sexual abuse material and spreading fake news, to name just a few. The Online Safety Act aims to see such harmful content removed from the internet by requiring online service providers to prevent it from appearing and to remove it swiftly if it is detected or reported.
As such, the Act applies to all online services that allow user-generated content or enable user interaction. While the main focus is on social media companies, video-sharing sites and messaging apps, it also covers any website that shares or displays user-generated content or hosts online forums or other ways for users to interact. For example, sites that sell clothing, prints and other products displaying images created by their users will need to ensure that none of those images are illegal or dangerous.
In terms of jurisdiction, the Act applies to the whole of the UK. This means that websites based in other countries but available to users in the UK must also comply with the law. The job of enforcing the Act falls to Ofcom, which has been appointed the independent regulator of online safety, so website owners will need to comply with Ofcom’s codes of practice and guidance. For those that fail to comply, the penalties can be significant: Ofcom can fine companies up to £18 million or 10% of their global revenue, whichever is greater, bring criminal charges against company executives, cut sites off from some types of revenue (e.g. advertising) or even restrict access to the service in the UK.
Are you up-to-date with cookie law? Read: Keeping Your Website EU Compliant Using Cookie Banners
Provider categories
While the Act applies to all online service providers, some of the larger providers will be placed into special categories: Category 1, 2A and 2B. The thresholds for categorisation, and which category a provider falls into, have yet to be confirmed, but they will depend on a platform’s user numbers and features. In practice, categorised providers are likely to be the major social media, messaging, video-sharing and search engine companies, and they will have additional responsibilities depending on their category. These can include producing transparency reports about their online safety measures, providing user empowerment tools (e.g., letting adults block certain types of content) and preventing fraudulent ads.
Keep your website secure, read: Tackling Cybersecurity Threats – Protecting Systems From Malware
Key areas of focus
Though the Act became law in October 2023, it is being implemented in phases, with the requirements concerning illegal content and content harmful to children expected to come into force next year. This is because Ofcom is holding public consultations on these issues, particularly on the practicalities of implementing safeguards, before it releases any final guidance.
While final guidance is still to come, there are some certainties that website owners should already be preparing for. One is that the Act introduces a range of new criminal offences which came into effect in January 2024. These include cyberflashing (sending unsolicited sexual images or videos to people), intimate image abuse (taking or sharing intimate images of other people without their consent, including AI-generated images) and sending false information with the intention of causing harm. To avoid non-compliance, website owners should make sure they are up to date with which online behaviours are considered illegal and thus fall under the Act’s remit.
The Act also requires website owners to take proactive measures to prevent illegal content from being made available on their sites. Rather than simply removing anything that is spotted or reported, this means having protocols in place to stop it from getting online in the first place. In fact, the Act requires website owners to invest in moderation technologies that filter out such content. While large organisations, such as social media companies, have long used advanced content filtering algorithms to do this, there are free solutions available for smaller websites; the SourceForge website maintains a recent list of free content moderation tools.
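To illustrate how such a pre-publication check might be wired in, here is a minimal sketch assuming a Flask backend. The blocklist and the rejection logic are hypothetical placeholders; in practice you would plug in one of the moderation tools mentioned above or a third-party moderation service rather than a hand-written list.

```python
# A minimal sketch of a pre-publication moderation check, assuming a Flask
# backend. BLOCKED_TERMS and passes_moderation() are hypothetical stand-ins
# for whichever moderation tool or service you adopt.
from flask import Flask, request, jsonify

app = Flask(__name__)

# Hypothetical blocklist for illustration only.
BLOCKED_TERMS = {"example-banned-term", "another-banned-term"}

def passes_moderation(text: str) -> bool:
    """Return False if the submission contains any blocked term."""
    lowered = text.lower()
    return not any(term in lowered for term in BLOCKED_TERMS)

@app.route("/comments", methods=["POST"])
def submit_comment():
    text = request.form.get("text", "")
    if not passes_moderation(text):
        # Reject before the content is ever published, rather than relying
        # solely on post-publication reports.
        return jsonify({"status": "rejected",
                        "reason": "content failed moderation checks"}), 422
    # save_comment(text)  # persistence layer omitted from this sketch
    return jsonify({"status": "accepted"}), 201

if __name__ == "__main__":
    app.run(debug=True)
```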
Are you taking advantage of generative AI? Read: Generative AI: What is it and How Can it Benefit Website Owners?
Of course, besides filtering dangerous or illegal content, you will need to put measures in place that enable users to report issues so that offending material can be removed swiftly. You should also update your terms and conditions and user policies, and provide on-page warnings where applicable, to reflect the new law.
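As a rough illustration of the reporting side, the sketch below shows one way a simple report endpoint could look, again assuming a Flask backend. The in-memory store and the notify_moderators() helper are hypothetical stand-ins for a database and your existing moderation workflow.

```python
# A minimal sketch of a user-report endpoint. The in-memory list and
# notify_moderators() are hypothetical; a real site would persist reports
# and feed them into its moderation workflow.
from datetime import datetime, timezone

from flask import Flask, request, jsonify

app = Flask(__name__)
reports = []  # stand-in for a database table of reports

def notify_moderators(report: dict) -> None:
    # Placeholder: email, chat webhook or ticketing system in a real deployment.
    print(f"New report received: {report}")

@app.route("/report", methods=["POST"])
def report_content():
    payload = request.get_json(silent=True) or {}
    content_id = payload.get("content_id")
    reason = payload.get("reason", "unspecified")
    if not content_id:
        return jsonify({"error": "content_id is required"}), 400
    report = {
        "content_id": content_id,
        "reason": reason,
        "received_at": datetime.now(timezone.utc).isoformat(),
    }
    reports.append(report)
    notify_moderators(report)  # so flagged material can be reviewed and removed swiftly
    return jsonify({"status": "received"}), 202
```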
One of the main aims of the Act is to protect children. This means website owners are now responsible for preventing children from accessing harmful or age-inappropriate content. This includes not just pornography, but also content promoting, encouraging or instructing self-harm, suicide, eating disorders, bullying, violence, dangerous stunts or the use of harmful substances. Today, most sites which publish age-restricted content simply ask users to confirm they are over 18 or to input a birth date. Going forward, the Act will require more effective ways of preventing underage users from gaining access, for instance, by using age verification solutions.
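By way of illustration, the sketch below gates age-restricted pages behind a server-side verification check, assuming a Flask app with sessions. The verify_age_with_provider() call is a hypothetical placeholder for whichever third-party age verification service you choose; the Act does not prescribe a particular provider or method.

```python
# A minimal sketch of gating age-restricted pages behind a verification step,
# assuming a Flask app with server-side sessions. verify_age_with_provider()
# is a hypothetical placeholder for an external age verification service.
from functools import wraps

from flask import Flask, redirect, request, session, url_for

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"  # required for session support

def verify_age_with_provider(token: str) -> bool:
    """Hypothetical call to an external age verification provider."""
    return False  # a real integration would validate the token here

def age_verified_required(view):
    @wraps(view)
    def wrapper(*args, **kwargs):
        if not session.get("age_verified"):
            # Send unverified users to the verification flow, not the content.
            return redirect(url_for("verify_age"))
        return view(*args, **kwargs)
    return wrapper

@app.route("/verify-age")
def verify_age():
    token = request.args.get("token", "")
    if token and verify_age_with_provider(token):
        session["age_verified"] = True
        return redirect(url_for("restricted_content"))
    return "Age verification is required before viewing this content.", 403

@app.route("/restricted")
@age_verified_required
def restricted_content():
    return "Age-restricted content.", 200
```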
One of the challenges in protecting children is determining what constitutes harmful content. While some material is easy to identify and remove from your website, other material is not. Dangerous stunts, for example, can be a grey area: should websites remove images or videos promoting tree climbing or sports like parkour (free running), which can be risky pursuits? When it comes to moderating user comments and other text, website owners also need to be aware of how young people use language differently; older adults moderating content can easily miss offensive abbreviations and slang.
Aside from children, the Act also recognises and seeks to address how harmful content disproportionately affects women and girls. As a result, website owners will now need to remove any illegal content flagged by users as harassment, stalking or revenge pornography. Ofcom is currently developing specific guidance for website owners on these and other online harms affecting women and girls.
There are also stipulations in the Act about giving adults more choice about how they access some content. However, these regulations currently only pertain to Category 1 providers, i.e., the major user-to-user online platforms.
Finally, though not relevant to all websites, any site that uses algorithms to display different content to different users must undertake a risk assessment showing how those algorithms may expose users to harmful content. This may require companies to make changes to their algorithms to make their platforms safer.
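There is no single prescribed format for such an assessment, but as a rough sketch of the kind of evidence it might draw on, the example below logs which content categories a (hypothetical) recommendation function serves to each user, so that exposure to potentially harmful categories can be reviewed later. The recommend_for_user() function and the category labels are placeholders for your own recommendation logic and taxonomy.

```python
# A rough sketch of logging what a recommendation algorithm serves to users,
# so exposure to potentially harmful categories can be reviewed as part of a
# risk assessment. recommend_for_user() and the categories are hypothetical.
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("recommendation_audit")

def recommend_for_user(user_id: str) -> list[dict]:
    """Hypothetical recommendation function returning items with categories."""
    return [
        {"item_id": "a1", "category": "sport"},
        {"item_id": "b2", "category": "user_generated_video"},
    ]

def serve_recommendations(user_id: str) -> list[dict]:
    items = recommend_for_user(user_id)
    # Record which categories were shown, for later review against the
    # harmful-content categories identified in the risk assessment.
    logger.info(json.dumps({
        "user_id": user_id,
        "served_at": datetime.now(timezone.utc).isoformat(),
        "categories": sorted({item["category"] for item in items}),
    }))
    return items

if __name__ == "__main__":
    serve_recommendations("demo-user")
```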
Conclusion
The Online Safety Act will ramp up protection for online users, particularly children, women and girls. The responsibility for keeping users safe falls on service providers and the owners of websites that share user-generated content or enable users to interact. Hopefully, this post has given you a clear understanding of what the Act does and what measures you may need to implement. For more information, visit Ofcom’s New Rules for Online Services page.
Looking for a web hosting provider that takes security and compliance seriously? To see our range of hosting plans and to find out more about our secure, managed solutions visit our homepage.