Knowhow: Online Safety Act 2023
Intended to make the UK “the safest place in the world to be online”, the Online Safety Act 2023 (the “Act”) received Royal Assent on 26 October 2023.
Ofcom's initial analysis suggests more than 100,000 online services could be subject to the new rules.
In this piece, we set out the context to the Act (which the new Labour government has promised to “build on”) as well as what its ongoing implementation will mean for our TRACK network. The Act will have implications for the operators (potentially both customer and supplier side) of websites, platforms and online forums where: (i) information can be shared; (ii) advertising is served; or (iii) users might interact with other users.
As fan engagement continues to be a key driver for rightsholders across sport, such rightsholders and their suppliers should ensure they are tackling issues relating to online content in parallel and in accordance with the Act, alongside other relevant legislation such as the EU’s Digital Services Act.
Overview, objectives and compliance with the Act
In short, companies within the remit of the Act will be required to proactively implement systems and processes to remove illegal content which appears on their services. The Act mandates that platforms provide a higher standard of protection for children and introduces potential significant financial penalties for platforms which fail to comply.
Ofcom is the regulator appointed to enforce the Act in the UK, and companies found to be in breach will face fines of up to £18m or 10% of their annual global turnover (whichever is greater). By way of example, a platform with an annual global turnover of £1bn could therefore face a fine of up to £100m.
In addition to the corporate penalties set out above, there is also a risk of personal liability for senior managers (albeit this is generally focused on senior managers of large tech companies). In cases of particularly egregious or repeated breaches, for example where senior managers have deliberately failed to comply with Ofcom information requests or have otherwise obstructed the regulator, they could face criminal prosecution.
Timeline
12 May 2021
The government publishes a draft of the Online Safety Bill.
2021-2023
The Online Safety Bill is formally introduced to the UK Parliament. This initiates debates in the House of Commons and House of Lords, where MPs and Lords review and suggest amendments to the legislation.
26 October 2023
The Online Safety Bill is passed by Parliament and receives Royal Assent, making it officially law.
10 January 2024
The majority of provisions are brought into force (noting that certain obligations for online service providers will be subject to a phased introduction as set out below).
Some provisions of the Act have already entered into force (for example, Ofcom's powers relating to information gathering). However, the obligations for online service providers will only be implemented fully after the relevant statutory codes of practice come into force following a period of public consultation and approval by the UK Parliament. Ofcom’s “Roadmap” for implementing the Act (available here) assumes that the codes of practice will come into force in phases between Spring 2025 and Spring 2026, as summarised below:
Expected Spring 2025
Phase 1 - illegal harms duties
Consultation closed on 23 February 2024. Following review of the responses, Ofcom plans to publish final versions of the documents by the end of 2024, with the Code of Practice expected to be submitted for Parliamentary approval in Spring 2025, after which the duties will come into force.
Expected end of 2025
Phase 2 - child safety duties and pornography
Consultation closed on 5 March 2024.
Ofcom anticipates that it will publish the final guidance by Summer 2025, with Parliamentary approval to follow by the end of 2025, after which the duties will come into force.
Expected Spring 2026
Phase 3 - transparency, user empowerment, and other duties on categorised platforms
Initial consultation launched on 25 March 2024, to inform wider consultation in early 2025. Ofcom anticipates publishing final codes of practice and guidance towards the end of 2025, with Parliamentary approval expected in Spring 2026, after which the duties will come into force.
What companies does the Act apply to?
The Act applies to the following types of companies:
- user-to-user (U2U) services – e.g., social media, marketplaces, audio and video-sharing services, messaging services and information sharing services;
- search services – e.g., search engines; and
- pornography platforms.
Ofcom published a questionnaire (available here) for businesses to determine if the Act applies to them. For those in our TRACK network operating platforms involving a “user-to-user service” (i.e. an online service that allows its users to interact with each other), it is likely that the Act will apply.
The Act applies to services which are “regulated” (i.e., those which have links with the UK and are not exempt). A service will have a “link with the UK” if any of the following criteria are met:
- the service hosts a significant number of UK users;
- UK users constitute the sole or primary target market for the service; or
- the service is capable of use within the UK, and there are reasonable grounds to believe that the user-generated content on the service presents a material risk of significant harm to individuals within the UK.
It follows that the Act is not restricted to companies and services which are exclusively based in the UK – global companies are also held to account.
Key takeaways for our TRACK network
As set out above, the Act's provisions will encompass websites, platforms and online forums where: (i) information can be shared; (ii) advertising is served; or (iii) users might interact with other users. It is therefore likely to span organisations across a wide range of sectors, from very large and well-resourced organisations to small and medium-sized businesses.
With this in mind, the key takeaways for such platforms operating within the sector are as follows:
a) Duty of care
Platforms that host user-generated content (including social media, chatroom forums and apps) must take steps to minimise harm from illegal content and content harmful to children.
The kinds of illegal content and activity that platforms need to protect users from are set out in the Act, and this includes content relating to:
- child sexual abuse
- controlling or coercive behaviour
- extreme sexual violence
- extreme pornography
- fraud
- racially or religiously aggravated public order offences
- inciting violence
- illegal immigration and people smuggling
- promoting or facilitating suicide
- intimate image abuse
- selling illegal drugs or weapons
- sexual exploitation
- terrorism
Operators may need to review and update content moderation policies and systems to meet compliance standards. This should include implementing the measures recommended in Ofcom's codes of practice and related guidance.
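Purely by way of illustration, the sketch below shows one way a platform might structure a "flag, review, remove and record" workflow for reported content. The category names, data shapes and removal policy are hypothetical assumptions made for the example and are not taken from the Act or from Ofcom's codes of practice.

```typescript
// Illustrative sketch only: a minimal "flag, review, remove, record" loop.
// Category names and the removal policy are hypothetical, not taken from the Act.

type IllegalContentCategory =
  | "terrorism"
  | "child_sexual_abuse"
  | "fraud"
  | "incitement_to_violence"
  | "intimate_image_abuse";

interface ContentFlag {
  contentId: string;
  category: IllegalContentCategory;
  reportedAt: Date;
  source: "user_report" | "automated_scan";
}

interface ModerationRecord {
  contentId: string;
  category: IllegalContentCategory;
  action: "removed" | "escalated_for_review";
  actionedAt: Date;
}

// Categories treated as "remove first, review after" in this hypothetical policy.
const IMMEDIATE_REMOVAL = new Set<IllegalContentCategory>([
  "terrorism",
  "child_sexual_abuse",
]);

function handleFlag(flag: ContentFlag): ModerationRecord {
  const action = IMMEDIATE_REMOVAL.has(flag.category)
    ? "removed"
    : "escalated_for_review";
  // A real system would also notify the reporter, preserve evidence where
  // required, and feed the outcome into transparency reporting.
  return {
    contentId: flag.contentId,
    category: flag.category,
    action,
    actionedAt: new Date(),
  };
}

// Example usage
const record = handleFlag({
  contentId: "post-123",
  category: "fraud",
  reportedAt: new Date(),
  source: "user_report",
});
console.log(record.action); // "escalated_for_review"
```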
b) Chatroom functions
Apps or platforms which involve the use of chatrooms, where users can post live messages, will likely fall within the scope of the provisions relating to “user-generated content”. This means they are subject to the Act's duty to minimise exposure to harmful and illegal content.
Operators which run chatrooms will need to be particularly careful to monitor and moderate conversations to ensure harmful or illegal content (like hate speech, child abuse material, or cyberbullying) is detected and removed.
Live communication in chatrooms can increase the risk of grooming or harassment, so if a platform has a chatroom feature, it will be required to conduct a risk assessment. This involves identifying potential risks that the chatroom could pose, particularly to vulnerable users like children, and implementing systems to manage or mitigate these risks (including offering users easy ways to instantly report harmful messages they have received).
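As a purely illustrative sketch (the ChatMessage and MessageReport shapes and the reportMessage function are hypothetical, not drawn from the Act or from Ofcom guidance), an in-chat reporting flow of the kind described above might look like the following:

```typescript
// Illustrative sketch only: an in-chat "report this message" flow.
// All names and shapes here are hypothetical.

interface ChatMessage {
  id: string;
  roomId: string;
  authorId: string;
  text: string;
  sentAt: Date;
}

interface MessageReport {
  messageId: string;
  reporterId: string;
  reason: "harassment" | "grooming_concern" | "hate_speech" | "other";
  createdAt: Date;
}

// In-memory queue standing in for whatever moderation backlog a platform uses.
const reportQueue: MessageReport[] = [];

function reportMessage(
  message: ChatMessage,
  reporterId: string,
  reason: MessageReport["reason"],
): MessageReport {
  const report: MessageReport = {
    messageId: message.id,
    reporterId,
    reason,
    createdAt: new Date(),
  };
  // Reports are queued for human review; a live service might also hide the
  // message pending review or rate-limit the author.
  reportQueue.push(report);
  return report;
}

// Example usage
reportMessage(
  { id: "msg-1", roomId: "matchday-chat", authorId: "user-42", text: "example", sentAt: new Date() },
  "user-7",
  "harassment",
);
console.log(reportQueue.length); // 1
```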
c) Age verification requirements
Platforms which are “likely to be accessed by children” will be required to implement “robust” age verification measures or other ways to distinguish between adults and minors.
This could well be relevant in the sports context, and if so, operators should consider investing in effective age-checking tools or mechanisms (either themselves or through trusted suppliers) and offer parental control options. Different technologies can be used to check people’s ages online, but websites with age restrictions need to specify in their terms of service what measures they use to prevent underage access and apply these terms consistently.
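The fragment below is a minimal, method-agnostic sketch of how a platform might gate features by age band once an age check (using whatever technology) has returned a result. The AgeAssuranceResult shape, the age bands and the applyAgeGate function are hypothetical assumptions made for the example.

```typescript
// Illustrative sketch only: gating features by age band after an age check.
// The shapes below are hypothetical and method-agnostic: the Act does not
// prescribe a particular age assurance technology.

type AgeBand = "under_13" | "13_to_17" | "adult" | "unknown";

interface AgeAssuranceResult {
  userId: string;
  band: AgeBand;
  checkedBy: string; // e.g. an in-house check or a third-party provider
}

interface FeatureAccess {
  canViewAdultContent: boolean;
  canUsePublicChat: boolean;
  parentalControlsOffered: boolean;
}

function applyAgeGate(result: AgeAssuranceResult): FeatureAccess {
  switch (result.band) {
    case "adult":
      return { canViewAdultContent: true, canUsePublicChat: true, parentalControlsOffered: false };
    case "13_to_17":
      return { canViewAdultContent: false, canUsePublicChat: true, parentalControlsOffered: true };
    // Treat unverified users the same as the youngest band by default.
    case "under_13":
    case "unknown":
      return { canViewAdultContent: false, canUsePublicChat: false, parentalControlsOffered: true };
  }
}
```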
d) Reporting requirements
As part of the drive towards stronger governance and accountability, companies within the scope of the Act will be required to produce transparency reports that explain how they deal with harmful and illegal content on their platforms. These reports will include information about the algorithms used and their effect on users, including children, and will be submitted to Ofcom for review.
Even smaller businesses are expected to meet these reporting requirements, but the frequency and depth may be scaled down for smaller operations to ensure proportionality.
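By way of illustration only, the sketch below aggregates moderation outcomes into a simple internal summary of the kind that could feed into a transparency report. The actual format and content of transparency reports will be determined by Ofcom; every field and name here is a hypothetical assumption.

```typescript
// Illustrative sketch only: aggregating moderation outcomes into a summary
// that could feed an internal transparency report. The real reporting format
// will be set by Ofcom; this structure is hypothetical.

interface ModerationOutcome {
  category: string;        // e.g. "fraud", "harassment"
  action: "removed" | "restricted" | "no_action";
  reportedByChild: boolean;
}

interface TransparencySummary {
  period: string;                           // e.g. "2025-Q1"
  totalReports: number;
  removalsByCategory: Record<string, number>;
  reportsInvolvingChildren: number;
}

function summarise(period: string, outcomes: ModerationOutcome[]): TransparencySummary {
  const removalsByCategory: Record<string, number> = {};
  let reportsInvolvingChildren = 0;
  for (const outcome of outcomes) {
    if (outcome.action === "removed") {
      removalsByCategory[outcome.category] = (removalsByCategory[outcome.category] ?? 0) + 1;
    }
    if (outcome.reportedByChild) reportsInvolvingChildren += 1;
  }
  return {
    period,
    totalReports: outcomes.length,
    removalsByCategory,
    reportsInvolvingChildren,
  };
}
```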
e) "Safe by design"
Reminiscent of the GDPR/UK GDPR principle of “data protection by design and by default”, platforms should incorporate the principle of "safe by design" into their software development processes, to ensure that services are set up and operated in a way that provides a higher standard of protection for children than for adults. In practice, this will involve integrating features which prevent or mitigate harmful content. Small businesses are advised to consider these safety elements at an early stage of designing and operating their software, to avoid having to adapt it retrospectively.
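As a hedged illustration of what "safe by design" defaults might look like in practice (the setting names and values below are hypothetical and are not prescribed by the Act), a platform could choose stricter starting configurations for child accounts at the point of account creation:

```typescript
// Illustrative sketch only: choosing stricter defaults for child accounts at
// account-creation time, rather than bolting protections on later.
// The setting names and values are hypothetical.

interface SafetySettings {
  directMessagesFrom: "everyone" | "contacts_only" | "nobody";
  profileVisibility: "public" | "private";
  sensitiveContentFilter: "off" | "standard" | "strict";
}

function defaultSettings(isChildAccount: boolean): SafetySettings {
  // "Safe by design": the protective configuration is the starting point for
  // children, not an opt-in added after launch.
  return isChildAccount
    ? { directMessagesFrom: "nobody", profileVisibility: "private", sensitiveContentFilter: "strict" }
    : { directMessagesFrom: "contacts_only", profileVisibility: "public", sensitiveContentFilter: "standard" };
}
```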
f) Principle of proportionality
Within the Act, there are certain limited practical exemptions (e.g. educational providers using a service to exercise their educational functions) and proportionality considerations for smaller platforms and businesses. The Act states that measures must be “proportionate and technically feasible” for providers, and obligations may be adjusted based on the size, type and reach of the platform. This may mean that small companies perceived to be lower risk are subject to less extensive regulatory obligations.
Notwithstanding this, service providers must still demonstrate that they are taking sufficient steps to ensure the online safety of their users, regardless of their size.
g) Empowering adult users
Platforms must provide tools that allow users to “increase their control” over their experience, such as filtering harmful content, blocking certain users, or reporting abusive content. These tools will also enable adult users to avoid content they do not want to see.
Small businesses should make sure these user empowerment features are available and easy to use, such as a control permitting users to retain or change the default setting for certain types of content.
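As an illustrative sketch only (the preference names, content labels and filterFeed function are hypothetical), a per-user control of this kind might be applied when assembling a feed, with a default the user can keep or change:

```typescript
// Illustrative sketch only: adult user controls for filtering content types
// and blocking other users, applied when building a feed. Names hypothetical.

type ContentLabel = "none" | "violent" | "abusive";

interface UserPreferences {
  hideLabelled: ContentLabel[];  // content types the user has chosen to filter
  blockedUserIds: string[];
}

interface Post {
  id: string;
  authorId: string;
  label: ContentLabel;
}

// A default setting the user can retain or change.
const DEFAULT_PREFERENCES: UserPreferences = {
  hideLabelled: ["abusive"],
  blockedUserIds: [],
};

function filterFeed(posts: Post[], prefs: UserPreferences = DEFAULT_PREFERENCES): Post[] {
  return posts.filter(
    (post) =>
      !prefs.blockedUserIds.includes(post.authorId) &&
      !prefs.hideLabelled.includes(post.label),
  );
}
```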
h) Procuring suppliers
If your platform outsources certain functions (e.g. content moderation or age verification) to a third-party supplier, that supplier will have responsibility for meeting those specific compliance obligations.
However, it remains the case that the primary operator of the platform will generally be responsible for ensuring that any third-party suppliers used comply with the Act's requirements. It follows that primary operators should ensure they obtain sufficient contractual controls and protections when engaging third-party suppliers, and that responsibilities are allocated appropriately.