The Online Safety Bill – all 247 pages, 212 clauses and 12 schedules of it – reaches the House of Lords on 1st February. But what does it do? And, more importantly, what will the Lords do to it?
It’s a complex Bill, made ever more complex with every Commons stage it has passed through in its long, disrupted journey. The first draft Bill went through pre-legislative scrutiny in the summer of 2021, but the policy foundations stretch back to the Internet Safety Strategy Green Paper of autumn 2017. At its heart is a risk-based regulatory approach, of the kind established in many other sectors, that focuses on companies’ systems and processes. This means a potentially wider range of interventions can be made to improve the online environment, rather than relying solely on the takedown of content. Interventions could take place in the way accounts are created, in the incentives given to content creators, in the way content is spread and in the tools made available to users, before we ever get to content takedown.
Duties in the Bill
The Bill puts a series of duties on regulated companies (those that provide a user-to-user function and search engines; professional porn services are now within the same regime but subject to separate duties), requiring them to carry out risk assessments of harms arising from certain types of content and from the operation of their service. Companies must then put in place effective and proportionate risk mitigation plans. An independent regulator (Ofcom) oversees the regime, providing codes of practice and guidance for regulated companies and enforcing compliance through a sliding scale of measures, up to fines of 10% of annual global turnover or £18m, whichever is greater.
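To illustrate how that enforcement ceiling works in practice, here is a minimal sketch (the function name is purely illustrative; the 10% and £18m figures come from the Bill, with the greater of the two applying):

```python
def max_fine_gbp(annual_global_turnover_gbp: float) -> float:
    """Illustrative ceiling on an Ofcom fine under the Bill:
    the greater of 10% of annual global turnover or a GBP 18m floor."""
    return max(0.10 * annual_global_turnover_gbp, 18_000_000.0)

# A company turning over GBP 1bn a year faces a cap of GBP 100m;
# for a GBP 50m-turnover service, the GBP 18m floor applies instead.
print(f"{max_fine_gbp(1_000_000_000):,.0f}")  # 100,000,000
print(f"{max_fine_gbp(50_000_000):,.0f}")     # 18,000,000
```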
Legal but harmful
With the recent, controversial removal of the “harms to adults” duties (so-called “legal but harmful”), two main duties remain for regulated services: illegal content, including terrorism and child sexual exploitation and abuse material, as well as a series of criminal offences listed in the Bill; and content that is harmful to children. The Bill does not expect services to stop every individual piece of harmful content that falls under these duties, nor to assess every item of content for its potential to cause harm. It requires companies to show that they have done everything reasonably possible to ensure their services don’t facilitate the spread of such content and that, when notified that it is on their service, they have taken action to remove it as swiftly as possible. Despite claims to the contrary, Ofcom, in its own words, “will not censor online content: the Bill does not empower us to adjudicate on individual items of content or accounts.”
Triple Shield
Adults are now, in the language of DCMS, to be protected by a “Triple Shield”: illegal content removed; companies required to enforce their Terms of Service and, if they say they do not allow a type of content or activity on their service, to follow through or be accountable to Ofcom; and enhanced user empowerment tools, allowing users to limit their exposure to certain types of content. The largest and most risky companies (Category 1) have duties to protect both journalistic content and “content of democratic importance”; broadcast and print media have a carve-out. Category 1 companies and search engines also have to comply with a fraudulent advertising duty, and all regulated services have to take account of freedom of expression and privacy. Separately (in Part 5), new rules for pornography providers have been added to prevent children accessing their services.
Journey so far
The Bill hits the Lords after a particularly turbulent stage in its progress: a long pause between July and December, triggered by the Conservative leadership crisis and extended by the ensuing political and economic turbulence. Additional pressure around freedom of expression then led the Government to recommit its new “Triple Shield” clauses to a Commons Committee for scrutiny – something that hasn’t happened for over 20 years. When the Bill had its final Report stage last week, the extent of Conservative backbench support for an amendment to impose criminal liability on senior managers forced the Government into a concession to avoid a rebellion and a damaging defeat. A further concession – on an amendment to prevent platforms sharing footage that shows migrant boats crossing the Channel “in a positive light” – was unexpected and is likely to come under serious challenge in the Lords. Both concessions underline the scale of the challenges facing the Sunak Government as the Bill continues its progress amid further orchestrated political pressure.
So, what can we expect in the Lords?
Well, there’s already a long list of amendments promised by the Government. In addition to the two that made recent headlines, there will be the addition of new criminal offences (facilitating self-harm, intimate image abuse) and new statutory consultees for Ofcom’s codes of practice. A further amendment will require category 1 services to publish summaries of their risk assessments, while the DCMS Minister, Paul Scully, appeared to indicate at Report stage that he would look favourably on the amendment championed by Baroness Kidron, in the wake of the Molly Russell inquest, to enable coroners and bereaved parents to have access to children’s data after their death.
There are also many substantive areas that have been tested to some degree in the Commons but on which the Lords will doubtless home in and table further amendments. These include:
- the impact of the removal of the “harms to adults” duty, in particular whether the “Triple Shield” is sufficient to really protect individual users from an array of serious harms that may fall just below the criminal threshold (such as targeted online abuse, dangerous health misinformation, extremism and incel material). Arguments aired recently in the Commons around the need to reintroduce a risk assessment for such harms, the case for the user empowerment tools to be “on” by default and for minimum standards for terms of service are likely to return;
- the extent of the far-reaching powers granted to the Secretary of State, where we at Carnegie UK have significant and long-held concerns about the scope of the directions that can be given to the regulator, as well as the potential for unjustified meddling in its day-to-day operations. Both would undermine the independence and the legitimacy of the regime;
- the adoption of a code of practice on Violence Against Women and Girls, which Carnegie UK has developed with a coalition of campaigning groups and which is now championed by the former DCMS Secretary of State, Baroness Morgan; and
- the further tightening up of the protections for children – which the Government is now keen to stress is the priority aim of the Bill; there is likely to be significant pressure from the Lords for the Government to indicate what will be on the “primary priority” and “priority” content list of harmful content that companies will need to address under the Bill’s child safety duties, as well as the specifics of the age assurance requirements to prevent under-18s from accessing it.
Areas which have received less attention to date but are likely to rise to the surface in the Lords, in response to campaigners’ concerns, include: encryption; the Bill’s transparency and data access provisions; the lack of measures to address mis- and disinformation or, conversely, to improve media literacy; and the journalism exemptions.
“The technologists that I know care deeply about online safety.
“But protecting young people takes a combination of policy changes and long-term education.
“Technology and tech leaders, of course, have an important part to play in keeping people safe on social media platforms.
“It is vital that those responsible for creating the technology woven into our lives meet the very highest standards of competence, inclusivity, ethics and accountability.
“And that they are able to prove that commitment, for example by being chartered.
“But, if senior manager liability is introduced, it must be balanced with programmes of digital education and advice, so young people and their parents can confidently navigate the risks of social media over a lifetime.”
Next steps
If things go smoothly, the Bill should receive Royal Assent by the summer. But the history of the Online Safety Bill is anything but smooth. One thing we know for certain is that an eighth DCMS Secretary of State will be overseeing it by the time it reaches the Statute Book: the current postholder, Michelle Donelan, is due to go on maternity leave in the spring. Another certainty is that there will continue to be uncertainty over the finer details of the regime for a long time yet, with a long tail of consultations, codes and secondary legislation – valiantly mapped out last summer by Ofcom – before the regime is fully in force. Political and parliamentary delays mean that the regulator is already at least six months behind the schedule it envisaged then, with an end-point now stretching well into 2025. Labour has already pledged to revise the legislation if it wins the next election.
But we are, at least, looking at the start of a new era of digital regulation and a step change in online safety in the UK. [I’ll be back with a further blog in the next few months on what implementation may look like beyond Royal Assent.]
About the author
Maeve Walsh is an Associate with Carnegie UK. Carnegie UK has published extensive analysis and policy papers on the development of a duty of care for social media platforms and on the progress of the Online Safety Bill, and publishes a fortnightly Online Harms newsletter.