As part of its manifesto commitments, the government promised to introduce a bill that would strengthen online safety in the UK, particularly for minors. This commitment has now been met with the introduction of the Online Safety Bill, which has passed through all its parliamentary stages and is awaiting Royal Assent.
The bill provides for a new regulatory framework whose overarching purpose is to regulate internet services and make their use safer for individuals in the UK.
The bill places new regulatory duties on social media companies, holding them responsible for the content they host. This landmark legislation is designed to mitigate the harms associated with online interactions, including cyberbullying and the dissemination of illegal content.
What action does the bill require social media companies to take?
Under the bill, social media platforms will be expected to:
- remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm
- prevent children from accessing harmful and age-inappropriate content
- enforce age limits and age-checking measures
- ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments
- provide parents and children with clear and accessible ways to report problems online when they do arise
What penalties does the bill introduce?
The bill provides a framework under which companies can face significant fines if they fail to act swiftly to prevent and remove illegal content, or to protect minors from seeing material that is harmful to them.
Ofcom is to be made the regulator for online safety and will have the power to fine companies up to £18 million, or 10% of their global annual revenue, whichever is higher. By way of illustration, a platform with £1 billion in global annual revenue could face a maximum fine of £100 million, since 10% of its revenue would exceed the £18 million figure. This represents a significant change and echoes the accountability principle set out in current UK data protection legislation.
When the bill receives Royal Assent, Ofcom intends to consult on a set of standards for social media companies to meet in tackling online harms, including child sexual exploitation, fraud, and terrorism.
Further, the bill provides for the directors of these companies to be held directly liable, and they could potentially face prison time for certain offences.
The bill also introduces provisions that will make it easier to convict individuals who share intimate images without consent, and adds new offences covering cyber-flashing and the sharing of non-consensual deepfake pornography.
What are the difficulties with the law?
The bill has been, and remains, particularly controversial. At just over 300 pages it is very comprehensive, although critics have called it long and convoluted.
It is not yet clear how the bill will work alongside human rights protections such as freedom of speech, or how it will coexist with the UK GDPR's protections for privacy and personal data. The messaging service WhatsApp, for example, has already threatened to refuse to comply with powers in the bill that could require it to examine the contents of encrypted messages for illegal content.
Further, Wikipedia, the online encyclopaedia, has said that it will not comply with the requirement to conduct age checks on users, as doing so would violate its commitment to collect minimal data about readers and contributors.
It is not yet clear how this will play out in practice, but it seems likely that the bill will face legal scrutiny in the courts once it comes into force, and that it will have a significant impact on how social media companies operate in the UK.