Landmark laws to keep children safe online

New internet laws are included in the draft Online Safety Bill to protect children online and tackle some of the worst abuses on social media.

The draft Bill includes changes to put an end to harmful practices, while ushering in a new era of accountability and protections for democratic debate.

Social media sites, websites, apps and other services hosting user-generated content or allowing people to talk to others online must remove and limit the spread of illegal and harmful content such as child sexual abuse, terrorist material and suicide content.

Ofcom will be given the power to fine companies that fail in the new duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher, and will have the power to block access to sites.

The draft Bill will be scrutinised by a joint committee of MPs before a final version is formally introduced to Parliament.

In line with the government’s response to the Online Harms White Paper, all companies in scope will have a duty of care towards their users so that what is unacceptable offline will also be unacceptable online.

They will need to consider the risks their sites may pose to the youngest and most vulnerable people and act to protect children from inappropriate content and harmful activity.

They will need to take robust action to tackle illegal abuse, including swift and effective action against hate crimes, harassment and threats directed at individuals, and keep their promises to users about their standards.

The largest and most popular social media sites (Category 1 services) will need to act on content that is lawful but still harmful, such as abuse that falls below the threshold of a criminal offence, encouragement of self-harm and mis/disinformation. Category 1 platforms will need to state explicitly in their terms and conditions how they will address these legal harms, and Ofcom will hold them to account.

The draft Bill contains reserved powers for Ofcom to pursue criminal action against named senior managers whose companies do not comply with Ofcom’s requests for information. These powers will be introduced if tech companies fail to live up to their new responsibilities. A review will take place at least two years after the new regulatory regime is fully operational.

The final legislation, when introduced to Parliament, will contain provisions that require companies to report child sexual exploitation and abuse (CSEA) content identified on their services. This will ensure companies provide law enforcement with the high-quality information they need to safeguard victims and investigate offenders.

Dr Alex George, The UK Government’s Youth Mental Health Ambassador, said: "This is a landmark moment here in the UK. The problem of online abuse has escalated into a real epidemic which is affecting people physically as well as psychologically and it is time that something is done.

"That’s why I welcome today’s announcement about the Online Safety Bill and the protection it will provide people. Social media companies must play their part in protecting those who consume and engage with their content."
