Social media companies to face big fines for harmful content
New internet laws will be published today with the aim of protecting children online.
Known as the Online Safety Bill, it was officially announced during the Queen's Speech in Parliament on Monday.
Under the new laws, the government plans huge fines for social media companies that fail to remove harmful or upsetting material.
What is covered in the bill and what does it mean?
The bill covers a huge amount of potentially harmful content that you might see online - such as posts promoting abuse or eating disorders.
It also covers online content which is harmful for everyone, not just children, including terrorism, fake news and racist abuse.
It means all social media companies will have a duty of care towards their users so that what is unacceptable offline will also be unacceptable online.
As part of that, companies will need to consider the risks their sites may pose to the youngest and most vulnerable people who use them and protect children from inappropriate content.
How will it work?
The media regulator Ofcom (the Office of Communications) will be in charge. It will explain to companies what they need to do and check that they are following the rules.
If a company breaks the rules, Ofcom can issue fines of up to £18 million or, for bigger companies, 10 percent of the money they make in a year.
Home Secretary Priti Patel said it will "force" social media companies to report material that is harmful to children.
"It's time for tech companies to be held to account and to protect the British people from harm. If they fail to do so, they will face penalties," she said.
Dame Melanie Dawes, Ofcom Chief Executive, said the bill will mean the "benefits of being online" will outweigh the negatives "for children and adults".
Does it go far enough?
The NSPCC (the National Society for the Prevention of Cruelty to Children) has previously said that fines do not go far enough, and has called on the government to make senior managers of social media companies face action in court.
The government said that it's up to Ofcom to decide what action to take "if tech companies fail to live up to their new responsibilities".
Meanwhile, tech businesswoman Belinda Parmar says Ofcom should speak to young people - "those who have actually suffered harm" - as it draws up its rules.
Ofcom promised to "soon say more" about how things will work in practice.
What about free speech?
Freedom of speech is the right to share opinions and ideas without being stopped or punished. Sometimes this is also called freedom of expression.
Jim Killock, executive director of the Open Rights Group, is worried that social media companies will use AI to censor lots of content online, without properly checking whether it is harmful or inappropriate, just to avoid fines.
"Companies will always do whatever they can to minimise financial risk to themselves," he said.
In response, the government said that companies will need to "put in place safeguards for freedom of expression" and have ways for users to appeal if content is removed without good reason.
Users will also be able to appeal directly to Ofcom.
What next?
Before it comes into force, the bill will be checked by a group of MPs (Members of Parliament). The group's chairperson, Julian Knight, said it will press for the bill "to be given top priority".
"Powerful tech companies must be held to account for online harms," he said.