Facebook, Instagram and YouTube: Government could fine social media companies for harmful videos and other content
The UK is planning to bring in huge fines for social media and tech companies if they are found not to be doing enough to keep young people safe.
It comes after concerns that tech companies and social media sites like Facebook, Instagram, YouTube and Google aren't trying hard enough to stop children from seeing harmful and inappropriate content, or to protect them from cyber-bullying and fake news.
The government is also trying to tackle harms that could be caused by video-sharing apps and live streaming.
The government says it wants to give powers to a team of experts called a regulator that will set out rules that tech companies will have to follow.
The plans could allow the media regulator - Ofcom - to give out multi-million-pound fines if it decides the platforms aren't doing enough.
It could also have the power to block the sites completely.
The idea of new rules to tackle 'online harms' was originally set out by the Department for Digital, Culture, Media and Sport in May 2018 but now the government's plans are becoming a bit clearer.
It's suggested Ofcom could take charge of the issue from September 2020, and Ofcom has told the 成人快手 that it is ready to take on the powers.
It also said it supports plans to go further and bring in "a wider set of protections, including a duty of care for online companies towards their users."
What are these rules about?
The government wants to hold internet companies to account for what is available on their sites, apps or social networks.
It says that they need to be more responsible for protecting young people - who in some cases aren't supposed to be using their sites.
Some of the content, like videos and websites, isn't appropriate for certain age groups, and the government wants companies to be forced to manage it better.
They want "code of best practice" that social networks and internet companies will have to work to.
As well as Facebook, Twitter and Google, the rules would apply to messaging services like Snapchat and cloud storage services.
The regulator will have the power to fine companies and the company bosses.
It will also be able to make search engines remove links to websites that break its rules.
What is in the "code of best practice"?
Lots of the details are still unclear, but one suggestion for the "code of best practice" is to tackle fake news by forcing social networks to employ fact-checkers and to promote real news sources.
However, regulators are meant to be independent - or outside of control by politicians - so the government will ask the regulator to come up with the rules itself.
The government has also suggested social media companies should give out yearly reports about how much harmful content has been found on their platforms.
What have people been saying?
The child safety charity NSPCC has welcomed the plans, saying: "This is a real chance to bring in protections... and to finally hold sites to account if they put children at risk."
TechUK - the industry group that represents big companies - said it hoped that ministers would take a "balanced" approach to this issue. They want clear and precise definitions of what the rules would be.
The ³ÉÈË¿ìÊÖ has asked Facebook, YouTube, Twitter and Snapchat what they think but they haven't commented yet.
Not everyone agrees with the plans.
Some people worry about who is going to be deciding what's available on the internet and what content or videos will be seen as "harmful".
Campaign group Article 19 warns that the government must not create an environment that encourages the "censorship of legitimate expression".