
YouTube report function not properly protecting children


The report button on social media sites is supposed to help protect you online.

This gives you a way to let the companies running the sites know that something is wrong.

But a BBC Trending investigation has found YouTube's report system hasn't been working properly for more than a year, making it hard for moderators to remove inappropriate comments left on children's videos.

YouTube says it takes child abuse extremely seriously, and reviews the "vast majority" of reported comments within 24 hours.

The site says it has no technical problems with its reporting system. On Wednesday, the company announced new measures to protect children on the site.

There could be up to 100,000 accounts leaving inappropriate comments on videos relating to children.

The Children's Commissioner for England, Anne Longfield, says this is "very worrying".

Last week Prince William launched a special plan to tackle the issue.

How has this happened?

Image caption: This is how you report inappropriate content on YouTube

YouTube is the world's largest video-sharing site, with strict rules about what you can and can't post. But not everyone sticks to these rules.

The site uses algorithms - special computer code - programmed to look out for illegal and inappropriate videos.

It also relies on users to report illegal behaviour or content that goes against its rules, including videos or comments that put children in danger.

This is how you report a comment or video on YouTube:

  • You fill in a special online form identifying the account of the person breaking the rules. The form also asks you to include links to the videos and comments that break the rules.

  • These reports then go to moderators - YouTube employees who review the material and have the power to delete it.

Makes sense. So what's gone wrong?

In some reports submitted by users, these links to the videos and comments were missing.

This made it hard for the YouTube moderators to investigate the reported account, because they had no way of knowing which specific comments were being flagged.
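To make the failure concrete, here is a minimal sketch in Python, purely illustrative and not YouTube's actual system or API - the AbuseReport type, the can_moderator_triage function and the example URL are all hypothetical - showing why a report that arrives without links is so hard to act on.

```python
# A minimal, purely illustrative sketch of the reporting flow described above.
# This is NOT YouTube's real system or API; every name here is hypothetical.
from dataclasses import dataclass, field
from typing import List


@dataclass
class AbuseReport:
    """A hypothetical report as a user might submit it via the online form."""
    reported_account: str                              # account accused of breaking the rules
    links: List[str] = field(default_factory=list)     # links to the offending videos/comments


def can_moderator_triage(report: AbuseReport) -> bool:
    """A moderator can only act quickly if the report points at specific content.

    Without links, the reported account may have thousands of comments, and the
    moderator has no way of knowing which ones were flagged - the failure mode
    the investigation describes.
    """
    return len(report.links) > 0


# Example: the same account reported twice, once with and once without evidence links.
complete = AbuseReport("some-account", ["https://example.com/comment/123"])
broken = AbuseReport("some-account")  # links lost somewhere along the way

print(can_moderator_triage(complete))  # True  -> moderator knows exactly what to review
print(can_moderator_triage(broken))    # False -> report stalls, comments stay up
```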

How was the problem discovered?


Members of YouTube's Trusted Flagger programme - a group that includes individuals, as well as some charities and law enforcement agencies - told the BBC about the issue.

The programme began in 2012 and gives its volunteer members special tools to alert YouTube to content that should be removed.

YouTube says reports from this group are accurate more than 90% of the time.

With the help of a small group of Trusted Flaggers, the BBC identified 28 comments directed at children that were clearly against the site's guidelines.

They were left on YouTube videos posted by young children, and they are exactly the kind of material that should be immediately removed under YouTube's own rules.

Some of the comments made should also have been reported to the police.

These comments were reported using the special form that tells YouTube content is dangerous to children.

Over a period of several weeks, five of the comments were deleted, but no action was taken against the 23 others.

After the BBC contacted the company and provided a full list of these accounts, the accounts were deleted within 24 hours.

What does YouTube say?

A YouTube spokesperson said: "We receive hundreds of thousands of flags of content every day and the vast majority of content flagged for violating our guidelines is reviewed in 24 hours.

"Content that endangers children is unacceptable to us...We are committed to getting this right and recognise we need to do more."

The company says it has systems in place to take action quickly, with dedicated staff reviewing and removing flagged material around the clock and closing the accounts of people who leave inappropriate comments.

YouTube said that in the past week they've disabled comments on thousands of videos and shut down hundreds of accounts.

What next?

Image caption: Children's Commissioner for England Anne Longfield has called the discovery about YouTube "very worrying"

The Children's Commissioner for England, Anne Longfield, says YouTube needs to take action across the world to stop something like this happening again.

"This is a global platform and so the company need to ensure they have a global response.

"There needs to be a company-wide response that absolutely puts child protection as a number one priority, and has the people and mechanisms in place to ensure that no child has been put in an unsafe position while they are using the platform."

The National Crime Agency says it is "vital" online platforms used by children and young people have systems in place to stop, identify and report material that could harm and upset children.