Ofcom has laid out the measures it will require video-sharing platforms (VSPs) to take to better protect users, the BBC reports.
The VSPs, including TikTok, Snapchat, Vimeo and Twitch, must take “appropriate measures” to protect users from content related to terrorism, child sexual abuse and racism.
A third of users have seen hateful content on such sites, Ofcom says.
The regulator will fine VSPs that breach the guidelines or – in serious cases – suspend the service entirely.
The VSPs will have to:
- provide and effectively enforce clear rules for uploading content
- make the reporting and complaints process easier
- restrict access to adult sites with robust age-verification
Ofcom has promised a report next year on whether the 18 services in scope are taking the appropriate steps.
Specific legal criteria determine whether a service meets the definition of a VSP and whether it falls within UK jurisdiction.
YouTube is expected to fall under the Irish regulatory regime, but once the Online Safety Bill becomes law it will come within the scope of that legislation, which has a much broader remit to tackle online harms on big technology platforms such as Twitter, Facebook and Google.
Ofcom said one of its main priorities in the coming year would be to work with VSPs to reduce the risk of child sexual abuse material being uploaded.
According to the Internet Watch Foundation, the amount of self-generated abuse content rose by 77% in 2020.
Inappropriate material
Ofcom’s job will not involve assessing individual videos.
And it acknowledges that the sheer volume of content will make it impossible to prevent every instance of harm.
But it promised a “rigorous but fair” approach to its new duties.
Chief executive Dame Melanie Dawes said: “Online videos play a huge role in our lives now, particularly for children.
“But many people see hateful, violent or inappropriate material while using them.
“The platforms where these videos are shared now have a legal duty to take steps to protect their users.”