The UK’s communications, broadcast, and prospective internet content regulator, Ofcom, has reportedly introduced new regulations for video-sharing platforms operating in the country, such as TikTok, Vimeo, Snapchat, and Twitch.
The new guidelines have reportedly been introduced to protect users, particularly those under 18, from harmful content such as videos or ads encouraging violence against protected groups or promoting hate speech. Child sexual abuse material, terrorist content, xenophobia, and racism are also classed as harmful content.
Speaking on the matter, Ofcom stated that its research shows a third of internet users in the UK claim to have experienced or witnessed hateful content. A quarter of users said they had been exposed to violent or disturbing content, while one in five had seen videos or content that promoted racism, the regulator added.
There is reportedly no prescriptive list of measures that video-sharing platforms must adopt to prevent users from being exposed to harmful content. However, several recommendations have been made, such as the ability for uploaders to declare whether their content contains an ad; user-friendly mechanisms that allow viewers to report or flag harmful content; transparent complaints procedures; and relevant clauses in terms and conditions. Age assurance systems and parental controls are also recommended.
A list of video-sharing platforms that have declared themselves under the scope of Ofcom regulations can be found here.
According to Ofcom, the video-sharing platform (VSP) regime concerns the safety systems and processes used by the platforms. It will not regulate individual videos; however, the prevalence of harmful content on a platform may prompt Ofcom to scrutinize its systems and processes more closely.
Ofcom’s role as an internet content regulator will take full effect in the coming years as the government works to pass legislation enforcing a wide-ranging duty of care on digital service providers. The legislation is expected to require digital service providers to manage user-generated content so that people, especially children, are not exposed to harmful material.