Social media firms must ‘tame toxic algorithms’ under new measures to protect children

UK technology firms have been told to ‘tame toxic algorithms’ and implement practical steps for children’s safety online.

The requirements come under Ofcom’s new measures, titled the ‘Children’s Safety Codes of Practice’, which must be followed by social media sites, apps, and search engines.

Ofcom is the government-approved regulatory and competition authority for broadcasting, telecommunications, and postal industries within the United Kingdom.

One of the first elements listed is age checks, with “much greater use of highly-effective age-assurance” needed. Pornography, along with content that promotes suicide, self-harm, or eating disorders, is classed as harmful content.

Dangerous challenges, harmful substances, content inciting hatred against people with certain characteristics, instructions for acts of serious violence, and depictions of real or realistic serious violence against people or animals are also classed as harmful under the UK Online Safety Act.

This should affect all services that do not currently ban harmful content, as they will now be expected to implement age checks to prevent children from seeing it.

Dame Melanie Dawes, the Ofcom Chief Executive, said the measures go “way beyond current industry standards” and aim to “deliver a step-change in online safety for children in the UK”.

“We want children to enjoy life online. But for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. That must change.”

The regulatory watchdog says it “won’t hesitate to use our full range of enforcement powers to hold platforms to account.”

The draft also includes measures to ensure strong accountability for children’s safety within technology firms. Ofcom says this should include having a named person accountable for compliance with the children’s safety duties.

OnlyFans under investigation by Ofcom

The measures were published just days after Ofcom announced on May 1 that it had opened an investigation into OnlyFans.

The regulator is looking at whether the company is doing enough to prevent children from accessing pornography on its site.

While OnlyFans does have age checks in place, Ofcom says it has “reviewed submissions we received” and has “grounds to suspect the platform did not implement its age verification measures in such a way as to sufficiently protect under-18s from pornographic material”.

An update on the investigation is expected in due course.