Emilie Silverwood-Cope: Will the Online Safety Act better protect our children?

At a talk I attended recently, a room full of teachers were asked to put their hand up if their time had been taken up by the following social media issues: cyberbullying, threats of violence and pupils seeing sexual content.

Every hand went up and stayed up. This had become such a pervasive issue that one teacher, responsible for pastoral care, said social media-related problems (from violence to mental health) took up her entire day.

Will the Online Safety Act (OSA) make a difference to teachers like this and better protect our children?

The OSA is either a disastrous piece of legislation that erodes freedom of speech or an essential policy needed to protect children from very real harms. While that debate rages, what we can be sure of is that this bill brings an end to the era of self-regulation for Big Tech. Meta can no longer ‘move fast and break things’ (Facebook’s internal motto).

Organisations such as the NSPCC and Bereaved Parents for Online Safety pushed the government for tighter regulations in the hope that children will be better protected. What will change?

The Act does not change the age restrictions for children. These remain at 13 for social media platforms like TikTok and Snapchat (an age I believe is too low even with protections in place). However, it will now be incumbent on these platforms to complete a risk assessment for their users. They must be clear about what risks their product exposes children to and what mitigations they will put in place to reduce harm.

These risks include exposure to inappropriate content and psychological damage, and the Act should better protect children from grooming and abuse. It also requires tech giants to be clearer about the risks their products pose through their very functionality. How vulnerable are children to the techniques used by platforms to hold their attention - such as 'brain hacking' (using their data and online activity to nudge them towards certain types of content)? Can we really protect young brains from the lure of the dopamine hit? Social media platforms have this baked into their business models, so I remain sceptical.

The Act will also mean that these companies, among the wealthiest ever to exist, could face severe punishments if they do not comply. Fines of 10 per cent of global turnover or £18 million (whichever is greater) can now be levied. If bosses are found to have put users at risk, they could even face imprisonment - for allowing child abuse images to be shared, for example, or for exposing children to self-harm images or pornography.

Nick Clegg, Meta's president of global affairs, was asked in an interview on BBC Radio 4's Today programme (on November 1, 2023) about the easily found Instagram Reels depicting suicide attempts. For those who don't know, Instagram Reels are 45-second videos - a feature Instagram copied from TikTok with the sole purpose of attracting teens back to its app from that competitor. Clegg's careful response about these videos was about the 'internet' rather than about 'Instagram'. He also tried to argue that some experts suggest not allowing these images could force the issue 'under the carpet', leaving some users more vulnerable.

This defence was used by Meta in the case of Molly Russell's death, and was dismissed. The coroner Andrew Walker said social media had contributed to Molly Russell's death, stating that she had "died from an act of self-harm whilst suffering from depression and the negative effects of online content". Meanwhile, a BBC investigation showed teenage boys are targeted with images of violence and knives. Teachers have talked about videos being used to intimidate and threaten. The parents of murdered 13-year-old Olly Stephens were among the group petitioning the government for change. Thames Valley Police were clear about the role social media and chat groups played in his murder. The grim list goes on and on.

Maybe we will look back on this generation and ask what on earth we were doing allowing Big Tech to mark their own homework. Maybe the Online Safety Act will be a watershed moment for child safety. Technology secretary Michelle Donelan said the bill was a “game-changing piece of legislation… this government is taking an enormous step forward in our mission to make the UK the safest place in the world to be online.” The world is certainly watching.

In the meantime I'm left wondering what parents think their role should be. A recent survey of 32,000 children aged seven to 11 suggested "almost half used social media sites or apps a few times a week or every day". That's a lot of parents who trust the platforms in their current form, and why shouldn't they? We exist in a world where we expect the checks and balances to have been done.

However, even in this era of helicopter parenting, it’s clear from teachers, bearing the brunt of what children are getting up to online, that restrictions are needed. It seems parents, who are either blissfully unaware or bamboozled by the tech, need the protection of the Online Safety Act too.

Read more Parenting Truths from Emilie Silverwood-Cope every month in the Cambridge Independent.
