Over the past few weeks I’ve discussed ‘life online’ with students at a number of our local secondary schools.

We’ve discussed the amount of time people spend online now, the platforms they use, and their views on the benefits and downsides of each.

And we’ve discussed what they would like to see in law or regulation to help keep users safe. 

As a parent, I share many of the concerns of others about the risks posed to young people through their use of, and exposure to, social media.

I was, however, struck by the understanding and maturity of the students’ views on those risks.

The internet and social media have, obviously, changed our world. We engage with each other differently as a result. We access information and news in completely new ways, and how we entertain ourselves has altered beyond recognition.

As these students recognised themselves, there are many positive experiences to be had online, but also dangers for people, and particularly for young people. 

It was clear from our discussions that the students valued the immediacy and contact that social media offers: direct communication with parents, friends, school and other communities.

But they were also aware of the dangers of miscommunication, disinformation and access to inappropriate content.

The return of the Online Safety Bill to the House this week is another step towards greater protection online, yet it also highlights once again the complexity facing legislators.

The Bill has taken time to develop, with the necessary evidence gathering and consultation, as well as the right and proper scrutiny afforded by due parliamentary process.

In terms of protection for children, the Bill sets out more clearly the existing expectations on platforms to understand the age of their users. Platforms that specify a minimum age for users will be required to explain in their terms of service the measures they use to enforce it. Firms will also need to publish information about the risks their platform poses to children, and must show how they enforce their age limits so that children cannot bypass their authentication methods.

This will enable users and parents to understand the risks and each platform’s approach to children’s safety.

Firms must also publish details of any enforcement action Ofcom takes against them, and companies will face fines of up to ten per cent of global turnover for failing to adhere to these measures.

The Children’s Commissioner will be named as a statutory consultee for Ofcom in its development of the codes of practice, to ensure the measures relating to children are robust and reflect the concerns of parents.

A broader area of debate on the Online Safety Bill has been the issue of ‘legal but harmful’ content, which of course poses its own challenge in terms of protecting adult freedom of speech.

The problem is, of course, that the law only really deals in ‘legal or illegal’, and if something is legal, well… it’s legal, and that’s that. Yet it can be very hard to nail down robust definitions of some of the things found online that cause real harm.

Such material has always existed, but in the past it was hard to come by, and so most people never encountered it. On the internet it is just a couple of clicks away, and on occasion it reaches users who were not looking for it at all.

In the Online Safety Bill as it is now before parliament, there will be a ‘triple shield’.

First, the Bill reaffirms the requirement that illegal content be removed. It includes a number of priority offences, and companies must proactively prevent users from encountering such content.

Secondly, legal content that a platform prohibits in its own terms of service should be removed, so there is clear accountability for platform owners not only to define such content but also to remove anything that breaches their terms of service.

And thirdly, there will be a duty on platforms to provide users with greater functionality to control their exposure to unsolicited content.  

These functions should not limit discussion, robust debate or support groups’ ability to speak about issues freely, but enable adults to reduce the likelihood that they will see certain categories of content if they so choose.  

Legislation is not the only answer, however, and it remains important for people to stay informed and to understand, for instance, the terms of service that individual platforms offer. 

Parents clearly have a vital role in helping young people navigate the online world, and taking time to talk about it is a great place to start.