
The government should actually understand the internet if it’s going to protect kids from online harms

This approach to safer internet use for children is as misguided as 1960s fears about TV. We have to shape policy around the lives and perspectives of young people

Amy Orben
Saturday 15 February 2020 16:30 GMT

A world with less child abuse content, terrorist material and fewer self-harm images is a world worth striving for. This week’s publication of the government’s strategy against so-called “online harms” – which would make Ofcom responsible for policing online content – is therefore a long-overdue step. If only it were better informed.

The policy proposal makes it painfully clear how little we currently know about the effects of new technologies; technologies that we, and our children, use for many happy and productive hours every day.

Our current system for understanding and regulating such innovations – the same one used to produce the online harms strategy – is not fit for purpose: it is outpaced by a fast-moving, highly individualised technological space. These are the obstacles holding back our ability to respond assertively to such accelerating technological change.


Firstly, the current focus on screen time is misguided. Sonia Livingstone, a professor at the LSE, makes this point in a report published this week to mark Safer Internet Day: parents’ fears centre on three areas – content, contact and conduct – which have little to do with the duration of “screen time”. The internet now provides children with a greater variety of uses, content and activities than ever, and time is not an appropriate measure for any of them.

The focus of the government’s new policies on “online harms” might, therefore, be a welcome change for parents, the NSPCC and other organisations campaigning for a safer internet.

Yet while it is relatively clear how self-harm images, radicalised content and child pornography are harmful, there could be many other aspects of the online world that are causing individual or general harm: for example, design features, algorithmic biases, and the tracking of behaviour across platforms.

In her report, Livingstone quotes Wilbur Schramm’s 1961 reflections on the early days of television: “For some children, under some conditions, some television is harmful. For some children under the same conditions, or for the same children under other conditions, it may be beneficial. For most children, under most conditions, most television is probably neither particularly harmful nor particularly beneficial”. If we replace “television” with “internet” in this quote, we have an accurate representation of research today.

It is currently impossible to identify anything except the most obvious of online harms. And what might be harmful to some could be beneficial to others.

Had there been a concentrated conversation about this when development began on the Online Harms White Paper two years ago, many pertinent questions would have emerged.

The first of these questions is about access to data. While huge amounts of rich data about our online activities are tracked in real time, these data are owned by companies which have little incentive to make them available for research. Even academic researchers – in the UK or anywhere else – are routinely excluded despite needing the data as raw materials to provide important evidence.

As I have found in my work, the lack of data access means researchers often need to rely on children’s (or parents’) own estimates of their time spent online to understand technology effects. This makes it impossible to provide detailed insights about anything other than “screen time” or other vague notions of time spent on different platforms.

The government wants children growing up in the UK to have the world’s best safeguards against online harms. However, policy makers and regulators need to be furnished with high-quality, objective research.

Academic research is heavily curtailed, and politicians are delaying important decisions as a result. If the regulator doesn’t want to spend the next few decades playing catch-up with the tech giants, this will have to change. We need a much closer relationship between academia and policymaking, along with more initiatives to ensure controlled and ethical data-sharing, transparent practices and real-time collaboration between scientists and the tech industry.

The British Academy, the national body for the humanities and social sciences, says debates over childhood policy currently give us an important opportunity for policy to draw on valuable research and protect the most vulnerable from harm.

The first step is to shape policy around the lives and perspectives of children. Where parents see “screen time”, academics might see a far richer variety of activities that children are engaging in, some harmful and some beneficial: doing homework, Skyping relatives, watching TV programmes, reading horror stories or practising mindfulness meditation.

With more well-rounded research and closer links to policy, we may discover more about the extent to which online risks can lead to harm, as well as understanding the opportunities new technologies provide.

As it stands, research is highlighting that social media and digital technology are not as harmful as often feared. But when a more harmful technology arrives, the current system for understanding and reacting to it would be outmanoeuvred. This is where the real risks lie.

Dr Amy Orben is Emmanuel College research fellow at the University of Cambridge
