The UK’s ambitious plan to address all online harms
HM Government: Online Harms White Paper
The UK White Paper puts forward a new regulatory framework for tech companies, moving beyond self-regulation. The framework will be overseen by an independent regulator (Ofcom is seen as a “strong candidate”), which will set clear safety standards, backed up by reporting requirements and effective enforcement powers. The government’s aim is to make the UK “the safest place in the world to go online, and the best place to start and grow a digital business”.
Companies in scope will include not only social media platforms but also other services and tools that allow, enable or facilitate users to share or discover user-generated content, or to interact with each other online: file hosting sites, public discussion forums, messaging services, search engines, etc.
Companies will need to be able to show that they are fulfilling their new statutory duty of care. The regulator will set out how to do this in codes of practice, directed by the government on matters involving national security or child sexual exploitation and abuse (CSEA): “It will be important to ensure that Parliament is able to scrutinise the regulator’s work”. The regulator will also work in collaboration with law enforcement and other relevant government agencies to ensure the codes keep pace with the threat.
The new regulatory framework sets high expectations in terms of transparency: online companies will need to give the regulator and/or researchers access to their data on request. They will also need to provide effective, easy-to-access user complaint functions that respond within an appropriate timeframe. The framework envisages designated bodies being able to make “super complaints” to the regulator in order to defend the needs of users.
The regulator may issue substantial fines, impose liability on individual members of senior management, or even order the disruption of business activities or ISP blocking. Nevertheless, the report argues that a liability regime is not the most effective mechanism for driving behavioural change by companies. Instead, companies will be required to have effective and proportionate processes and governance in place to reduce the risk of illegal and harmful activity on their platforms before issues arise.
The regulator will also have broader responsibilities to promote education and awareness-raising about online safety, and to promote the development and adoption of safety technologies to tackle online harms. To accomplish this goal, an online media literacy strategy will be developed. The first step will be a comprehensive mapping exercise to identify what actions are already underway, to increase transparency about the level of investment and the effectiveness of different interventions, and to avoid duplication across industry.
The regulator will have sufficient resources and the right expertise and capability to perform its role effectively, including a legal duty to pay due regard to innovation. It will be funded by industry in the medium term, and the government is exploring options such as fees, charges or a levy to put it on a sustainable footing.
The White Paper provides an initial list of harmful online content or activity based on an assessment of its prevalence and impact on individuals and society, dividing online harms into three categories:
- Harms with a clear definition (Child sexual exploitation and abuse, terrorist content and activity, organised immigration crime, modern slavery, extreme pornography, revenge pornography, harassment and cyberstalking, hate crime, encouraging or assisting suicide, incitement of violence, sale of illegal goods/services, such as drugs and weapons (on the open internet), content illegally uploaded from prisons, sexting of indecent images by under 18s);
- Harms with a less clear definition (Cyberbullying and trolling, extremist content and activity, coercive behaviour, intimidation, disinformation, violent content, advocacy of self-harm, promotion of Female Genital Mutilation (FGM));
- Underage exposure to legal content (Children accessing pornography, children accessing inappropriate material).
Designed addiction is also listed as an emerging challenge, and a number of issues related to online advertising also require attention. Some categories of harmful content or activity online have already received an effective regulatory response and will be excluded from the scope (hacking, the dark net, breaches of data protection).
Compiled by Media 21 Foundation from the Online Harms White Paper