Will the white paper published to prevent online harm actually be effective?

Friday, 24th May 2019

Stewart Purvis welcomes the recent white paper on online harm but warns of unintended consequences

In May 2018, the Government announced that, later that year, it would publish a white paper “that will cover the full range of online harms”. In September 2018, with no publication date yet in sight, the Financial Times reported that ministers were grappling with how to force technology companies to take more responsibility for online content.

Government intervention was said to be part of an international trend. Germany had introduced fines for platforms that failed to remove hate speech within 24 hours, but the UK would be the first in Europe to go further.

A joint letter, signed by the heads of the BBC, Sky, ITV, Channel 4 and BT, had argued for independent regulatory oversight of content posted on social media platforms. However, the FT reported that “Stewart Purvis, a former Ofcom official, said he has yet to see a workable proposal for increasing oversight of social media companies”.

A year on, we finally have the white paper and I, for one, think the time has been well spent by the DCMS and Home Office on proposals that could indeed be workable. But the focus has now shifted to whether their plan will have unintended consequences that will limit freedom of speech.

The 98-page “Online Harms” white paper goes further than any previous British administration has dared to tread. That “this is a complex and novel area for public policy” is an elegant understatement.

Politicians who once seemed in awe of the tech companies now threaten to “disrupt the business activities of a non-compliant company”, even one based outside the UK.

"The white paper targets companies such as Facebook, Snapchat and YouTube"

The global giants could be fined or banned and their directors held criminally liable. The days when the tech giants could say they were “mere conduits” for the material they distributed seem long gone.

The political momentum for change became unstoppable the month before publication, after what the white paper calls “a co-ordinated cross-platform effort to generate maximum reach of footage of the attack” on mosques in New Zealand, when the gunman live-streamed his shooting on Facebook Live.

The document is full of good reasons why something has to be done. No fewer than 23 “online harms in scope” are listed. Child exploitation and distributing terrorist content top the list.

But many of the harms on the list are already illegal and no new offences are created. Specifically, as Paul Herbert of Goodman Derrick has pointed out, the Government has decided against creating any new offences for hosting illegal or harmful content, which he says would have been a “radical challenge”. No bloggers will go to jail unless it is for something that is already illegal.

Instead, the white paper targets companies such as Facebook, Snapchat and YouTube, which allow users to share or discover user-generated content or interact with each other online. They would have a new statutory duty of care to take more responsibility for the safety of their users and tackle harm caused by content or activity on their services.

A new independent regulator, mostly funded by industry, would enforce it. This approach has been generally welcomed. The tech companies are no longer pushing back against new legal obligations as forcefully as they used to, in public at least. Facebook’s Mark Zuckerberg told Congress in April 2018 that he would welcome regulation, but with the rider that it had to be the right regulation.

The public debate about what the right regulation for the UK would be has focused mostly on the possibility of unintended consequences. Comparisons with North Korean-style censorship have been scattered around rather carelessly, but the Society of Editors (SoE) has correctly identified the potential weak spot in the Government’s ideas. “Where the white paper moves into areas concerning the spread of misinformation – so-called fake news – we should all be concerned,” says the SoE, which asks: “Who will decide what is fake news?”

In his reply, the DCMS Secretary of State, Jeremy Wright, accepted that the breadth of the proposals means that they will affect “organisations of all sizes, including social media platforms, file-hosting sites, public discussion forums, messaging services and search engines”.

But, seeking to reassure the older media, he said, “Journalistic or editorial content will not be affected by the regulatory framework.”

"Focus on protecting users from the most harmful content, “not judging what is true or not”.

The proposed new independent regulator “will not be responsible for policing truth and accuracy online”. Where services are “already well regulated” by bodies such as the press self-regulators Ipso and Impress, Wright has said “we will not duplicate those efforts”.

In Whitehall’s mind, the news world seems to divide between the “real journalism” that comes from what we used to call Fleet Street and the “fake journalism” emanating from the Internet Research Agency of 55 Savushkina Street, St Petersburg.

If only life were so simple. The world has moved on from the days when only journalists did journalism. In the white paper, there are moments when you wonder whether the drafters understand how journalists and non-journalists alike use social media to distribute news and opinion, and how the comment sections on sites can be as important as the original “journalistic” article.

For an example of the simplistic approach, take paragraph 4 of the section of the white paper’s executive summary headed “The problem”. It says: “Social media platforms use algorithms, which can lead to ‘echo chambers’ or ‘filter bubbles’, where a user is presented with only one type of content instead of seeing a range of voices and opinions. This can promote disinformation by ensuring that users do not see rebuttals or other sources that may disagree.”

What about the thousands of single-minded and occasionally bloody-minded partisan voices offering independent commentary that are an essential part of the internet? They do not seek to offer a balanced view of the world, and readers would not expect a right to reply. This paragraph almost sounds like an echo of last year’s recommendation from the otherwise well-informed DCMS Committee that the Government should use the Ofcom rules on impartiality to set standards for online content.

To offset any concerns about possible government restrictions on “freedom of expression online” and “a free, open and secure internet”, there are reassurances in the white paper that seek to go beyond fine words.

The independent regulator – either Ofcom or a new body – will be told to focus on protecting users from the most harmful content, “not judging what is true or not”. If the regulator is to be Ofcom, we can be sure its experience in broadcasting will be valuable in making the expected “difficult judgement calls”.

Ian Murray, executive director of the SoE, says he welcomed the reassurance from the DCMS, “but we must be ever-vigilant of the laws of unintended consequences and what some politicians or a future government may do to use online-harms legislation to restrict freedom of speech”.

There is now a consultation period until 1 July, and vigilance will, indeed, be needed to ensure that, when legislation is finally presented to Parliament, the unintended, the unanticipated and the unforeseen do not flow from what is otherwise a sensible, practical and important law.

 

Stewart Purvis was Ofcom partner for content and standards 2007-10. During that time, he chaired the Digital Britain Media Literacy Working Party and was a member of the Government’s UK Council for Child Internet Safety. He is now a non-executive director of Channel 4 and writes here in a personal capacity.

 
