Audience safety in the digital age: How to regulate the internet


What should policymakers be doing to regulate the internet? Tara Conlan assesses the options

One of the hottest topics at this year’s conference was internet regulation. Fortuitously, Ofcom had released a discussion document about it on the morning of the RTS event, and its findings were explored in a session devoted to the issue.

The panellists learned from Ofcom that one in five Britons say they’ve been harmed by something they’ve seen online, and 12 million have experienced harassment, fraud or abuse through the medium.

Unregulated online channels such as YouTube are now available on smart TVs, creating a media minefield for millions of parents. Session chair and Sky News deputy political editor Beth Rigby asked how the regulatory playing field could be levelled for broadcasters.

With a potential white paper on internet regulation due by the end of the year, Damian Collins MP said he wanted to see an end to the platform-neutrality status that technology companies enjoyed.

The Chair of the Digital, Culture, Media and Sport Committee also wanted a regulatory system in which, “where there is harmful and illegal content, and the company has been notified of it, they have to take it down within a period of time – and if they don’t, they become liable”.

Collins went on: “We’re going to have to put the right process of regulation in place to deal with problems today and with problems coming down the track.”

He noted that, by the time of the next general election, people could possibly be using editing technologies to publish fake-news videos of politicians making “derogatory remarks”.

Collins added: “The tech companies have a responsibility for the way people use their platforms, particularly when they are aware of it.

"There’s another perfectly legitimate question coming down the track, too: when does YouTube, with its multi­million audience, become a broadcaster?

“More than half of the households in this country have a TV connected to the internet.

“If, as far as the TV viewer is concerned, their favourite YouTube channel, their favourite Netflix series and BBC One are all TV – because they consume it on the same device in the same way – and some of it is very heavily regulated, and some not, is that right?

“Does there need to be a more equitable system of broadcasting [regulation] in the future?”

Thinkbox Chair and Advertising Standards Authority council member Tess Alps said it was important to see the internet as a “public asset that needs to be managed for the benefit of everyone”.

She went on: “Talking about it like the Wild West isn’t very helpful, other than to say that the Wild West was eventually tamed through the community deciding how it wanted the place to be and having impartial sheriffs to administer the rules that the collective agreed on.

“Lots of these internet companies do have their own rules, but that’s how gangs operate.

"What we need is a commonly agreed way of using this fantastic asset, this amazing technology. It then needs to be regulated impartially.”

Rachel Coldicutt, CEO of tech think tank Dot.everyone, said that “the breadth of the internet” complicates the issue. “The thing that’s been going on for ages is: ‘Are you a publisher or not?’”

She suggested that “the platforms need to have their own status… recognised as their own entity, and, if that happens, then a different set of rules should apply to them.”

When asked if freedom of expression could be damaged by internet regulation, Coldicutt said Dot.everyone’s research confirmed that significant numbers of people were unhappy about things they had been exposed to.

Sky policy and public affairs director David Wheeldon later said that “tech companies are very adept at stirring up” fears over free speech, and “mobilising people to oppose any intervention”.

He added: “And yet, most of these firms are editorialising to some extent – restricting what you can say on the basis of their own policies.

“So, isn’t it actually the case that the oversight and transparency of their policies are a means of ensuring that there’s freedom of speech?… Should we not be making that case more loudly, so that they don’t stymie any effort to bring this world into some sort of order?”

People were “not completely free” in the comments that they could make offline in public, Alps pointed out.

Collins agreed and noted that “a lot of what we are talking about here is informing the user and making it harder for people to mislead users”.

He argued that users had to be “empowered” so that they could recognise “disinformation” – for example, when [a website] was not what it claimed to be.


Damian Collins MP and Beth Rigby
(Credit: Paul Hampartsoumian)

“We need algorithm auditing”, proposed Alps, and an “algorithm standards agency” to check that tech companies were not suppressing information, and also to investigate how such companies went about deciding what was true or false.

Rigby asked if new laws were needed, or whether tech companies needed more time to self-regulate.

Alps pointed to the newspaper industry, which “would claim its self-regulation does work”. She conceded that not everyone would agree with that claim, but it was better than the internet: “At least, in newspapers, there is a code and forms of redress.”

However, Coldicutt wondered how self-regulation would work beyond Facebook, Amazon, Apple, Netflix and Google: “It [also] needs to work for smaller tech companies.”

She argued that the way to get better content was “by regulating the business model”. Regulation could target the way people were currently “incentivised” to create algorithm-friendly, click-bait content that could be easily ranked.

Rigby suggested that an alternative to internet regulation would be levelling the playing field by lightening the regulation of broadcasters. Alps said she thought the public would not want that to happen.

She continued: “We’ve been sold this rather romanticised version of what the internet will do for us… when, in reality, it’s acted as a camouflage for very large, quasi-monopolistic, greedy corporations to grow up and not necessarily help society.

“The single biggest disparity is... user-generated content and videos that are not mediated [being] uploaded – that’s really where the harm comes. The companies that enable this… are, to me, publishers. Whatever they do, it’s tantamount to being a broadcaster.”

Alps stressed that tech organisations “algorithmically” had knowledge of harmful, controversial content on their platforms – “which is why [they are] promoting it and ranking it and making money out of it.

“These companies are huge, why aren’t we expecting them to behave responsibly? God knows, they’ve got plenty of money.

"Facebook has a 40% profit margin; the idea that they can’t pay to pre-vet content is just absurd.”

Collins chipped in: “I don’t have a lot of sympathy with these companies. You have to comply with the law in the countries in which you trade.”

He said accountability was important – newspaper readers know they can contact the editor.

But, “if you take issue with a Facebook page that’s being derogatory about you… you go to Facebook and tell them it’s not true, they say, ‘we don’t have to tell you [about it, and] we aren’t responsible for the content – so good luck’.”

He believed that “the threat of legislation” and the creation of a regulatory framework would ensure that tech companies invested in protecting users.

“They curate user experiences and have a responsibility to the user. Most of the harmful content… is already in breach of [the companies’] own community guidelines.”

Alps concluded: “One day, maybe all broadcasting will stop and [go] online”, so this issue was key for Ofcom.

“The advertisers are a very powerful lever in this. Facebook, Google and so on are based on advertising companies and [feel] pressure from advertisers.

“What would be more important is for Ofcom to set standards and set rules, because most advertisers would not wish to support companies that broke those rules. Even if Ofcom couldn’t impose those rules, advertisers could observe them and that would bale out a lot of water.” 


Why Germany imposed new laws

Germany has tried regulating the internet. A new law banning hate speech online came into full force in January. Called the NetzDG law, it gives platforms 24 hours to take down offensive posts or face fines of up to €50m (£44m).

Damian Collins MP said Facebook had removed hundreds of posts since the law was introduced. This proved regulation could work: ‘These companies are more or less public utilities now. They are the gateway into the internet every day and there are certain standards we expect. They can’t be totally neutral about content served on their platform.’


How web ads are regulated

‘Lots of people have said today that advertising online is not regulated. It’s the only bit of the internet in the UK that is properly regulated,’ pointed out Tess Alps of Thinkbox.

She said that the Advertising Standards Authority covers ‘all advertising online… more than half of complaints are about online ads and 88% of complaints upheld are about online ads.

‘The slight problem is that the companies earning profits from online advertising don’t necessarily contribute the money that it takes to regulate them as they should.

‘They’re getting much better… But one of the problems is that they don’t pre-vet the content. That’s the single biggest issue.

‘It’s not good enough to wait for someone to complain about it; the harm’s already done.’


What policymakers should do first

One of the biggest hurdles to internet regulation is that the tech behemoths are global players. Moreover, the sheer volume of content being uploaded to the likes of YouTube, or available live on platforms such as Instagram or Microsoft’s Mixer, is problematic.

Pre-vetting is a big issue, said Thinkbox’s Tess Alps, but she pointed out that traditional TV companies did it and were responsible for a huge volume of content around the world. It cost broadcasters millions of pounds every year in compliance procedures. Alps stressed that the technology companies could afford it.

Damian Collins MP highlighted how the tech firms used machine learning to monitor adverts, so that they could also use it ‘to keep you safe [and] identify… harmful sorts of content, which could then be referred to human [moderators]’.

He dismissed the idea that it could not be done (‘This sector could easily find a way!’), and added that, if people uploaded pirated video of football, it came down pretty quickly, ‘because there are pretty big commercial incentives to stop that happening’.

He raised the idea of an independent body that could ensure the tech firms were meeting the standards that were set.

Rachel Coldicutt of Dot.everyone said she wanted operating systems where ‘you choose at set-up’ whether a media device would operate, for example, ‘in safety mode for a 12-year-old… Apple and Google could be doing that’.

The Information Commissioner’s Office and Ofcom had ‘different, complementary roles to play’ in regulating the internet, noted Collins. The ICO and Ofcom should be resourced well enough to carry out those roles, added Coldicutt.

Collins agreed and said he thought that tech companies should play a part in funding this.


Session Ten, ‘How to regulate the internet – audience protection in a digital age’, was chaired by Beth Rigby, deputy political editor, Sky News. The panellists were: Tess Alps, Chair, Thinkbox; Rachel Coldicutt, CEO, Dot.everyone; and Damian Collins MP, Chair, DCMS Committee. The producers were Ali Law and Sky News.
