It is 40 years since the BBC broadcast David Attenborough’s Life on Earth, the first major series to offer subtitles.
An RTS Thames Valley event in early December celebrated this anniversary with contributions from subtitling experts and Dawn Jones, a subtitle user.
“I’m exhausted at the end of the day from the effort it takes to engage with real life, so it’s lovely to come home, turn the telly and the subtitles on and relax,” said Jones, who was born hard of hearing.
“Accessibility is not just about giving me access to some random entertainment programme. It actually prevents loneliness… because programmes are subtitled, it meant I could participate in conversations at school with my friends and, today, with my [work] colleagues about the latest [show].”
“Accessibility is part of the BBC by its charter,” said Nigel Megitt, an executive product manager in the corporation’s access services. “When the technological capability existed, we started making [programmes] accessible, and the first thing we did was subtitles.”
By 2008, all programmes on the BBC’s main television services were being subtitled. Currently, some 10% of audiences use subtitles daily.
Hewson Maxwell, head of technology development in access services at Red Bee Media, explained how live TV is subtitled by “re-speaking” specialists as shows go out: “The key challenge is producing accurate text when you have no idea what someone is going to say until they’ve already said it – it’s a real skill.”
Subtitles for pre-recorded shows are close to 100% accurate; on live shows, “there’s always a chance of things coming out wrong”, he said. “You’re trying to say ‘Trump and Pence’ and you get ‘Trump in pants’.”
Dom Bourne, founder of access services provider Take 1, discussed how artificial intelligence and humans “can come together to create subtitling [for] the future”.
Automatic speech recognition-generated captions, he added, need a “human polish to boost quality”.