Ofcom says that every minute, 500 hours of content are uploaded to YouTube, 5,000 videos are viewed on TikTok and 695,000 stories are shared on Instagram. It is well understood that this volume of content means misinformation can spread quickly on social media platforms (for example, about vaccines or the war in Ukraine), and Ofcom points out that having the right critical skills and understanding to separate fact from fiction is crucial.
However, Ofcom says that 30% of UK adults who go online are unsure about, or don’t even consider, the truthfulness of online information. A further 6% believe everything they see online - a stat even Forrest Gump might be shocked to hear.
When Ofcom showed people social media posts and profiles to see if they could verify their authenticity, only 22% could.
Ofcom says that support for greater online protection is growing. According to its study, 81% of adult internet users want to see tech firms take responsibility for monitoring content on their sites and apps, and 65% want protection against inappropriate or offensive content.
Ofcom’s research also looked at how children use the internet and has raised several concerns. Despite most social media sites setting a minimum age of 13, 33% of parents of 5-7 year-olds and 60% of parents of 8-11 year-olds said their child has a social media profile. Ofcom also says that some children could be using second accounts to conceal aspects of their online lives from their parents.
More than a third of children reported engaging in potentially risky behaviours that could make it harder for a parent or guardian to keep proper checks on their online use. A fifth surfed in incognito mode or deleted their browsing history, and one in 20 circumvented parental controls put in place to stop them visiting certain apps and sites. Children are also seeing less video content from friends online, and more from brands, celebrities and influencers.
However, children feel positive about the benefits of being online, and many use social media as a force for good. Many follow activists and campaigners and post in support of causes. They also use online services to support their personal wellbeing and many help others to do things online.
The Online Safety Bill was introduced to Parliament on 17 March and has combating disinformation as one of its key aims. The Bill introduces a new regime, to be regulated by Ofcom, that imposes a duty of care on platforms, requiring them to put robust and proportionate measures in place to deal with content that could cause significant physical or psychological harm to children, such as misinformation and disinformation about vaccines. Platforms will also need to set out in their terms of service how they will treat named categories of content which are harmful to adults, including disinformation.
But perhaps we should treat this information from Ofcom with a healthy dose of scepticism - it reminds me of a famous quote by the brilliant Vic Reeves: “88.2% of statistics are made up on the spot.”