Young people should report harmful content online, the communications watchdog has said, after finding that two-thirds have experienced potential harm on social media, but only one in six report it.
Ofcom found that 67% of people aged 13-24 had seen potentially dangerous content online, although only 17% reported it. The regulator is tasked with enforcing measures in the upcoming Online Safety Bill, which will require social media companies to protect children and adults from harm online.
The most common potential harm encountered online was offensive or coarse language (28%), according to respondents to Ofcom’s Online Nation 2022 report, followed by: misinformation (23%); scams, fraud and phishing (22%); unwanted friend or follow requests (21%); and trolling (17%). A further 14% had experienced bullying, abusive behavior and threats online.
Ofcom is launching a campaign with TikTok influencer Lewis Leigh, who rose to fame during lockdown by posting videos of himself teaching his grandmother dance moves. The ‘Only Nans’ campaign will encourage young people to report harmful content they see on social media.
The campaign is also supported by Jo Hemmings, a behavioral psychologist. She said: “People react very differently when they see something dangerous in real life – reporting it to the police or asking a friend, relative or guardian for help – but often take very little action when they see the same thing in the virtual world.”
TikTok deleted more than 85 million pieces of content in the last three months of last year, with nearly 5% of that total coming from user referrals. Instagram removed more than 43 million pieces of content during the same period, more than 6% of which came from users flagging or reporting content.
Anna-Sophie Harling, Head of Online Safety at Ofcom, said: “Our campaign is designed to empower young people to report harmful content when they see it, and we’re ready to hold tech companies accountable for the efficiency with which they react.”
The Online Safety Bill is expected to become law by the end of the year. Ofcom will have the power to impose fines of £18 million or 10% of a company’s worldwide turnover for breaches of the law, which imposes a duty of care on technology companies to protect people from harmful user-generated content. One of the bill’s specific mandates is to ensure that children are not exposed to harmful or inappropriate content.
Andy Burrows, head of child online safety policy at the NSPCC, who called for the bill to be strengthened, said: “This report shows how young people are at increased risk of encountering harmful content but feel unsupported on social media, and either don’t know how to report it or feel like the platforms just won’t take action when they do.”