
Can European legislation protect children from harmful content online?

The Digital Services Act will crack down on targeted ads and algorithmically promoted content to better protect children on social media

  • EU law cracks down on targeted ads and promoted content
  • Tech companies that fail to remove illegal content face fines
  • DSA could set global example, say child rights campaigners

By Joanna Gill

BRUSSELS, July 12 (Thomson Reuters Foundation) – A new EU law aimed at curbing tech giants could set the benchmark for global legislation to protect children online, as global concern grows about the impact of social media on young people, according to child rights activists.

The bloc’s Digital Services Act (DSA) includes a ban on targeted advertising aimed at children and prohibits the algorithmic promotion of content that could be harmful to minors, such as videos related to eating disorders or self-harm.

Jim Steyer, founder of Common Sense Media, a US nonprofit focused on children and technology, said the law signed by European lawmakers last week could help introduce similar rules for big tech companies elsewhere, including in the United States.

“The DSA is landmark legislation, and what you’re going to see is that it will also lead to similar legislation in the United States,” Steyer said, adding that it could bolster various state-led efforts to regulate social media networks on issues ranging from child safety to political bias.

By imposing heavy fines on companies that fail to remove illegal content – such as child sexual abuse images – from their platforms, the DSA effectively ends an era of voluntary self-regulation by tech companies, the activists said.

“The importance of this legislation is (to say): ‘No, it’s not voluntary, there are certain things you have to do,’” said Daniela Ligiero, co-founder of Brave Movement, an organization run by survivors fighting to end childhood sexual abuse.

“We believe this can not only help protect children in Europe, but also set an example… to the rest of the world,” she added.

Between 2010 and 2020, there was a 9,000% increase in online abuse images, according to the US National Center for Missing and Exploited Children, a nonprofit organization, and COVID-19 lockdowns have led to an increase in reports of online child sexual abuse.

Although detailed European Union regulations on child pornography have yet to be developed, the DSA provides for fines of up to 6% of global turnover for platforms that fail to remove illegal content.

Survivors of child sexual abuse or other online crimes such as so-called revenge pornography say sharing videos or images online forces them to relive the abuse and can have a devastating impact on mental health.

ENFORCEMENT CONCERNS

The new EU legislation brings “needed clarity” for major tech companies and could help boost trust in the digital world, said Siada El Ramly, chief executive of Dot Europe, a lobby group for tech giants including Apple and Google.

She added, however, that tech companies still wanted clarity from regulators on how they should balance user privacy protections with high transparency requirements.

“We can’t be pulled back and forth,” she said.

Despite praise for the legislation from rights activists, there are concerns about its enforcement. The European Commission has set up a task force that around 80 officials are expected to join, a number critics say is insufficient.

Some have pointed to the poor enforcement of the bloc’s privacy rules governing big tech, known as the General Data Protection Regulation (GDPR).

Four years after it came into force, the EU’s data protection watchdog has lamented stalled progress in long-running cases and called for an EU-level body, rather than national agencies, to handle cross-border privacy cases.

But child rights advocates say the speed with which the DSA was agreed shows policymakers are determined to speed up measures designed to protect children using the internet.

The legislation is “one piece of the puzzle”, said Leanda Barrington-Leach, EU affairs manager at 5Rights, a child online safety advocacy group.

It will set the tone for European regulations in areas of particular concern such as artificial intelligence (AI) and child pornography, both of which are currently in the works at EU level.

Barrington-Leach said another key step for Europe would be enshrining an “age-appropriate design code” – a sort of rulebook for designing products and handling children’s data in order to prevent minors from being tracked and profiled online.

Britain pioneered this approach with its Children’s Code, which requires online services to meet 15 design and privacy standards to protect children, such as limiting the collection of their location and other personal data.

Efforts in the United States to pass similar legislation are progressing at a slower pace and facing significant industry pushback.

In Minnesota, for example, a bill that would prevent social media companies from using algorithms to decide what content to show children failed to pass the Senate this year.

But Steyer said a push by California lawmakers to pass a bill enshrining an age-appropriate design code by the end of 2022 could be spurred on by the EU’s lead.

Fundamentally, Barrington-Leach said, the child protection measures contained in the DSA highlight an acceptance of the need for legal safeguards online.

“We keep saying (the kids) are digital natives, they get it all, they’ve got it all sorted. No, they haven’t,” she said.

“The tide is changing and tech companies are realizing that they are now being watched more closely.”

Related stories:

EU eyes Big Tech to control online child sex abuse

“Adults have failed”: young activists fight for digital rights in the United States

OPINION: Digital Services Act: Time for Europe to turn the tide on Big Tech

Our standards: The Thomson Reuters Trust Principles.