
Can an EU law save children from harmful content online?


Children’s rights campaigners believe a new EU law to rein in tech giants could serve as a benchmark for worldwide legislation to protect children online, as concern grows globally about the impact of social media on young people.

The Digital Services Act (DSA) includes a ban on targeted advertising aimed at children and prohibits the algorithmic promotion of content that could be harmful to minors, such as videos related to eating disorders or self-harm.

Jim Steyer, founder of Common Sense Media, a U.S.-based nonprofit focused on children and tech, said the act signed by European lawmakers could help usher in similar rules for big tech companies elsewhere, including the United States.

“The DSA is a landmark legislation, and what you’re going to see is it will also lead to similar legislation in the United States,” Steyer said, adding that it could bolster various state-led efforts to regulate social media networks on issues ranging from child safety to political bias, writes Joanna Gill of the Thomson Reuters Foundation.

By stipulating hefty fines for firms that fail to remove illegal content – such as child sexual abuse images – from their platforms, the DSA effectively ends an era of voluntary self-regulation by the companies, campaigners said.

“The importance of this legislation is (to say): ‘No, it’s not voluntary, there’s certain things you must do’,” said Daniela Ligiero, co-founder of Brave Movement, a survivor-led organisation fighting to end childhood sexual violence.

“We believe it can not only help protect children in Europe, but then it can serve as an example … to the rest of the world.”

Between 2010 and 2020, there was a 9,000 per cent jump in abuse images online, according to the U.S. National Center for Missing and Exploited Children, a nonprofit, and COVID-19 lockdowns saw a surge in reports of online child sexual abuse. While detailed European Union regulations on child sexual abuse material have yet to be drawn up, the DSA sets out fines of up to six per cent of global turnover for platforms that fail to remove illegal content.

Survivors of child sexual abuse or other online crimes such as so-called revenge porn say the re-sharing of videos or images online forces them to relive the abuse and can have a devastating impact on mental health.

Enforcement worries

Leading tech companies believe the new EU legislation brings “necessary clarity” and could help foster trust in the digital world, said Siada El Ramly, director general at Dot Europe, a lobby group for tech giants including Apple and Google.

She added, however, that tech companies still wanted clarity from regulators on how they should balance protecting users’ privacy with demands for greater transparency.

“We can’t be pulled in both directions,” she said.

Despite praise for the legislation from rights campaigners, there is concern about enforcement. The European Commission has set up a task force of about 80 officials, a number critics say is inadequate.

Some have pointed to the poor enforcement of the bloc’s privacy rules governing big tech, known as the General Data Protection Regulation (GDPR). Four years after it came into force, the EU data protection watchdog lamented the stalled progress in long-running cases and called for an EU-level watchdog, rather than national agencies, to take on cross-border privacy cases.

But children’s rights advocates say the speed with which the DSA was agreed shows policymakers are committed to accelerating measures designed to protect children using the internet.

The legislation is “one piece of the puzzle”, said Leanda Barrington-Leach, head of EU affairs at 5Rights, an advocacy group for children’s online safety. It will set the tone for European regulations on areas of particular concern such as artificial intelligence (AI) and child sexual abuse material, both of which are currently in the works at EU level.

‘Age appropriate design code’

Barrington-Leach said another key step for Europe would be enshrining “the age appropriate design code” – a kind of rule book for designing products and handling children’s data to prevent minors from being tracked and profiled online.

Britain pioneered this approach with its Children’s Code, which requires online services to meet 15 design and privacy standards to protect children, such as limiting collection of their location and other personal data.

U.S. efforts to pass similar legislation are progressing at a slower pace and facing significant industry pushback. In Minnesota, for example, a bill that would prevent social media firms from using algorithms to decide what content to show to children failed to pass the state’s senate this year.

But Steyer said a push by California lawmakers to pass a bill enshrining an age appropriate design code by the end of 2022 could get a boost from the EU’s lead. Crucially, said Barrington-Leach, the child protection measures contained in the DSA highlight an acceptance of the need for legal safeguards online.

“We keep saying (children) are digital natives, they understand all this, they’ve got it all sorted. No they haven’t,” she said.

“The tide is changing and tech companies realise that they’re being looked at more closely now.”

Originally published by the Thomson Reuters Foundation, the charitable arm of Thomson Reuters, which covers the lives of people around the world who struggle to live freely or fairly. 

Author: Thomson Reuters Foundation
