
US court rulings increase pressure on Meta over child safety


A series of legal decisions is drawing renewed attention to how major technology platforms operate. The outcomes highlight ongoing concerns about the balance between innovation, responsibility, and user protection.


A recent jury verdict in New Mexico is adding to the legal challenges facing Meta, as courts increasingly scrutinise how social media platforms handle risks to younger users.

According to the BBC, the jury ordered the company to pay $375 million after concluding it had given users an inaccurate impression of how well children were protected on its platforms.

The decision followed a seven-week trial, where former employees and internal documents were presented as evidence.

Meta has said it will appeal the ruling and highlighted recent safety initiatives, including “Teen Accounts” and tools that notify parents if users search for self-harm content. Still, prosecutors argued those measures did not address deeper structural issues.

New Mexico Attorney General Raúl Torrez called the outcome “historic,” adding: “Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew.”


Jurors were shown research and testimony suggesting that recommendation systems could steer minors toward sexual content or expose them to predatory contact. BBC reports that internal findings indicated a noticeable portion of users encountered unwanted explicit material within a short timeframe.

In recent years, similar lawsuits have been filed across the US, but this case stands out because it directly challenged how the platform’s underlying systems function – not just individual pieces of content.

Attention turns to platform design

A separate case on the US West Coast focuses less on exposure to harmful material and more on how social media is built.

As reported by Danish broadcaster DR, a jury in Los Angeles awarded damages to a 20-year-old woman who argued that her use of platforms like Instagram and YouTube as a child developed into compulsive behaviour that later affected her mental health.

The case examined features such as infinite scrolling and personalised feeds – tools that shape what users see and how long they stay. Critics argue these systems are designed to maximise time spent on the platform.


Miriam Michaelsen, chair of the Media Council for Children and Young People in Denmark, told DR that the ruling is significant because it establishes that companies are aware of these dynamics. She said the judgment makes it clear that platforms rely on elements that can foster dependency.

She also pointed to evidence presented during the case indicating that companies had identified harmful side effects linked to these design choices.

Two cases, one growing legal focus

Together, the two rulings highlight a shift in what courts are willing to examine. The New Mexico case centres on safety failures and exposure to harmful interactions, while the Los Angeles case targets the design of the platforms themselves.

That distinction matters. One is about what users encounter; the other is about why they stay.

According to DR, Michaelsen believes that financial penalties alone may not be enough to force change in business models. But combined with legal pressure and public awareness, they increase the likelihood that companies will have to adjust how their services operate.


Beyond the US, policymakers are already moving in a similar direction. Several countries are considering stricter age limits and tighter rules for platforms used by children, reflecting concerns that current safeguards fall short.

For now, the message from the courts is becoming harder to ignore: responsibility may extend beyond moderating content to rethinking how platforms are designed in the first place.

Sources: BBC, DR
