Meghan Markle and Prince Harry Call for Stronger Online Safety Laws to Protect Children
When Meghan Markle took the stage in New York to accept the Humanitarians of the Year Award with Prince Harry, her words struck a deeper chord than expected.
“Our children, Archie and Lilibet, are just six and four, can you believe it?” she said, pausing before adding that while her children are still too young for social media, that day is coming sooner than most parents would like.
The concern is clear: the law is struggling to keep pace with the risks confronting children in the digital sphere.
At the World Mental Health Day Gala, hosted by Project Healthy Minds, the Duke and Duchess of Sussex delivered more than an expression of parental empathy.
Their remarks constituted a clear appeal for stronger legal accountability in the digital era.
Through the Archewell Foundation and its Parents Network initiative, they have positioned themselves at the forefront of a rapidly intensifying policy debate - one that asks a defining question of our time: who should be held responsible when technology built for connection instead inflicts harm?
Turning Advocacy into Legal Pressure
Prince Harry’s remarks were unusually direct. He spoke of corporations and lobbyists “spending tens of millions every year to suppress the truth,” pointing to what he described as “algorithms designed to maximize data collection at any cost.”
His words mirror the arguments appearing in ongoing lawsuits against major social media platforms in the United States, where parents claim companies have prioritized profit over child safety.

Prince Harry cradles his daughter Lilibet (Photo: @meghan Instagram)
Meghan, by contrast, framed the issue through empathy.
She spoke about the impossible balance of “embracing technology’s benefits while safeguarding against its dangers.”
That reflection aligns closely with the emerging duty of care doctrine now debated in several jurisdictions: the idea that digital platforms owe a legal responsibility to minimize foreseeable harm to minors.
Together, the couple’s speeches bridged the emotional and the legal, making the case that parental concern must now be matched by enforceable oversight, not just voluntary ethics.
As their foundation partners with advocacy groups like ParentsTogether, they are building a model of reform that blends community support with a push for policy change.
Law in Focus: How Governments Are Responding
In the United Kingdom, the Online Safety Act 2023 was designed to bring accountability to the tech sector by introducing enforceable duties to identify and mitigate risks to children.
The law, which began rolling out its Children’s Safety Codes of Practice in mid-2025, requires age-assurance systems and prohibits design choices that encourage addictive use.
Platforms that fail to comply face fines of up to £18 million or 10 percent of their global turnover.
Enforcement rests with Ofcom, though critics warn the law risks overreach into areas of free speech and privacy.
Across the Atlantic, the Children’s Online Privacy Protection Act (COPPA) remains the primary U.S. statute governing how websites and apps collect data from users under 13.
It requires verifiable parental consent and limits how personal data can be used or monetized.
Earlier this year, the Federal Trade Commission tightened its enforcement rules to curb companies’ use of children’s behavioral data for targeted advertising.
Meanwhile, a new bipartisan proposal, the Kids Online Safety Act (KOSA), seeks to impose a formal duty of care on social media platforms and compel algorithmic transparency.
The challenge lies in proving harm and causation. When algorithms are opaque and content exposure is personalized, establishing liability becomes legally complex.
Courts are still testing whether platforms can be held responsible for design features that contribute to anxiety, addiction, or self-harm among minors.
The Broader Legal Tension
Legal scholars increasingly view the Sussexes’ advocacy as part of a wider movement to modernize outdated frameworks.
Technology continues to evolve faster than the law’s ability to regulate it, creating complex questions around accountability and duty of care.
Once confined to sectors such as healthcare and product safety, the duty of care now extends to code, algorithms, and content design.

Meghan Markle enjoys a peaceful garden stroll with her two young children. (Photo: @meghan Instagram)
Governments are therefore confronted with a delicate balance: protecting children from harm without overreaching into free expression, privacy, or encryption rights.
The couple’s work also raises practical questions for policymakers and the courts: can a parent sue a platform for emotional distress or loss linked to algorithmic exposure?
Should executives face liability if their systems knowingly amplify harmful content? And, most critically, what constitutes a foreseeable risk in a digital environment where harm is often diffuse and invisible?
These unresolved issues sit at the center of an evolving legal landscape, one the Sussexes have helped bring into sharper public and policy focus.
People Also Ask
What did Meghan Markle and Prince Harry say about online safety for children?
At the World Mental Health Day Gala in New York, Meghan and Harry called for stronger online safety laws, warning that the digital world is evolving faster than legal protections. They emphasized that platforms must be held accountable when technology designed for connection instead causes harm.
What is the Online Safety Act 2023 and how does it protect children?
The UK’s Online Safety Act 2023 introduces binding duties for social media and online platforms to identify, manage, and reduce risks to young users. It requires age-assurance systems, restricts addictive design features, and empowers Ofcom to fine non-compliant companies up to £18 million or 10% of global revenue, whichever is greater.
What legal reforms are being proposed in the United States to protect children online?
In the U.S., lawmakers are advancing the Kids Online Safety Act (KOSA) to establish a statutory duty of care for social media companies. Alongside updates to the Children’s Online Privacy Protection Act (COPPA), the proposal seeks to curb exploitative data collection and make algorithms more transparent.
Why are Meghan and Harry involved in the legal conversation around digital safety?
Through their Archewell Foundation’s Parents Network, the Duke and Duchess of Sussex have become prominent advocates for child digital safety. Their work connects grieving families, mental health experts, and policymakers to push for clearer accountability and stronger laws protecting minors online.
Can parents sue social media companies for harm caused to their children?
Several ongoing lawsuits in the U.S. are testing whether platforms can be held liable for harm linked to algorithmic design and addictive engagement features. These cases could set important precedents for how courts define foreseeable risk and corporate duty of care in the digital age.