Safety by design: A turning point for accountability in the digital age

This week may mark a turning point in how we understand responsibility in the digital world.

In the United States, a New Mexico jury ordered Meta to pay $375 million in a case linked to online child sexual exploitation—recognizing that platform features can play a role in enabling harm. In Los Angeles, a court found Meta and YouTube negligent for designs that contribute to youth addiction and mental health harm, drawing attention to the role of infinite scroll, algorithmic recommendations, and engagement loops.

At the same time, rapid developments in artificial intelligence—including OpenAI pausing aspects of Sora—and growing regulatory momentum in Europe around deepfakes are raising urgent questions about how emerging technologies are built and governed.

Taken together, these developments signal a profound shift.

"We are moving beyond the idea that harm online is simply about content or individual bad actors. We are beginning to recognize a deeper truth: design choices shape behavior, exposure, and risk at scale."

For those of us working to end childhood sexual violence, this is not new.

For years, survivors, their families, and advocates have been calling attention to the ways digital environments can enable harm. Through the Brave Movement and Together for Girls, we have consistently heard that the risks children face online are not accidental. They are the result of decisions—about how platforms connect people, how content is discovered, and how engagement is incentivized.

These systems are not neutral. And they can be redesigned.

For the past three years, the Brave Movement—through our Safe Online campaign—has been advocating for safety by design and holding technology companies accountable. This work has been grounded in the lived experiences of survivors, alongside a growing body of evidence showing how digital environments can facilitate sexual violence against children and adolescents.

These recent legal and policy developments did not happen in isolation. They reflect sustained advocacy, evidence, and the courage of activists, experts, and—most of all—survivors, as well as parents who have lost children in the most tragic circumstances. Their leadership has helped shift the narrative from reaction to prevention, from content moderation to system design, from individual responsibility to corporate accountability.

But this moment is only the beginning.

For too long, our responses to online harm have focused on managing risk at the edges—through content moderation, reporting systems, age verification, and parental controls. These measures are important, but they are not sufficient. They do not address the underlying design choices that shape how harm occurs in the first place.

"We cannot moderate our way out of systems that are not designed for safety."

Safety by design means something fundamentally different.

It means building systems that anticipate and prevent harm before it happens. It means introducing friction where risk thrives, limiting pathways that enable exploitation, and questioning engagement models that prioritize attention over wellbeing. It means designing platforms where children are safe by default—not protected as an exception.

We know from decades of research and practice that violence against children is preventable—and that systems, environments, and social norms play a critical role in shaping outcomes. Technology must be held to that same standard.

This is not about limiting innovation. It is about guiding it responsibly.

As technology continues to evolve—particularly with the rapid advancement of artificial intelligence—the stakes are only getting higher. The pace of innovation cannot outstrip our commitment to safety, accountability, and human rights.

The question before us is simple: will we design for harm and try to contain it, or will we design for safety from the start?

At Together for Girls and through the Brave Movement, we believe the answer is clear.

We must center survivor voices. We must follow the evidence. And we must demand that safety be woven into the fabric of technology from the very beginning.

This week’s developments show that change is possible.

Now we must ensure it becomes the norm.
