There’s an online forum that I sincerely hope you’ve never encountered, and one I hope your loved ones never find. It boasts tens of thousands of members and has generated millions of posts, all centered on one haunting topic: suicide.
But it is more than just a place for discussion. It provides detailed instructions on methods, hosts threads where users live-post their own deaths, and even features a “hook-up” section for members seeking partners to die alongside. The forum actively encourages, promotes, and normalizes suicide, and its audience is largely young, troubled, and vulnerable.
This unsettling platform underscores the challenges of online regulation, especially in light of the Online Safety Act (OSA), which has just passed its first anniversary, having received Royal Assent on October 26, 2023. The act aims to make the UK the safest place in the world to be online.
Recognizing the dangers posed by such forums, Ofcom, the UK’s online and broadcasting regulator, wrote to the site following our reports last November. It warned that, once the OSA is fully in force, the forum would be breaking UK law by promoting suicide and self-harm, and it urged the forum’s administrators to change course or face serious consequences down the line.
Although the forum appears to be based in the U.S., details about its administrators and server locations remain elusive. In response to Ofcom’s warnings, the site blocked UK users, though the “block” lasted a mere two days before being lifted. The forum remains accessible to young people in the UK, and alarming evidence from my research indicates that at least five British children have died after interacting with the site.
Privately, Ofcom concedes that smaller, foreign websites with anonymous users may be beyond its reach even when the OSA is in full effect. Bigger tech companies, however, may find it increasingly difficult to disregard the new legal responsibilities the OSA imposes on platforms operating within the UK. Importantly, before Ofcom can enforce these regulations, it must hold public consultations on codes of practice and guidance. We are still in that consultation phase, and for now Ofcom’s real power lies in the looming threat it poses to tech firms.
Even though the law has yet to take effect, it already appears to be driving significant change. The sweeping legislation tackles issues ranging from access to pornography and terrorism-related content to the spread of fake news, with children’s safety at its core.
The OSA is the culmination of years of effort, spanning five prime ministers and at least six digital ministers. Its inception was spurred by growing public and parliamentary awareness of the immense influence wielded by social media platforms without adequate accountability.
A pivotal moment was the tragic case of Molly Russell, a schoolgirl whose story galvanized Parliament into action. Molly’s experience resonated deeply; she could have been anyone’s daughter, sister, niece, or friend.
At just 14 years old, Molly took her own life in November 2017 after being inundated with dark and disturbing content on Instagram and, to a lesser degree, Pinterest. The coroner’s inquest found that social media had contributed to her death in a “more than minimal” way, citing the distressing content that had been fed to her.
In the wake of this tragedy, her father, Ian Russell, emerged as a passionate advocate for change, demanding regulation to rein in the power of Silicon Valley. He lobbied Parliament and reached out to prominent figures, from Sir Tim Berners-Lee, the inventor of the World Wide Web, to members of the British royal family.
Reflecting on his journey, Ian described the passage of the OSA as a bittersweet moment. “Seven years after Molly’s death, I remain convinced that effective regulation is the best way to protect children from the harm caused by social media, and that the Online Safety Act is the only way to prevent more families from experiencing unimaginable grief,” he said.
The OSA is billed as the most extensive law of its kind in the world, backed by the threat of multi-million-pound fines and even criminal penalties for tech executives who repeatedly resist compliance. Yet many advocates believe it does not go far enough or fast enough. The law is being introduced in three phases, following months of dialogue between Ofcom, the government, campaigners, and tech companies.
While the OSA primarily targets the behavior of platforms, parts of it already apply to individuals: new criminal offenses covering cyber-flashing, spreading false information, and encouraging self-harm came into force in January.
Despite most provisions of the OSA not yet being enforceable, Silicon Valley seems to be paying attention. On September 17, Meta, the parent company of Instagram, WhatsApp, Facebook, and Messenger, announced significant changes, including the introduction of dedicated “teen accounts.” Existing accounts belonging to under-18s will be moved to the new accounts, which carry tighter restrictions and, for users under 16, greater parental controls. New sign-ups from children will be routed to these safer accounts automatically.
The reality of these teen accounts may not fully live up to Meta’s promotional messaging, but the company was not compelled to make the changes. Was the looming OSA a driving force behind the decision? Partly, yes. Yet it would be misguided to attribute every improvement in children’s online safety to the impending OSA alone. The UK is merely one player in a global movement to curtail big tech’s power.
In February, the EU’s Digital Services Act took effect, imposing transparency obligations on major firms and holding them accountable for illegal or harmful content. Federal legislation in the U.S., by contrast, appears to be at a standstill. But lawsuits seeking to hold large social media platforms accountable are mounting, led by families of children harmed by online content, school boards, and attorneys general from 42 states. These suits, grounded in consumer protection laws, argue that social media was designed to be addictive without sufficient safeguards for children, and they seek billions of dollars in damages.
Molly Russell’s story reverberates in these legal battles too; many plaintiffs and their lawyers know her name, and tech executives are increasingly aware of her impact. Even before the OSA was enacted, companies had begun stepping up their content-moderation efforts.
Nevertheless, Ian Russell emphasizes the need for ongoing vigilance and action. “While firms continue to move fast and break things, timid regulation could cost lives… We have to find a way to move faster and to be bolder.”