Designed to Hook: Why the UK Can’t Ignore the US Social Media Verdicts

In California, Kaley, a 20-year-old woman, alleged that she became addicted to YouTube from the age of 6 and to Instagram from the age of 9, leading to depression and suicidal thoughts. TikTok and Snapchat settled before trial. Meta and YouTube fought on and, after days of deliberation, the jury found that Meta (which owns Instagram) and Google (which owns YouTube) had intentionally built addictive social media platforms that harmed her mental health.

The jury awarded Kaley damages of $6m (£4.5m), with Meta to pay 70% and YouTube the remaining 30%. Both companies have said they intend to appeal the verdict. This is a significant moment. In US legal terms, the case is known as a “bellwether trial”: a test case used to gauge how similar claims may unfold. There are thousands more cases in the pipeline.

The verdict came a day after a jury in New Mexico found Meta liable for endangering children on its platforms and exposing them to sexually explicit material and contact with sexual predators.

On their own, these two legal rulings mark significant moments for the design of social media platforms. Together, they represent a seismic shift.

But this is just the beginning. More than 2,200 cases brought against social media platforms over their alleged harms to children have been consolidated into a federal trial in California set to begin in June 2026.

The current landscape in the UK

You may be wondering what this all means for the UK. Before looking at the direct impacts of the US rulings in the UK, it is important to establish the current state of play.

Social media has become a global issue. Australia has implemented a social media ban for under-16s, and the European Commission has made a preliminary finding that TikTok breached the Digital Services Act through its ‘addictive design’ and is now investigating Snapchat. In the UK, meanwhile, it is widely acknowledged that the current regulation of social media and other online platforms for under-16s is insufficient.

The Online Safety Act 2023 protects both children and adults online, placing a range of duties on social media companies and search services. As of July 2025, platforms have a legal duty to protect children online and are required to use highly effective age assurance to prevent children from accessing inappropriate content, such as pornography.

Critics have argued that the Online Safety Act is not broad enough. It does not cover AI chatbots, which led the Government to announce in February of this year that it would close the loophole. The Act has also been criticised for focusing too heavily on harmful content and not enough on addictive design features.

This is where policy meets politics and things get a little confusing! In January 2026, Conservative peer Lord Nash (a member of the House of Lords) proposed a social media ban for under-16s as an amendment to the Children’s Wellbeing and Schools Bill. The House of Lords voted in favour of the ban, meaning it had to go back to the House of Commons to be voted on by MPs. This is called Parliamentary ‘ping pong’, where legislation bounces back and forth between the House of Commons and the House of Lords until both agree. The House of Commons voted against the ban, but the rally continues: last week the House of Lords once again backed a social media ban, meaning the ball is back in the Commons’ court!

On 18 March 2026, the House of Lords also voted in favour of Baroness Beeban Kidron’s amendment placing further regulation on AI. Under the rules Kidron proposed, for example, it would be illegal to create a chatbot that produced certain content. This follows growing awareness of the risks of AI chatbots, although they are still often misunderstood.

The government has recently announced a national consultation, ‘Growing up in the online world’. As well as considering whether there should be a minimum age for children to access social media, the government is looking at whether to restrict ‘addictive design features’ that encourage excessive use in children, and at how age verification and age assurance technologies might support effective implementation.

Similar action in the UK? 

Over the past few years, a growing number of legal cases have assessed whether social media platforms ought to have done more to protect children. As with many areas of litigation, developments often start in the US and are later followed in the UK. Given this pattern, and the level of concern around social media harm, similar legal proceedings against social media companies in the UK are inevitable.

There are already a number of high-profile cases of harm suffered by children in the UK. Ellen Roome has been campaigning for changes to social media since the death of her 14-year-old son, Jools Sweeney, in 2022, which she believes was caused by a TikTok challenge gone wrong. Ian Russell has likewise been campaigning since the death of his 14-year-old daughter, Molly Russell, who had viewed suicide material online.

As in the US, legal cases in the UK would likely focus on the design of the social media platforms. But social media is just the beginning: similar allegations are being raised against gaming platforms and AI companions, and any legal action against “Big Tech” is likely to include those platforms as well.

It is difficult to predict what legal action in the UK could look like, and just because American juries have ruled one way does not mean that British courts will follow suit. However, the key battlegrounds for any action have begun to emerge. These include the fundamental question of whether social media platforms were designed to be addictive, but also others: whether the platforms had a duty to warn users of potential risks, and whether they took reasonable steps to ensure age verification was effective. There is also the major issue of causation: can social media be categorically proven to have caused the harm in question?

The issue of addiction

As mentioned above, a key question in any legal action would be what counts as “addiction” and “addictive design”. Many campaigners, including FlippGen, are calling for a ban on addictive design. In fact, one of the three key points in our manifesto, written for the For Us Coalition, is a call for the prohibition of addictive features on social media platforms for under-16s.

However, the term “addictive features” is often thrown around with very little explanation of what it actually covers.

So, what do we mean by “addictive design”? Ofcom’s research into ‘persuasive design’ suggests that addictive features may include:

  • Infinite scrolling: new content keeps loading automatically as you scroll down, so the page never really ends (a simple sketch of how this is typically built appears after this list).

  • Autoplay features: the next piece of content starts automatically, without requiring any action by the user.

  • Affirmation functions: functions that give users feedback, validation and encouragement, such as likes and reactions.

  • Alerts and push notifications: an alert is a message that pops up inside an app or website to get your attention right away; a push notification is a message sent to your phone or device by an app, even when you’re not using it, to update you or bring you back to the app.
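What might one of these features look like under the bonnet? As a purely illustrative example (a minimal sketch, not code taken from any real platform), infinite scrolling is commonly built with the browser’s IntersectionObserver API: an invisible “sentinel” element sits at the bottom of the feed, and whenever it scrolls into view, another batch of content is appended automatically. The element IDs and batch size below are assumptions made for this sketch.

```ts
// Illustrative sketch of the "infinite scrolling" pattern (TypeScript, in a browser).
// Assumes a page containing <main id="feed"></main> and <div id="sentinel"></div>.

const feed = document.querySelector<HTMLElement>("#feed")!;
const sentinel = document.querySelector<HTMLElement>("#sentinel")!;

// Append another batch of posts to the feed (placeholder content).
function appendBatch(count: number): void {
  for (let i = 0; i < count; i++) {
    const post = document.createElement("article");
    post.textContent = `Post ${feed.childElementCount + 1}`;
    feed.appendChild(post);
  }
}

// Watch the sentinel at the bottom of the page. Each time it becomes
// visible, more content loads automatically, so the user never reaches
// a natural stopping point: the page "never really ends".
const observer = new IntersectionObserver((entries) => {
  if (entries.some((entry) => entry.isIntersecting)) {
    appendBatch(10); // batch size is arbitrary for this sketch
  }
});

observer.observe(sentinel);

// Initial content so the page is scrollable from the start.
appendBatch(10);
```

The design point is that there is no “next page” button and no natural end: the decision to stop scrolling is quietly taken away from the user.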

Conclusion

The recent decisions in the US mark a pivotal moment for social media platforms. For the first time, courts have ruled in favour of what many people have believed for a long time: that the way social media platforms are designed poses a real risk to the mental health of young people. The conversation around these risks is now common in British politics, with Lord Nash and Baroness Kidron at the heart of efforts to regulate social media. Alongside this effort in Parliament, there is an ongoing effort by charities and civil society. FlippGen is part of this effort. We recently delivered the For Us manifesto to Downing Street, which calls for a prohibition on addictive features on social media platforms for under-16s. Whilst legal action in the UK akin to that seen in the US is inevitable, it is not clear what these cases will look like or what their outcomes will be. What is clear, however, is that the status quo is no longer defensible.

