Landmark Ruling Holds Meta and YouTube Liable for Child Mental Health Harms
A California jury has ruled that Meta and YouTube must pay damages for mental health harms caused to children by social media addiction. This landmark decision marks the first case of its kind focused on the addictive nature of these platforms. Since the verdict, over 40 lawsuits from state attorneys general have been filed against Meta, with hundreds of individual claims also emerging.
The ruling has reignited discussions about how families can better manage children's social media use. Experts and doctors now hope the case will push the tech industry toward stricter regulations to protect young users. The American Academy of Pediatrics advises against social media use for children under 13, and many experts argue that waiting even longer, until 8th grade or beyond, can be more beneficial. Some parents have taken action by joining the Wait Until 8th movement, pledging to delay giving their children smartphones until at least the end of 8th grade.
To help families set boundaries, the academy provides a free Family Media Plan tool. This resource allows parents to create personalised guidelines for screen time and online activity. Experts also recommend open-ended conversations with children about their digital experiences, ensuring their voices are heard in any changes made.
Setting practical limits, like timers or 'phone-free zones', can encourage more intentional use of social media. Parents are urged to model healthy habits themselves, as children often mirror adult behaviour. The goal is to involve kids in the process, recognising that social media affects everyone in the household.
The recent legal action has put pressure on platforms like Instagram and YouTube to address addictive design features, and with dozens of new lawsuits emerging, the case could prompt broader industry reforms aimed at reducing harm to young users. Families, meanwhile, now have more tools and guidance to manage screen time, from structured media plans to delayed smartphone access. As legal challenges mount, the tech industry may face growing demands to redesign its platforms for safer use by minors.