
A California court has opened an unprecedented legal proceeding against Instagram and YouTube that could change how social media is understood worldwide. A jury will have to decide whether these platforms were deliberately designed to hook children and teenagers, creating a pattern of compulsive use comparable to addiction.
The case revolves around the story of a 20-year-old woman, identified as KGM, who claims to have suffered a serious deterioration of her mental health after starting to use YouTube at just six years old and Instagram at eleven. What is at stake is not only her individual case, but the possibility that this trial will open the door to hundreds of similar lawsuits against Big Tech and force profound legal changes in the United States and, by extension, in other regions such as Europe.
A pioneering case: from children's television to the courtroom
The lawsuit is proceeding in the Superior Court of California for Los Angeles County and names Meta (owner of Instagram and Facebook) and Alphabet (Google's parent company and owner of YouTube) as the main defendants. According to the documents filed, the young woman began watching YouTube videos as a child and opened her own Instagram profile at age 11, before later adding other apps such as Snapchat and TikTok.
The plaintiff's lawyers argue that the configuration of these platforms is not neutral, but was allegedly designed to specifically target very young users. The lawsuit describes a network of features (autoplay videos, infinite feed scrolling, constant notifications, extreme content personalization, and algorithm-based recommendation systems) designed to keep children connected for as long as possible.
According to KGM's account, that intensive use of Instagram and YouTube eventually led to depression, anxiety, self-esteem issues, suicidal thoughts, and a struggle to regain control of her screen time. Her legal team argues that she was not simply a user with occasional difficulties, but the direct result of "deliberate design decisions" that exploited the vulnerability of minors to increase attention and, with it, advertising revenue.
The trial, which is expected to last at least six weeks, has already become a pilot test for more than a thousand cases that are progressing in parallel in different US courts, many of them driven by families, school districts and state attorneys general who accuse social media of harming the mental health of children and teenagers.
Addictive design: the heart of the accusation
In his opening statements, the young woman's lawyer, Mark Lanier, described Instagram and YouTube as "machines built to addict children's brains." According to his thesis, the platforms have consciously incorporated behavioral and neurobiological techniques similar to those used by slot machines or, in the past, by the tobacco industry.
The lawsuit details that YouTube's recommendation systems automatically play the next video before the user has time to decide whether to continue watching, while the algorithm learns precisely what type of content maximizes attention. Instagram, for its part, offers an "endless feed" of photos, videos, and stories through which the user can scroll indefinitely, in search of reactions, comments, and continuous social validation.
The plaintiff's lawyers emphasize that what is on trial is not so much the specific content published by users, which is largely protected by US legislation, as the very architecture of the apps: how they are programmed to push young people to come back again and again, undermining the common argument that one can "just stop using it."
Along those lines, experts supporting the plaintiff have drawn parallels with the major tobacco trials of the 1990s, when it was proven that the companies knew about the harmful effects of their products and nevertheless enhanced features that increased dependence. According to the plaintiffs, Instagram and YouTube followed a similar logic: the more time a minor spends online, the more data, the more ads, and the greater the profit.
If this approach succeeds, the case could circumvent the traditional shield of Section 230, the liability protections that for decades have shielded platforms from claims over content published by third parties, opening up a completely new legal scenario for the technology industry.
Jury selection and the weight of Zuckerberg's figure
One of the most striking aspects of the process has been the long and meticulous jury selection, which lasted for days and in which often conflicting opinions about social networks and emblematic figures like Mark Zuckerberg came to light.
During this phase, several prospective jurors acknowledged having a very critical view of Meta and the origins of Facebook. Some recalled episodes such as the platform's early use to rate the attractiveness of female university classmates or the Cambridge Analytica scandal, centered on the misuse of personal data. Others, however, openly declared themselves in favor of the company and stated they felt sympathy for its founder.
Meta's lawyers tried to remove candidates they considered "excessively hostile" toward the company, while the plaintiff's team did the same with those who tended to place sole responsibility on families and saw no direct connection between platform design and mental health problems.
A jury of twelve people was finally formed to hear the evidence and determine whether there was negligence in the design of the applications. Testimony from senior company executives is expected during the trial, including Mark Zuckerberg himself, Instagram head Adam Mosseri, and YouTube chief Neil Mohan.
The presence of these figures before the court could have a strong media and political impact, since it marks the first time Meta and Alphabet have defended the design of their platforms before a jury in a case of this type, beyond appearances before legislative committees or in regulatory proceedings.
Meta and Google's response: security, content, and responsibility
Meta and Google have flatly rejected the accusations and maintain that their products are not designed to cause harm, but instead increasingly include digital protection and wellbeing features designed especially for underage users.
Meta spokespeople have emphasized in public statements that the company "strongly disagrees" with the claim's description and that the evidence will demonstrate a continued commitment to the safety of young people. They point to tools such as parental controls, usage time limits, reminders to take a break, and options to filter or report problematic content.
Google, owner of YouTube, takes a similar line of defense, describing the accusations as "simply false." The company emphasizes that it has been working for years on safer experiences for children and teenagers, stressing that it has created specific versions of the platform for minors, restricted certain features, and introduced systems to better moderate content and targeted advertising.
Both companies also try to focus the discussion on the fact that, ultimately, the content is generated by users, and that US law protects platforms from direct liability for what third parties post. Their lawyers have tried, unsuccessfully so far, to prevent comparisons between their operation and that of addictive products like tobacco.
In parallel, both Meta and Alphabet emphasize that the well-being of children depends on multiple factors (family environment, socioeconomic context, prior health, and so on) and that attributing all the weight to social media oversimplifies a complex reality. At trial, the defense has sought to highlight other elements of KGM's life that could have influenced the onset of her psychological distress.
UNICEF, mental health and the debate on social media “addiction”
Beyond the strictly legal aspects, the case has reignited the debate about whether or not one can speak of “addiction” to social media in clinical terms. International organizations and social entities warn of the risks, but qualify their language.
UNICEF Spain, for example, points out that the World Health Organization does not officially recognize social media addiction as a specific disorder. Instead, it is more accurately described as "problematic use" or "excessive use," categories that allow analysis of the impact on daily life without automatically equating it to other already classified addictions.
This does not mean, experts consulted point out, that the phenomenon is harmless. People who develop this problematic use may show very intense symptoms: anxiety, physical somatization, depression and, in the most severe cases, risk of suicidal behavior. In adolescents, there is also the added pressure of body image, constant comparison with others, and exposure to hurtful comments.
The trial against Instagram and YouTube, although taking place in the United States, fits into a global trend of concern about the impact of social media on youth mental health. Academic research, institutional reports, and family testimonies have fueled the perception that the attention-grabbing business model clashes with child protection.
For advocates of stricter regulations, the KGM case illustrates how a design focused on maximizing online time and a constant flow of stimuli can interfere with healthy emotional development, especially when first contact occurs at ages as young as six or eleven.
International reaction: age limits and new laws in Europe
While the jury in Los Angeles hears arguments and expert testimony, in other parts of the world more direct steps are being taken to restrict minors' access to social media. Several countries have opted to set minimum ages and strengthen control over the platforms.
In Europe, the debate is especially intense. Spain has put forward a proposal to limit minors' use of these apps, an initiative that has even reached the Council of Ministers as part of a broader agenda on children's digital well-being. The idea is to tighten access conditions and give families more tools to monitor what their children do on the internet.
France, for its part, has already passed a law prohibiting access to social networks for children under 15 without parental consent, with the aim of bringing the new rules fully into effect in the short term. The measure is based on growing evidence about the effects of screen time and exposure to harmful content.
Outside of Europe, countries like Australia have gone even further, demanding that platforms block or delete millions of accounts identified as belonging to minors now that legislation establishes a minimum age of 16 for using these services. These restrictions have generated intense debate about privacy, freedom of expression, and the responsibility of technology companies.
In the UK, the Government is also studying stronger regulations to reduce exposure time and limit potentially harmful content for teenagers, in line with the so-called Online Safety framework, part of a broader push toward stricter regulation across Europe.
In that context, the case against Instagram and YouTube is being watched with particular interest from Europe: any change in how the United States assigns responsibility to the platforms could influence how EU regulations such as the Digital Services Act (DSA) or the data protection legislation (GDPR) are interpreted and applied.
A wave of litigation that goes beyond a single case
The Los Angeles trial is not an isolated incident, but the vanguard of a wave of legal actions against Big Tech. Thousands of lawsuits are piling up in US state and federal courts, brought by parents, students, education authorities, and attorneys general, all with one thing in common: attributing a key role to social media in the worsening of children's mental health.
In parallel to the KGM case, another important trial has begun in the state of New Mexico, in which Meta is accused of failing to adequately protect children and adolescents from sexual exploitation on its platforms and of profiting from that illicit activity. Although the specific subject matter is different, both proceedings fall within the same trend of demanding greater responsibility from the companies.
At the federal level, a judge is coordinating more than 2,000 similar lawsuits against Meta, Google, TikTok, and Snap, and is assessing the extent to which traditional liability protections remain valid when what is being questioned is not only the content, but the very design of digital services.
Some of these companies have already opted to settle before going to trial. In the case of KGM, TikTok and Snapchat reached out-of-court settlements whose contents have not been made public, which has left Meta and Alphabet as the main protagonists of the ongoing proceedings in California.
For legal experts and regulators, what happens in this first trial will serve as a barometer for upcoming litigation. If the jury finds that there was negligence or intentionally addictive design, other potential victims might be encouraged to file claims, and lawmakers would have a further argument for proposing far-reaching reforms in the regulation of large platforms.
The case pitting a young user against two of the most powerful technology corporations on the planet has thus become a turning point in the discussion about social media, childhood, and mental health. What is decided in a Los Angeles court transcends the borders of the United States and is linked to the restrictions already being studied or applied in countries such as Spain and other European partners, where the priority is increasingly to put clear limits on the design of, and access to, platforms that for years have grown with hardly any regulatory restrictions.