A Los Angeles jury on Mar. 25, 2026, awarded a total of $6 million in compensatory and punitive damages to a 20-year-old woman known as KGM. The jury found that Google and Meta were negligent in the design of their apps and failed to adequately warn users about the risks. The jury held that Meta should be responsible for 70% of the payout and Google should bear the remaining 30%. ByteDance and Snap settled with KGM before the trial began. She is one of more than 1,600 plaintiffs suing various social media platforms for mental health harms in a Master Complaint, JCCP 5255.
The Master Complaint alleges that internet platforms, specifically Meta (Instagram), Snap (Snapchat), ByteDance (TikTok), and Google (YouTube), have targeted children with social media apps purposely engineered to “attract and addict youth.” Plaintiffs allege that Meta and others have created a “mental health crisis” in children who are “uniquely susceptible to harm” from these apps. In addition, the complaint accuses platforms of concealing or downplaying the risks while maximizing engagement and advertising revenue.
KGM’s case was the first bellwether trial in California’s coordinated Social Media Cases, JCCP 5255, pending in Los Angeles Superior Court before Judge Carolyn B. Kuhl. KGM’s short-form case appears as Glenn-Mills v. Meta Platforms, Inc., et al., case no. 23SMCV03371, within the coordinated proceeding whose lead case is 22STCV21355.
KGM, a minor when the case was filed, allegedly became addicted to Meta’s Instagram and Google’s YouTube “at a young age because of their attention-grabbing design, such as the ‘infinite scroll’ that encourages users to continue looking at new posts,” according to Reuters. KGM’s trial focused on her own history, and she had to prove that app design choices caused mental health injuries.
KGM’s lawyer, Mark Lanier, likened the apps’ design features to the kind of variable-reward mechanism commonly used in casinos to foster compulsive use. According to Courthouse News Service, Lanier argued that these platforms are designed to be extremely addictive.
Notably, Lanier relied in part on internal company documents and executive testimony to argue that the platforms knew that young users were especially vulnerable, deliberately pursued youth engagement, and failed to respond adequately to the risks. One of the documents stated that “the goal is not viewership, it’s viewer addiction.” An internal strategy memo from YouTube stated that it was important to bring children in as tweens “to win big with teens.” Perhaps most damning, an internal memo from an Instagram employee reportedly said that social media platforms are “basically pushers. … We’re causing reward deficit disorder, because people are binging on Instagram so much they can’t feel the reward.”
According to Courthouse News Service, KGM told jurors that she began using YouTube at age six and Instagram at age nine and was on social media “all day long.” She reportedly became so addicted that losing access to her phone sent her into a panic. She said the apps left her anxious, depressed, and insecure about her appearance, and affected her sleep, her school performance, in-person social life, and family relationships. She also suffered from suicidal thoughts and began cutting herself at age ten to cope with depression. Design features such as augmented reality beauty filters caused her to have ongoing struggles with body dysmorphia and self-image.
Meta, on the other hand, somewhat sensibly argued that it was her turbulent home life that caused her mental health issues. According to trial testimony, none of her therapists specifically “identified social media as the cause” of her struggles. Attorneys for Meta and Google pointed instead to “other factors that drove her anxiety and depression, including a learning disability, real-life bullying, the use of other platforms like TikTok and Snapchat, and, above all, her tumultuous home life.” They noted that her father left when she was very young, leaving her in the care of a single mother whose behavior was allegedly erratic, “at times, quite loving and supportive” and at others “physically and emotionally abusive.” They also cited testimony that her older sister had attempted suicide and suffered from an eating disorder, according to reporting from the trial.
In another landmark social media lawsuit in New Mexico (State of New Mexico v. Meta Platforms, Inc.), plaintiffs won a $375-million verdict against Meta alone, with a further remedies phase still to come. The New Mexico lawsuit was a state enforcement action brought by the attorney general under state consumer-protection law. It focused more heavily on misleading safety representations, child exploitation risks, and unfair practices. The state claimed that Meta failed to address child sexual abuse material (CSAM) and sexual exploitation of children on its platforms, saying that Meta “was, and is, aware that its platforms are being used to target, groom, sexually exploit, and traffic children.”
Together, the Los Angeles and New Mexico lawsuits reflect a broader shift away from disputes over third-party content and toward claims centered on platform design, safety failures, and harm to minors.
FreedomForever.us partners with a number of organizations, such as Bark and Protect Young Eyes, to protect and prepare families for an online world. Parental involvement is imperative for children of all ages. Please consider prohibiting or limiting the use of social media and limiting time spent online on all devices, including smartphones. Your child will thank you.
Freedom Forever is an all-volunteer organization that focuses its energy and time on preserving the innocence and safety of children.
*Please subscribe here for the Freedom Forever newsfeed.
*This article was originally published in American Thinker. Image by PickPik