Instagram Addiction Doesn’t Exist, According to App’s Top Executive
Instagram CEO Adam Mosseri leaves the Los Angeles County Superior Court after testifying in a social media trial on Feb. 9, 2026. (Apu Gomes/AFP via Getty Images)
By Beige Luciano-Adams
2/12/2026 | Updated: 2/12/2026

LOS ANGELES—When asked whether he believes there is such a thing as being addicted to Instagram, the social media platform’s top executive, Adam Mosseri—testifying as an adverse witness for the prosecution on Day 3 of a landmark jury trial in Los Angeles Superior Court—said simply that he does not.

“I think it’s important to differentiate between clinical addiction and ‘problematic use,’” Mosseri said Wednesday, defining the latter term as someone “spending more time than they feel good about” on the app. “I don’t think it’s the same as clinical addiction.”

That contrasts with testimony given Tuesday by one of the country’s top experts on substance and behavioral addiction, who described a vicious cycle in which vulnerable adolescents are intentionally targeted and exploited for profit by design features that prey on their developing brains.

Plaintiffs allege that Instagram, YouTube, Snapchat, and TikTok, along with their parent companies, are addicting kids and fueling a national mental health crisis in the process.

The case is one of three bellwether trials that will presage how thousands of related lawsuits brought by children, parents, school districts, and state attorneys general are argued and tried—and what kind of damages might be expected.

Like the Big Tobacco and national opioid settlements before them, the cases mark a generational turning point and could have profound, long-term policy implications.

Teens as Test Subjects?


Mark Lanier, attorney for the plaintiff, a 20-year-old California woman referred to in court documents by her initials, “K.G.M.” or “Kelsey G.M.,” spent the better part of Wednesday poking holes in the tech giant’s claims that it is proactively protecting users’ well-being, rather than pushing addictive features on vulnerable populations.

The plaintiff’s attorneys say that K.G.M. was a minor when she allegedly became addicted to social media platforms, which she claims led to damaging mental health impacts, including depression, body dysmorphia, and suicidal ideation.

Lanier prodded points of conflict, found in media appearances, leaked documents, and internal communications, where Mosseri appeared to contradict himself on social media addiction, known harms, and design features.

Lanier focused on Instagram’s “filters”—digital tools that enhance or augment photographs online—as an example of design features that may negatively impact users, highlighting internal documents showing the decision-making process behind a series of policy measures to limit their use.

The use of Instagram and Snapchat filters has been linked, in a new but growing body of research, to dysmorphic disorders, especially among female users.

“Would a reasonable company use millions of teens as test subjects for an untested product, or would it study the risk first?” Lanier asked, delineating a choice between two options: Profit now, test later—or test first and protect minors.

Mosseri took issue with the premise, saying he believed prioritizing safety is a long-term benefit—good for business, image, and revenue alike.

“I’m constantly trying to make sure [the team] is thinking about the long term,” Mosseri said. “We know if people have a bad experience on the platform, they’re going to leave.”

Attorney Mark Lanier arrives at the Los Angeles Superior Court before testifying on Feb. 11, 2026. (Ethan Swope/Getty Images)



Beauty Filters


Following an internal debate in 2019 over whether to ban third-party beauty filters with effects that “can’t be replicated with makeup,” a ban one executive worried would limit the company’s ability to compete in Asian markets, Mosseri favored allowing the filters but stopped short of promoting them to users.

In an email to Meta CEO Mark Zuckerberg, another top executive registered her grave concern over the decision.

“I want to just say for the record that I don’t think it’s the right call given the risks. As a parent of two teenage girls, one who has been hospitalized twice, in part for body dysmorphia,” the executive wrote, “I can tell you the pressure on them and their peers coming thru social media is intense with regard to body image.”

She further noted that hard data to prove causal harm likely won’t be available for many years, if ever. In the meantime, she said, “I was hoping we can maintain a moderately protective stance here.”

The company ultimately opted to allow filters that can’t be replicated with makeup and ban filters that promote or celebrate cosmetic surgery.

But while Lanier suggested the decision was motivated by profit at the expense of users’ well-being—and specifically the estimated $45 million to $50 million Mosseri has made from the company in the past few years—the tech executive said, “I was never worried about any of these things affecting our stock price one way or the other.”

Under cross-examination from Phyllis Jones, an attorney for Meta, Mosseri said that teens may use filters more than other demographics, but don’t represent a big money-maker for the company, given their relatively low spending power.

“Given how few people use filters, I haven’t seen any data suggesting filters drives consumption or ads, I haven’t seen anything indicating it will drive revenue,” Mosseri said.

The Instagram app on a smartphone, in this illustration taken on July 13, 2021. (Dado Ruvic/Illustration/Reuters)



Protecting Users


Jones also guided him through a range of safety features Instagram has introduced over the past decade to protect users, including parental supervision controls such as daily limits and blackout periods, quiet mode, topic management, and more.

“The Instagram Miss K.G. signed up for was very different,” Mosseri said, noting it carried fewer risks.

“I think the world is changing increasingly quickly, and Instagram is going to have to change along with it if we want to stay relevant. That’s something I’m proud of, that we’re innovating and improving to give teens positive experiences on the platform.”

According to her lawyers, K.G.M. at one point was using the platform for more than 16 hours in a single day, and had reported more than 300 people for cyberbullying.

Continued use of the app after such negative experiences, Lanier suggested, spoke to the compulsion that has been insidiously cultivated by “sticky” design features that keep kids enthralled in a vicious cycle of addiction and mental illness.

Lanier also drew attention to a 2021 article by Mosseri about efforts to protect teens and support parents online, in which he cited a study estimating that 210 million people suffer from social media addiction, a phenomenon that had steadily increased over the preceding decade.

“The paper you cite says social media is intentionally engineered to be addictive and exploit vulnerabilities in human psychologies,” Lanier said.

Despite obvious discrepancies in Mosseri’s public statements, the attorney’s “gotcha” moment never quite landed.

When asked whether he agreed with the claim that platforms are intentionally designed to be addictive, Mosseri, in a measured tone that remained unchanged throughout hours on the stand, simply said, “No.”

Beige Luciano-Adams is an investigative reporter covering Los Angeles and statewide issues in California. She has covered politics, arts, culture, and social issues for a variety of outlets, including LA Weekly and MediaNews Group publications. Reach her at beige.luciano@epochtimesca.com and follow her on X: https://twitter.com/LucianoBeige
