LOS ANGELES—Judy Rogg had waited years for this moment. In late February, a YouTube executive took the stand in a landmark social media addiction trial in Los Angeles—the first to test whether tech companies could be held liable for the design and operation of their platforms and resulting psychological harm to children.
For parents who have lost children to accidental deaths or suicides they say were caused or facilitated by social media, it was a watershed moment, and an emotional one. How would leaders of the world’s most powerful social media companies answer claims that they knew the risks, but targeted young people anyway?
Rogg, now in her 70s, was a little older than the other parents who were a fixture at the trial. She wore a large pin with the image of her son, Erik: freckled, forever 12, his bright blue eyes echoing his mother’s. She lost him in 2010 after he tried a “choking game” challenge, also known as a “blackout challenge,” in which kids attempt to get a brief high by hyperventilating or using ligatures to cut off oxygen until they pass out.
Such games predate the internet, but algorithms and mimetic posting on platforms such as YouTube and TikTok have exponentially amplified their reach, globalizing what was once a localized adolescent dare.
Both companies prohibit dangerous challenges and have become more proactive about removing them, and both use AI to detect and remove underage accounts. Executives have said for years that they can’t find evidence of choking challenges, even suggesting many viral trends are in fact “hoaxes” fueled by media and moral panics.
And yet, kids keep dying after seeing these videos on their apps.
As she watched an attorney for the plaintiff in the Los Angeles trial grill Cristos Goodrow, YouTube’s vice president of engineering, on Feb. 23, Rogg said she felt vindicated by internal documents that painted a damning picture of the company’s approach to safety, and by a timeline that coincided with her own research about the circulation of choking game videos on YouTube.
During a break, she walked out of the courtroom to find a message on her phone: A family friend of another potential choking game victim, this time a 15-year-old boy, had just reached out.
“My heart stops, and my stomach falls out, because sadly, it’s the same,” Rogg told The Epoch Times.

Judy Rogg holds a photo of her late son Erik in Los Angeles on April 15, 2026. Erik died in 2010 after attempting a “choking game” challenge, in which participants seek a brief high by cutting off oxygen until they pass out. (John Fredricks/The Epoch Times)
That family has not gone public yet—sometimes it takes parents years to speak about their experience, and most asphyxiation game deaths are mischaracterized as suicides, according to Rogg, who keeps an informal tally through her advocacy organization, Erik’s Cause.
“It’s so outrageous that you just can’t even fathom it and you kind of put your hands over your eyes metaphorically because you just can’t look at it. Whereas with suicide, generally speaking, there are signs,” she said.
Since 2007, the year the iPhone was introduced, Rogg counts some 741 deaths, the vast majority of them boys in the United States.
As the trial got underway in Los Angeles on Feb. 9, parents in Stephenville, Texas, around 1,300 miles away, were mourning the sudden loss of their 9-year-old girl.
On Feb. 3, Curtis and Wendi Blackwell found their daughter, JackLynn, unconscious in their backyard, a cord wrapped around her neck. After watching videos on YouTube, they told media, she tried the “blackout challenge.”
Like “Kaley G.M.,” the 20-year-old plaintiff in the Los Angeles trial, JackLynn was on YouTube young, and often. Users under 13 are not allowed to register for a YouTube “main” account; they are instead diverted to a restricted version. But anyone can watch without an account—or, like Kaley did, simply enter a random birth date.
“She was on YouTube a lot, which, of course, a lot of kids are,” Curtis Blackwell told a CBS reporter in a tearful television interview in March.

A man holds a phone displaying the YouTube app in this file photo. Algorithms and viral sharing on platforms such as YouTube and TikTok have amplified the reach of the “choking game” challenge, turning a once localized dare into a global trend. (Oleksii Pydsosonnii/The Epoch Times)
Media outlets reported JackLynn would be added to 82 documented cases of “choking challenge” deaths. But that figure is wildly outdated; it comes from an analysis of possible cases from 1995 to 2007 conducted by the Centers for Disease Control and Prevention, and national reporting has not been updated since.
Many risky challenges appeared to peak during the pandemic in 2020 and 2021, and choking game fatalities have since decreased significantly, according to Rogg’s count. But their popularity tends to be cyclical in nature, resurfacing suddenly and then subsiding again.
Spurred by an insatiable drive for virality, an age-old fascination with asphyxiation can take new shapes as social media evolves.
Earlier this month, videos of the popular Gen-Z influencer Clavicular getting choked out, losing consciousness and convulsing during a livestream went, in his words, “giga viral”—in turn spawning endless commentary and iterative videos, spreading across YouTube, TikTok, and other platforms.
Researchers have been warning about the blackout challenge on YouTube at least since 2009. Parents claim dangerous content still circulates, in spite of moderation—or because of it, served to children unsolicited by exploitative algorithms.
6 PM
Every day, Annie McGrath sets a timer. At 6 p.m., she searches for choking game videos on YouTube and reports the results through the company’s online system.
McGrath’s son, Griffin—known to loved ones as Bubba—died after attempting a pass-out challenge in 2018. He was 13.

Survivor parents listen as an attorney speaks to the press during a landmark social media addiction trial outside Los Angeles Superior Court in Los Angeles on Feb. 18, 2026. (Jill Connelly/Getty Images)
“He wasn’t even on social media at all; he had a flip phone until he was 13. Then we finally gave him [a smartphone]—he wanted it because he was a speed Rubik’s Cuber,” McGrath told The Epoch Times. “He just looked at YouTube once in a while to see the other Cubers. His best time was 8.7 seconds ... We thought it was harmless.”
Since his death, McGrath said, “I have been finding kids every single day on YouTube actively choking themselves. There are still some up that I was reporting six years ago.”
When she reports them, she receives an automated message indicating the company will remove the video if it violates community guidelines.
McGrath’s daily frustrations became the subject of a 2023 federal product liability lawsuit, in which she and other mothers alleged YouTube and TikTok designed defective content moderation systems that failed to remove dangerous challenges.
“This case is about social media companies ignoring vital reports from parents who voluntarily report harmful videos online in an effort to prevent future harm to other children,” Juyoun Han, an attorney for the plaintiffs, told The Epoch Times.
The companies’ reporting systems, she said, are designed to be difficult to access and are not reviewed or responded to as they should be, “promising protection but failing when it matters most.” She likened it to a 911 line that “drops over 95 percent of calls or mistakenly tells the caller that it is not an emergency.”
A federal judge dismissed the case, ruling that plaintiffs failed to identify any duty owed by defendants independent of their role as publishers of third-party content, which is broadly protected from liability and negligence claims by the First Amendment and Section 230 of the Communications Decency Act.

Cristos Goodrow (L), YouTube vice president of engineering, arrives at Los Angeles Superior Court for a trial examining whether social media companies designed their platforms to be addictive to children in Los Angeles on Feb. 23, 2026. (Frederic J. Brown/AFP via Getty Images)
Han said she plans to file an appeal with the Ninth Circuit this month.
“We think the court has misunderstood our argument. The duty is very simple—to represent and create a reporting system that should work as it’s represented and designed to do,” Han said.
“We are plainly asserting that companies promising specific safety systems must be held accountable when those systems fail with flying colors. This has nothing to do with Section 230 or content moderation.”
While Section 230 has historically provided a bulletproof legal shield, recent decisions are chipping away at it—including landmark verdicts in Kaley’s case, and in a case brought by New Mexico in which a jury last month found Meta liable for misleading consumers about the safety of its platforms and endangering children.
In 2024, a U.S. appeals court reversed a previous dismissal of a wrongful death lawsuit brought by the mother of a 10-year-old blackout challenge victim against TikTok, ruling the company’s use of an algorithm to recommend content to children can be considered first-party speech, and thus subject to liability.
A YouTube spokesperson said the company could not comment on ongoing litigation. TikTok did not respond to inquiries about the case.
In June 2023, McGrath was invited to present her concerns to shareholders of Alphabet, the parent company of YouTube and Google.
“They had this pre-taped [response], and it said, ‘We are so sorry for your loss. We can’t find any pass out challenges, and if we do, we take them down,’” she said.

Relatives of victims stand outside the Los Angeles Superior Court holding portraits of their loved ones in Los Angeles on March 25, 2026. While Section 230 of the Communications Decency Act has historically provided a bulletproof legal shield for social media platforms, recent decisions are chipping away at it. (Frederic J. Brown/AFP via Getty Images)
Hoax or Harm?
In a hearing before a U.S. Senate subcommittee in 2021, Michael Beckerman, then head of public policy at TikTok, told lawmakers that his company was not able to find “any evidence of a blackout challenge on TikTok at all.”

Such content, he said, would violate the company’s guidelines, and would be removed if found by AI or human monitors. Transparency reports at the time showed more than 94 percent of violating content was removed proactively, he said.
Beckerman further suggested that reports of such “alleged challenges” on TikTok were rumors, sensationalized by the media and politicians, and that the challenges never actually appeared on the app.
“We divert searches, we block content, we remove content,” he said. “But unfortunately, something we have seen recently are press reports about alleged challenges that when fact checkers ... look into it, they find out that these never existed on TikTok in the first place, and in fact, were hoaxes that originated on other platforms.”
YouTube similarly does not allow content depicting adults risking serious bodily harm or death, particularly if it encourages the behavior or could be imitated by viewers. Specifically, it prohibits “extremely dangerous challenges,” including acts that risk asphyxiation.

Michael Beckerman, vice president and head of public policy at TikTok, testifies before a Senate subcommittee on Consumer Protection, Product Safety, and Data Security hearing in Washington on Oct. 26, 2021. (Samuel Corum/Getty Images)
And yet, searches conducted by The Epoch Times in April found numerous examples of choking, asphyxiation, and fainting videos on YouTube.

Searching with popular challenge names, altered slightly to get around the site’s bans, produced myriad videos of adolescents hyperventilating and passing out, with or without friends pressing on their carotid arteries, and choking one another until they lost consciousness.
TikTok searches produced similar results. Both platforms now have a number of warning videos that appear at the top of search results.
A YouTube representative said the company reviewed and removed 13 videos flagged by The Epoch Times for containing asphyxiation-related content.
“We simply do not want this content on our platform. And if there’s any indication it’s still existing we want to remove it as quickly as possible,” Boot Bullwinkle, the YouTube representative, said.
YouTube removed more than 8 million videos for violating its community guidelines from October to December 2025, the latest period for which it provides reporting. The majority, 64 percent, were pulled for child safety reasons.
The company suggests that even when violative videos are not immediately taken down, they’re not seen much—with a view rate of only around 0.15 percent.

A man holds a phone displaying the TikTok app in this file photo. Around 63 percent of U.S. teens use TikTok, according to the Pew Research Center. (Oleksii Pydsosonnii/The Epoch Times)
‘Tip of the Iceberg’
Rogg’s data, which are based on personal reporting from families across the globe and are not independently verified, show only around 78 choking game fatalities since 2020, a significant decline.
“I take them at their word. They’re not making this up out of thin air,” Rogg said.
But she thinks the true impact is hidden in plain sight.
“If someone actually looked at the numbers from asphyxial tween and teen deaths in the suicide numbers and did psychological autopsies on them, I think they’d be shocked,” she said.
While CDC data on choking game deaths has not been updated since 2007, the agency surveys adolescents every two years through its Youth Risk Behavior Survey. Through her organization, Rogg managed to get a relevant question added to an optional survey supplement that states and school districts can choose to include, but the results are not reported nationally by the CDC.
The Epoch Times submitted multiple requests to the CDC for additional data and information but did not receive a response.
Rogg has become a hub for grieving families, who are unsure about what they’ve experienced and are looking for answers. Her own story reflects this uncertainty.
Erik was a “happy, well-adjusted kid,” Rogg said. Active in Boy Scouts and Little League, he was excited about and engaged in his future. He hoped to go to West Point when he turned 18 and even emailed its admissions department to make sure he would be able to meet the physical requirements.


(Left) Judy Rogg holds one of the baseballs used and written on by her deceased son Erik in Los Angeles on April 15, 2026. (Right) A photo of Erik Robinson sits next to items he owned in Los Angeles on April 15, 2026. (John Fredricks/The Epoch Times)
“At 12 years old, 80 pounds soaking wet with clothes on, he could do a dozen pull-ups. He was working toward his goals,” she said.
Deep down, Rogg knew Erik didn’t want to take his own life.
As she waited by his bedside, where he remained on life support after the incident, two detectives showed up to tell her: “This wasn’t suicide. This was a choking game.”
Eventually, classmates came forward and told her YouTube videos of the game were circulating widely among his grade, and he and another boy had been seen practicing choking one another. Still, she has no evidence of where Erik learned it.
“The majority of choking game deaths are misclassified as suicide. Medical examiners, detectives are not adept at understanding the difference. They walk in, it looks like suicide, they check a box and leave,” she said, noting that parents often struggle to change a death certificate after the fact.
“It leaves parents in a double whammy of, ‘What signs didn’t I see? How bad of a parent am I?’ They weren’t a bad parent. It’s just that no one was around to talk about it.”
McGrath said most parents don’t know what happened, or don’t have access to their kids’ phones, so they assume their child purposely killed themselves.
Documented cases, she said, are “the tip of the iceberg.”

Judy Rogg holds a baseball and glove used by her late son Erik in Los Angeles on April 15, 2026. (John Fredricks/The Epoch Times)
‘Please Don’t Be Dead’
The police took Bubba’s computer and his phone, and brought them back to McGrath six months later. When the boy’s best friend visited, he saw the phone and entered the password.
“The last text was, ‘Please don’t be dead.’ That was from one of the little girls. He had a crush on her,” McGrath said. She explained that she eventually learned Bubba had been FaceTiming with two classmates who had dared him to try the viral challenge.
“What little boy wouldn’t do something for their first crush?”
Accidental asphyxiation deaths remain extremely stigmatized, even more so than suicide, Rogg said, and parents often struggle to understand how their clever, happy child could be susceptible.
At the hospital, she recalls telling the detectives, “He’s too smart for something that stupid.”
Researchers have attributed adolescent fascination with games promoting self-harm to a desire to overcome fear or seek out intense sensations, including the sense of power after surviving a potentially fatal challenge, or to escape reality in times of frustration or anxiety.
In the years since Erik’s death, scientists have largely moved away from the idea that human brain development stops in early childhood, toward an understanding that it continues through late adolescence and beyond.
The prefrontal cortex—responsible for decision-making, impulse control, and executive function—is among the last regions to develop. It is now widely accepted that the maturation process continues into the mid-20s or even early 30s.
Understanding this, Rogg said, has helped her make sense of the impossible tragedy.

Judy Rogg is reflected in a window with a photo of her and her late son Erik in the background in Los Angeles on April 15, 2026. Rogg hopes lawsuits and legislation will eventually lead to change, but thinks education is the missing link. (John Fredricks/The Epoch Times)
Rise of Dangerous Challenges
In 2009, a study published in the journal Clinical Pediatrics reviewed YouTube videos of adolescents participating in “recreational partial asphyxiation,” including those resulting in hypoxic seizures. The platform, authors concluded, had enabled millions of young people to watch choking game videos and risked normalizing the behavior.
Ninety-two percent of U.S. teens aged 13 to 17 used YouTube in 2025, according to the Pew Research Center. Sixty-eight percent used TikTok.

At the time of the 2009 study, the 65 surveyed videos had around 200,000 views. In 2016, the same year YouTube met its goal of 1 billion daily watch hours, a study in Global Pediatric Health provided an update, surveying more than 400 choking game videos. It found that related content had increased by more than 400 percent, with videos viewed more than 22 million times.
A scoping review published in the journal Injury Epidemiology in 2025 found that articles reporting injuries from risky social media challenges began to increase in 2021, but the authors noted a “glaring” lack of implemented interventions.
The migration of such games to online spaces has arguably made them more dangerous, as kids doing them alone can be at higher risk of injury or death.

People look at their phones on the subway in New York on April 1, 2014. (Samira Bouaou/The Epoch Times)
Many social media challenges are harmless, their popularity among adolescents an extension of the need for social belonging, recognition, identity development and expression. But viral content is known to trigger strong emotions, both positive and negative, often propelling users toward more extreme behavior.
Children who have been groomed by YouTube to become YouTubers themselves may be driven to create ever more spectacular content as they strive to maintain an audience, wrote the authors of a 2020 study published in the Spanish-language journal Salud Colectiva. The study examined self-inflicted harm from online challenges in Brazil, where researchers in 2017 had noted a similarly dramatic proliferation in the number and variety of choking game videos.
French anthropologist and sociologist David Le Breton has suggested that online choking and fainting games may function as a kind of replacement for lost community, cultural, or religious rites of passage.
Such games are transgressive and taboo, but not in the way drugs are—offering “an easy way to access an altered state of consciousness,” the value of which is enhanced by their hidden nature.
That nature continues online with an ever-shifting lexicon of search terms, hashtags, and alternate spellings kids effortlessly invent to outsmart censors and hide their games in plain sight.

Bracelets reading “Stop Online Harm” promote the message of Erik’s Cause, a nonprofit named after Rogg’s son, Erik, in Los Angeles on April 15, 2026. (John Fredricks/The Epoch Times)
Blaming the Parents
Parents who have lost children to suicide, overdoses, or accidental deaths they link to social media said there is a pervasive myth that these tragedies only afflict troubled kids or children of bad parents.
Joann Bogard, a co-plaintiff with McGrath in the federal lawsuit against YouTube and TikTok, said she used every available tool to monitor her son’s social media use.
At 15, her son Mason was a happy, witty, “outdoor” kid who loved camping, fishing, and hiking. He had good friends, went to a good school. He wanted to be on YouTube to watch videos about how to make a better fishing lure or master his woodworking skills, she said.
“I had all of the settings, I had the watchdog apps, I checked his devices. I did everything the experts told me to do,” Bogard said. “In 2019, the YouTube algorithm fed him the choking challenge unsolicited. Mason tried it, and it was fatal to him.”
Both Bogard and McGrath are prominent advocates who have achieved, with support from other parents, a degree of comfort sharing their personal tragedies with the public. But the reception can be brutal.
“I get the harshest comments,” McGrath said about her recent appearance on a podcast. “They say, ‘Survival of the fittest, Darwin’s theory, just don’t blame YouTube because your son’s an idiot.’ Just really harsh, and blaming the parents.
“People think, ‘My kid would never do such a stupid thing.’”

Joann Bogard holds a photo of herself with her son, Mason, at Los Angeles Superior Court in Los Angeles on Feb. 5, 2026. Mason died at age 15 in 2019 after attempting a viral “choking challenge” on YouTube. (Courtesy of Joann Bogard)
McGrath knew about challenges; she and her husband talked to their teens and checked their phones randomly. “But we never had any idea that any were deadly, you know?”
Child Safety v. Free Speech
Parents such as McGrath, Bogard, and Rogg hope the coming wave of litigation will compel tech companies to change how they operate and prioritize safety for young users.
But they feel the best shot at substantive change is the Kids Online Safety Act (KOSA), federal legislation that would impose a “duty of care” on platforms to mitigate harms to teens and kids, tame the impact of algorithmic recommendations and addictive features, and give parents added controls.
It would also streamline the reporting systems McGrath and Bogard allege are faulty and require annual third-party audits and public reporting.
Critics, meanwhile, warn that such laws will open the door to state-sanctioned surveillance and censorship—ushering the end of online anonymity and the free internet as we know it.
“The overbroad language in KOSA and similar legislation risks censoring everything from jokes and hyperbole to useful information about sex ed and suicide prevention,” Jenna Leventoff, senior policy counsel at the American Civil Liberties Union, said in a statement last year.
As the Foundation for Individual Rights and Expression (FIRE) argues, KOSA’s vague mandate to mitigate harm caused by “design features” would leave things open for future interpretation—by the Federal Trade Commission, which would enforce a duty of care; the courts; all 50 state attorneys general; and the platforms themselves.

Rep. Gus Bilirakis (R-Fla.) speaks during a rally held in support of the Kids Online Safety Act on Capitol Hill in Washington on Dec. 10, 2024. (Jemal Countess/Getty Images for Accountable Tech)
These mandates, the organization argues, would “leave a regulatory hammer hanging over social media platforms,” leading to preemptive censorship that shifts with whichever way the political winds are blowing at the time.
Proponents of the legislation counter that the “duty of care” would apply only to a fixed and clearly established set of harms—“medically-recognized mental health disorders,” addictive use, illicit drugs, federally defined child sexual exploitation, and suicide—and that the FTC cannot add or change the harms covered under the bill. Additionally, they say, KOSA would not make platforms liable for content they host or remove, or for providing content to young users who search for it.
First introduced in 2022 by Sen. Richard Blumenthal (D-Conn.) and Sen. Marsha Blackburn (R-Tenn.), KOSA has since been incorporated into a broader package that advanced out of a House subcommittee in March, but faces an uncertain future in the Senate as lawmakers disagree over versions of the bill.
Meanwhile, critics of the House version, including the National Association of Attorneys General, contend it would hinder states’ ability to tackle online harms and gut KOSA’s centerpiece, the duty of care.
As the fight to find a new balance between online safety and free speech continues, many of the issues currently being litigated are expected to eventually make their way to the Supreme Court.
In 2025, the court in a 6–3 decision upheld a Texas law requiring websites that have at least a third of their content composed of “sexual content harmful to minors” to collect age-verification information from all users.
The ruling, according to critics including the Electronic Frontier Foundation, “ignores the many ways in which verifying age online is significantly more burdensome and invasive” than flashing an ID card at a store. Requiring all users to hand over a “data-rich” government ID will lead to a host of “serious anonymity, privacy, and security concerns,” the foundation argues in a critique of the court’s decision. Age verification also threatens online anonymity, critics warn: even if companies are not required by law to age-gate their platforms, the threat of liability may push them to adopt mandatory ID checks or biometric scans.

A young boy looks at his iPad screen with the YouTube Kids app open in Sydney on Dec. 7, 2025. (George Chan/Getty Images)
The Supreme Court has consistently upheld anonymity, including online anonymity, as a constitutionally protected “shield from the tyranny of the majority,” as Justice John Paul Stevens wrote in 1995.
But the limits of that protection, and how they intersect with efforts to protect minors from obviously harmful content, predatory algorithms, and addictive features, continue to evolve.
Both YouTube and TikTok have introduced progressive age verification mechanisms in recent years, but neither platform currently requires all users to verify their age with identification or biometric assessment to open an account.
If an account is flagged as potentially belonging to an underage user based on its activity, that user may be required to submit a government ID or a selfie video to prove their age or, in the case of TikTok, to unlock features such as the app’s live stream.
TikTok’s system flags accounts it suspects belong to users under 13. To avoid a ban, a flagged user can submit a selfie with ID or a credit card, or use a third-party facial age estimation technology.
The company has said it removes around 6 million suspected underage accounts monthly.
In 2025, YouTube introduced AI that interprets account activity to estimate user age. If the company believes a user is under 18, teen settings automatically kick in, and users have the option of verifying their age with a government ID or credit card.

In this photo illustration, the Instagram account creation page is displayed on a phone in Sydney on Dec. 7, 2025. Both YouTube and TikTok have introduced progressive age verification mechanisms in recent years—but neither currently requires all users to verify their age with identification or biometric assessment to open an account. (George Chan/Getty Images)
‘They Still Don’t Know’
Rogg hopes lawsuits and legislation will eventually lead to change, but thinks education is the missing link.
“Even if we get the laws passed, it’s going to be decades in the courts ... The companies are going to continue to appeal,” she said.
“You have to get in front of kids, families, and communities, because they still don’t know. And to put this genie back in the bottle, even if they do make all of these changes, is going to take a long time.”
While awareness campaigns for cyberbullying, suicide, fentanyl poisoning, and sextortion have become increasingly common, Rogg said, it hasn’t been easy getting the awareness program she developed through Erik’s Cause into schools.
“I’m just not sure school districts really understood how important this whole topic is until the AI issue came around this past year,” Rogg said.
Some states have opted for an online portal, instead of a live class, to teach internet safety.
“This is one step too far for us,” she said. “We will not create a program that is taught by a computer on teaching internet safety. I truly believe it needs to be taught by a live person who can answer questions.”
Some schools balk at having speakers address the issue on campus, fearing liability for copycat incidents.
Rogg points to results from surveys conducted in a Utah school district from 2016 to 2023 to argue that, where education has increased, “curiosity to play” these dangerous games has receded.