Gen Z is having its Lucky Strike moment: living amid a social behavior so widely accepted that we hardly notice its near-total ubiquity, carrying on a seemingly innocuous yet all-encompassing habit without a second thought.
Today’s addictive product isn’t rolled in paper and sold at gas stations. It lives in our back pockets. And this week, for the first time, a jury suggested it might be just as dangerous — and just as liable.
Cigarettes were everywhere in America in the 1950s through the 1980s, and they were cool. You saw them in every movie and TV show and on every billboard. More than 40 percent of adults smoked. It was normalized, glamorous, and quietly killing people.
Then came the lawsuits. States sued Big Tobacco for hiding what they knew about addiction. Internal documents proved companies had engineered dependence and targeted children. The 1998 Master Settlement Agreement forced the four biggest tobacco companies to pay states more than $206 billion, ban cartoon ads, and fund anti-smoking campaigns. The once romanticized and ubiquitous habit fell out of favor: Between 1998 and 2018, overall cigarette consumption fell more than 50 percent, and youth smoking dropped from 36 to 6 percent.
Fast forward to the present: a now-20-year-old woman named Kaley — identified in court papers as K.G.M. — filed a lawsuit against the corporate behemoths of our time, Meta and Google, when she was 17. She began using YouTube at age 6 and Instagram at age 9, sometimes scrolling for 16 hours a day. She argues that the apps' design, with infinite feeds, Pavlovian notifications, and calculated recommendation algorithms, kept her compulsively scrolling, causing serious depression, anxiety, body image issues, and suicidal thoughts.
Her lawyers’ arguments echoed those against the big cigarette companies decades earlier. They argued product liability: the apps are defectively designed, the companies knew it, and they launched them anyway. The jury agreed, finding both Meta and Google negligent in how they designed their platforms and in failing to warn users. Kaley was awarded $6 million. One day earlier, a separate New Mexico jury hit Meta with a $375 million verdict for similar conduct.
These are the first jury verdicts to hold that social media’s business model itself can constitute a defective product. This is a watershed moment — not just for Kaley, but for the 2,400 similar lawsuits waiting in federal court and hundreds more in California. And New Mexico’s case has a second phase in May that could compel Meta to actually change its apps nationwide: stronger age verification, safer algorithms, meaningful oversight.
Meta knew what it was doing. Its own internal 2020 “Project Mercury” study found that deactivating Instagram reduced depression and anxiety in users. The company buried the findings and shut down the study.
But parents should resist the urge to treat this verdict as someone else finishing the job for them.
Fifty-one percent of parents say they regularly check their children's social media accounts; many don't know which apps their kids are using and abusing at all. Meanwhile, 60 percent of young people self-report feeling addicted to social media, 45 percent say they spend "too much time" on it, and 48 percent of teens say it has a mostly negative effect on people their age. These kids know something is wrong. They cannot fix it alone — the apps are engineered to make quitting feel impossible — and a courtroom in California cannot reach into a bedroom at midnight and take the phone off the nightstand. Parents have to do that.
The tobacco lawsuits hold more lessons worth heeding. The 1998 settlement was historic on paper, but it only changed behavior because prices rose, culture shifted, and parents got involved. States have spent most of the settlement money on things other than prevention. Tobacco companies pivoted overseas and kept selling. A legal victory is only as durable as what follows it.
Congress isn’t helping. The Kids Online Safety Act passed the Senate in 2024, stalled again in 2025, and remains stuck in the House. COPPA 2.0 passed the Senate unanimously — twice — and died both times. This is the fourth major legislative attempt since 2022. At some point, passing a law that bans social media for those under 19 has to be easier than explaining to constituents why you didn’t.
There is also a real tension at play: the same companies now being sued are also building AI tools with genuine potential — in medicine, education, and mental health. Heavy-handed regulation designed to punish Meta could inadvertently slow innovation that actually helps kids. Any serious legislative solution has to thread that needle: protect children online without strangling the technology that may one day serve them.
These verdicts are a genuine first step. The jury got it right: these products were defectively designed, and the companies that built them knew the harm they were causing. But the work doesn’t end in a courtroom. It continues at home, in the living room, when a parent decides to actually look at what their kid is doing online. It continues in Congress when lawmakers stop treating children’s mental health as a bargaining chip. The jury started something this week. Now everyone else has to finish it — we can all start by touching some grass.
