Mass Torts 201: Big Tech on Trial—Meta & Google Accused of Designing Addiction for Children
📌 CORRECTIONS & CLARIFICATIONS:
In this podcast episode, we discussed the facts of K.G.M. v. Meta & Google, which is a California STATE court trial in Los Angeles — not a federal case. Some of our procedural commentary (the motion-to-dismiss stage, summary judgment still to come) actually applies to the separate federal multidistrict litigation in Oakland, CA, which remains in pre-trial stages. By our air date, the K.G.M. state trial was already underway with opening statements and witness testimony.
A few additional corrections:
K.G.M. began using YouTube at age 6 and Instagram around age 9 (we stated age 10)
Opening statements began the week of Feb. 9, 2026 — not Jan. 27 (that was jury selection/TikTok settlement)
40+ state attorneys general have filed suit against Meta (we stated 33)
The core of our discussion — the product liability theory, the Section 230 workaround, the internal Meta documents, the TikTok and Snapchat settlements, and the Big Tobacco parallels — remains accurate. Multiple overlapping cases are unfolding in 2026, and some details got crossed.
Video Chapters
0:00 – Welcome to Lay Man’s Law School (trial attorneys from Georgia)
0:15 – Friday cheers + “heart out” intro banter
1:18 – “Stand Up to a Bully Day” + why this topic matters
1:32 – The big question: should Big Tech be liable for harm to kids?
2:32 – The case: KGM v. Meta & Google (anonymous plaintiff)
2:50 – Allegations: addiction-by-design, depression/anxiety, body dysmorphia
3:02 – Sextortion + explicit content claims (Instagram/Snapchat)
3:39 – Internal Meta emails: “Instagram is a drug… we’re pushers”
4:34 – Legal problem: third-party content vs platform responsibility
5:18 – When addiction-by-design becomes “product” behavior
6:15 – What Meta allegedly knew: teen body image + dopamine targeting
6:47 – The Big Tobacco analogy (profits vs safety)
7:41 – Psych & neuroscience: engineered dopamine loops
8:22 – Mental health stats spike (2010–2020) + why timing matters
9:17 – “It’s free” but still monetized: ads as the business model
10:25 – Phone company analogy: is this just like prank calls?
11:16 – Why it’s different: intentional addiction mechanics
12:04 – Gaming/gambling parallels + rising depression trends
13:05 – Jonathan Haidt: 4 harms (social, sleep, attention, addiction response)
14:33 – Where the case really is: early stages, not “trial win” yet
16:00 – TikTok & Snapchat settlements vs Meta/Google fighting
16:53 – Big Tobacco parallel: small players settle, big players gamble
17:11 – Zuckerberg “FaceMash” origin story + intent argument
18:45 – How society regulates new inventions (cars/seat belts analogy)
20:25 – Personal algorithm example + why kids can’t self-regulate
21:27 – Warnings, age limits, and caregiver responsibility
23:31 – Age restriction loopholes + “we’re not fixing it” allegations
24:29 – Practical barrier: what verification actually works? (ID, biometrics?)
27:20 – Section 230 explained in plain English
29:48 – The key strategy: product liability (design), not user content
30:34 – What “motion to dismiss” means (plausibility vs proof)
31:28 – The justice/value angle: culture must protect vulnerable groups
34:05 – Takeaway for families: don’t let 10-year-olds on social media
36:31 – If you could legislate today… what would you do?
38:05 – Parallels to porn ID laws + why the law lags tech
39:36 – Hypothetical: if this were a drug, lawsuits would explode
41:41 – “Intentionality” is the whole case
43:22 – Final takeaways + reminder: summary judgment is next
45:04 – Next episode tease: birthday surprise
CONNECT WITH US
📸 Instagram: instagram.com/AugustaAttorneys
👍 Facebook: facebook.com/hawklawgroup
🎵 TikTok: tiktok.com/@augustaattorneys
💼 LinkedIn: linkedin.com/company/hawk-law-group
🌐 Website: hawklawgroup.com