Fear and denial in Silicon Valley over social media addiction trial
107 points
167 comments
March 28, 2026
Related Discussions
Found 5 related stories in 53.1ms across 3,471 title embeddings via pgvector HNSW
- What next for big tech after landmark social media addiction verdict? ColinWright · 13 pts · March 26, 2026 · 65% similar
- Jury signals tech titans on hook for social media addiction Brajeshwar · 31 pts · March 21, 2026 · 63% similar
- Meta and Google found liable in social media addiction trial ColinWright · 97 pts · March 25, 2026 · 62% similar
- Meta and YouTube Found Negligent in Social-Media Addiction Trial 1vuio0pswjnm7 · 67 pts · March 26, 2026 · 61% similar
- Meta and YouTube found negligent in landmark social media addiction case mrjaeger · 437 pts · March 25, 2026 · 56% similar
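For readers curious how the "% similar" figures above are typically derived: pgvector's cosine-distance operator (`<=>`) over an HNSW index returns `1 - cosine_similarity`, so similarity is just cosine similarity of the title embeddings. Below is a minimal brute-force sketch of that scoring — toy vectors and a hypothetical `top_k` helper, not the digest's actual pipeline (the real HNSW search is approximate, not exhaustive).

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query, title_embeddings, k=5):
    """Rank candidate title embeddings by similarity to the query.
    A brute-force stand-in for pgvector's approximate HNSW search."""
    scored = [(cosine_similarity(query, vec), title)
              for title, vec in title_embeddings.items()]
    scored.sort(reverse=True)  # highest similarity first
    return scored[:k]
```

A score of 0.65 from `cosine_similarity` would surface as the "65% similar" label shown in the list.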
Discussion Highlights (20 comments)
slopinthebag
Good, hahaha. The ethically devoid people who have no problem engineering platforms to maximise addictiveness at the cost of immense societal harm should be scared. Doubly so the execs who push for it.
ViktorRay
"It is difficult to get a man to understand something, when his salary depends on his not understanding it."
mmaunder
We all know they're addictive, they're designed to be addictive, and they're very, very harmful, to both adults and children. The individuals who are profiting from the harm are clearly identifiable. And that harm directly targets children. That this is allowed to continue is a symptom of a sick society.
operatingthetan
> The verdict has forced those inside the companies to grapple with the fact that many outsiders do not view them as favourably as they have come to view themselves.

I'm not sure this rings true to me. Meta has to know that millennials and younger are giving up on their platforms; they have endless internal data showing it, right? If anything, they are just afraid of endless litigation while they are struggling to gain an AI foothold.
PearlRiver
A lot of people make their job their identity instead of something to pay off the mortgage with, which in turn creates a lot of denial about their actions.
iugtmkbdfil834
Dunno how I feel about it. On the one hand, clearly something has to be done, because it has all been steadily going downhill for a while now. And heaven knows, courts may be one of the very few things big corps actually fear. Still, there is a part of me that questions to what extent we are to blame. Yes, I know corps do what they can to keep us engaged; I read HN too. I didn't say it was a big part.
ZunarJ5
Crocodile tears.
lifestyleguru
For years, "addictive" had been a positive and desired adjective in descriptions of projects, jobs, and services. So it appears... they really are... addictive.
artyom
> "We remain confident in our record of protecting teens online," a Meta rep said on Wednesday.

I mean, if that's where your confidence comes from...
amazingamazing
This site is also guilty. Why can't you hide your karma at the top, or read all comments without the unreadable colors they give downvoted comments? Forcing you to play stupid games. Unsurprising, since this site is from the same Silicon Valley. People will give excuses for this. Guess what: Meta and Google have their own too.
webdoodle
Not in that order: first denial, because, like the nicotine industry, they KNEW IT WAS ADDICTIVE but got everyone hooked anyway. The fear is only because it might (but probably won't) get regulated heavily. They are predators, and the only way to fix this is to give them hard, long jail time. Fines won't do shit.
next_xibalba
I am convinced that social media is addictive for some, and likely a negative influence for many. But this is just shoddy journalism:

> The verdict has forced those inside the companies to grapple with the fact that many outsiders do not view them as favourably as they have come to view themselves.

They quote one unnamed insider for this characterization. I recall from my stats 101 class that n=1 is not a strong basis from which to make broad claims about a population of tens of thousands.
techblueberry
Meta has made it abundantly clear through their words and actions that they dgaf what happens to anyone as long as it doesn't get in the way of their profits, so I say throw the book(s) at them. Repeatedly. Indefinitely.
taurath
I hope they're gone, and all their money with them. Feeds without options should be illegal. Not every interaction needs to be your self-control vs. 30 years of professional marketing psychology doing A/B tests. It's not a fair fight. Pokémon cards are the same too.
jimmyjazz14
I have no love for social media, but I also really don't like the idea of the government regulating how apps are designed, or trying to circumvent online privacy to "protect children," which is where I see this whole thing going. On another note, I'm personally not sure I buy the "addictive" argument with social media. Maybe it's just me, but I find social media pretty boring. For a lot of younger people, though, I think it fills a need for meaning and connection to the world that has been diminished by a loss of community in our society (which does predate social media).
lern_too_spel
Good. Zuckerberg fought common-sense regulation, and now people are suing for what he did without those regulations. Let the chickens come home to roost.
martythemaniak
I propose a Neotemperance movement. The original Progressives of the late 19th and early 20th centuries were not just against alcohol but all sorts of social ills, including gambling. The Neotemperance movement would be anti-engineered-addiction, anti-gambling, anti-misinformation, anti-ads, and anti-corruption.
ktimespi
The fact that I couldn't turn off Shorts recommendations on YouTube is just so, so annoying. It's such a time sink, and I'm glad the tides are finally shifting against addictive algorithms like these.
czhu12
What would be an actually good-faith way of regulating this, short of banning it for children (which I'd think is fine)? How do you define what is too addictive? At any given time, it seems like whatever is defined as the most addictive is just whatever has the most market share. For me personally, I think the most addictive is actually Hacker News (god bless you all).
jmyeet
I believe there's a Chicxulub-level meteor headed for social media, and it's not addiction. It's liability. We, as a society, don't really care about addiction. That's reflected in our government: gambling, nicotine, alcohol, drugs, etc. Remember, with tobacco it was the harm, not the addiction, that was their undoing.

Core to all of this is what's colloquially become known as The Algorithm. Google in particular has successfully propagandized this idea that The Algorithm is a neutral black box over which we have no influence (for search). But every feature and behavior of any kind of recommendation, ranking, or news feed algorithm is the result of a human intentionally or negligently creating that behavior.

One thing most of us here should be aware of is that the way to get more distribution for a post or a video or whatever is through engagement: likes, comments, shares, reposts, quotes, and so on. All these companies measure those and optimize for engagement. That sounds neutral and possibly harmless, but it's not, and I think it's foreseeably not harmless, and no doubt there's evidence along the way to demonstrate that harm. We've seen this with some very harmful ideas that get a lot of traction online: conspiracy theories, antivaxxer nonsense, doxxing queer people, swatting, the manosphere, and of course eating disorders. ED content has a long history on the Internet; you'll find pro-ana or "thinspiration" sites and forums going back to the 1990s.

So I think social media sites are going to have three huge problems going forward:
1. They knowingly had minors (and children under 13, which matters for COPPA) on their platforms, and they profited from that by knowingly or negligently selling those audiences to advertisers;
2. They knew they had harmful content on their platforms but hid behind Section 230 in particular, claiming to be simply the host for third-party content. I believe that shield is going to fail; and
3. They knowingly or negligently pushed that content to children to increase overall engagement.

One clue to all this is that you see Mark Zuckerberg wanting to push age verification into the OS. Isn't that weird? The one company that doesn't have an OS thinks the OS should handle that, or, more specifically, should be liable for age verification? That's so strange. In an era where we have LLMs (and the systems that came before) that can analyze posted content (including video) and derive features about that content, you don't get to plead ignorance or even user preference. These companies will be held liable for the harm caused by content they distribute.