The back story behind the first "$1.8B" "AI company"
chermanowicz
50 points
9 comments
April 06, 2026
Related Discussions
Found 5 related stories in 59.5ms across 3,752 title embeddings via pgvector HNSW
- A.I. Helped One Man (and His Brother) Build a $1.8B Company jbredeche · 50 pts · April 02, 2026 · 64% similar
- OpenAI Cap Table leak reveals Microsoft's 18x return diehunde · 20 pts · April 04, 2026 · 57% similar
- Revealed: UK's multibillion AI drive is built on 'phantom investments' tablets · 91 pts · March 09, 2026 · 54% similar
- Yann LeCun raises $1B to build AI that understands the physical world helloplanets · 403 pts · March 10, 2026 · 54% similar
- Yann LeCun's AI startup raises $1B in Europe's largest ever seed round ottomengis · 409 pts · March 10, 2026 · 53% similar
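The "% similar" scores above presumably come from a similarity metric over title embeddings; cosine similarity is a common choice for this kind of pgvector HNSW search, though the actual metric and embedding model used here aren't stated. A minimal sketch of how such a score could be computed, using toy 3-dimensional vectors (real title embeddings have hundreds of dimensions):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors: dot product
    # divided by the product of their Euclidean norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def similarity_percent(query_vec, candidate_vec):
    # Round to a whole percentage, like the "64% similar" labels above.
    return round(100 * cosine_similarity(query_vec, candidate_vec))

# Hypothetical toy vectors, not real title embeddings.
q = [0.9, 0.1, 0.2]
c = [0.8, 0.3, 0.1]
print(similarity_percent(q, c))
```

In production, pgvector would do this ranking inside Postgres (e.g. with the `<=>` cosine-distance operator against an HNSW index) rather than in application code; the Python above only illustrates the arithmetic behind the displayed percentages.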
Discussion Highlights (6 comments)
datadrivenangel
The NYT article says that they saw financial statements, and apparently the owner has already cleared $60-70M. The article also said they're expanding into selling ED treatments, that they had at least 10 direct contractors, and that OpenLoop is a whole platform as well.
parsimo2010
So he basically just set up a middleman that connected customers to physicians to sell a popular drug.

On the one hand, it's easy to blame them for not being more careful with customer data, not being more honest with their advertising, or not being more compliant with the law. And it's easy to point the finger at AI because they used it to code the product. On the other hand, a whole lot of non-AI companies have also had embarrassing data breaches, misleading ads, or other misconduct. Uber basically just coded an app, and the majority of their workforce are gig contractors. Theranos lied their ass off about their blood tests. I can't count how many companies have lost my personal information.

So I don't really think this is the AI's fault. This is a founder in "move fast and break things" mode, and it's their fault for not taking the rules seriously. The AI just means they aren't abusing some young coder eager for a good payday when the company exits. That young coder would never have pushed back on the founder or warned them about the FDA and potential class-action lawsuits, so the AI can't really be blamed for not doing that either. The founder has to take responsibility for the crap their company did, whether it was an AI or a human employee.
rossdavidh
So, "BS-as-a-service" allows you to make more money off BS, including deepfake before-and-after photos and falsified doctor profiles at scale. I mean, fraud is definitely one application where "hallucinations" are not so much of a problem; that is a valid point.
m_ke
So just a typical dropshipper
sharadov
Strange, but I smelled fraud as soon as I finished reading the NYT article. This is the second time in my life this has happened to me. The first time was when I interviewed at a social media startup that had raised $100 million during the pandemic and promptly collapsed; all the numbers there were cooked.
AndresCampos
more to come most likely