Corry Wang
@corrywang.bsky.social
520 followers 50 following 110 posts
Compute @ Anthropic | Formerly AI strategy @ Google and tech equity research @ Bernstein Research
corrywang.bsky.social
From 1840 to 1850, private Britons cumulatively invested 40% of British GDP into the country’s first rail network. For reference, the equivalent today would be the tech industry spending like, $10 trillion on a single thing

Anyways it’s confirmed, guess we’re all doing this again guys
corrywang.bsky.social
I don't think Americans realize that outside the US, you can now just buy Ozempic online for $150/month. This will ultimately fall to <$50/month

This actually might've ended up as the most important thing in global society in the 2020s, if it weren't for the whole, yknow, AI thing
corrywang.bsky.social
In 1978, AT&T launched the US's first modern cell service in Chicago. The nationwide launch was scheduled for the early 80s, but never happened because AT&T was broken up for antitrust violations in 1982

Predicting the future is easy. Making money is hard
corrywang.bsky.social
There's a famous anecdote about the invention of the cellphone: in 1981 McKinsey estimated it'd have a TAM of <1M people, so AT&T exited the market

Turns out this anecdote is made up. AT&T's marketing team did claim this, but the engineers just ignored them and launched anyways
corrywang.bsky.social
Like, every single part of this sentence is wrong?? Inference on a 500M parameter model requires 1 billion flops, which is not 1000 flops, which is also not 1 tflop (that's a trillion flops)

LLMs are actually fairly good at explaining how they work these days... try asking them!
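A minimal sketch of that arithmetic, assuming the common rule of thumb of roughly 2 FLOPs per parameter per generated token for a dense transformer forward pass:

```python
# Rough inference FLOPs for a dense transformer, assuming the common
# ~2 FLOPs per parameter per generated token rule of thumb.
params = 500e6                       # 500M-parameter model
flops_per_token = 2 * params         # ~1e9, i.e. 1 billion FLOPs

print(f"{flops_per_token:.0e} FLOPs per token")          # 1e+09
print(f"{flops_per_token / 1e12:.3f} TFLOPs per token")  # 0.001 -- a TFLOP is 1e12 FLOPs
```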
corrywang.bsky.social
I sometimes wonder these days what % of equity research is just written by ChatGPT. But then I see UBS publish a paragraph like this and realize I'm still getting 100% authentic human content
corrywang.bsky.social
It's quite striking that despite everything that's happened in AI over the last 3 years, the world is still spending *less* capex building semiconductor foundries today than in 2022

All of AI is still small enough to be washed away by consumers buying 10% fewer Android phones
corrywang.bsky.social
I will say that anecdotally when I was at Google, the handful of folks I met at Waymo were not particularly scaling-inclined. Hence the urgency
corrywang.bsky.social
7/ I’ve never been that impressed by Tesla FSD compared to Waymo. But if Waymo’s own paper is right, then we could be on the cusp of a “GPT-3 moment” in AV where the tables suddenly turn overnight

The best time for Waymo to act was 5 years ago. The next best time is today!
corrywang.bsky.social
6/ In contrast to Waymo, it’s clear Tesla has now internalized the bitter lesson

They threw out their legacy AV software stack a few years ago, built a 10x larger training GPU cluster than Waymo, and have 1000x more cars on the road collecting training data today
corrywang.bsky.social
5/ If the same thing is true in AV, this basically obviates the lead that Waymo has been building in the industry since the 2010s. All a competitor needs to do is buy 10x more GPUs and collect 10x more data, and they can leapfrog a decade of accumulated manual engineering effort
corrywang.bsky.social
4/ The bitter lesson in LLMs post 2019 was that finetuning tiny models on bespoke edge cases was a waste of time. GPT-3 proved that if you just train a 100x bigger model on 100x more data with 10,000x more compute, all the problems would more or less solve themselves!
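A quick sketch of why those multipliers line up, using the standard approximation that dense-transformer training compute is roughly 6 × parameters × tokens (the constant cancels out; only the ratios matter):

```python
# Why a 100x bigger model trained on 100x more data needs ~10,000x more
# compute: training FLOPs for a dense transformer are roughly C ≈ 6 * N * D,
# where N is parameter count and D is training tokens.
def train_flops(n_params, n_tokens):
    return 6 * n_params * n_tokens

n, d = 1.0, 1.0                                    # arbitrary baseline; units cancel
ratio = train_flops(100 * n, 100 * d) / train_flops(n, d)
print(ratio)                                       # 10000.0
```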
corrywang.bsky.social
3/ Waymo built its tech stack during the pre-scaling paradigm. They train a tiny model on a tiny amount of simulated and real world driving data and then finetune it to handle as many bespoke edge cases as possible

This is basically where LLMs were back in 2019
corrywang.bsky.social
2/ This paper shows autonomous driving follows the same scaling laws as the rest of ML - performance improves predictably on a log linear basis with data and compute

This is no surprise to anybody working on LLMs, but it’s VERY different from consensus at Waymo a few years ago
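For intuition, this is what "predictable" means in scaling-law terms: a power law in compute, which shows up as a straight line when both axes are log-scaled. The coefficients below are made up purely for illustration, not taken from the Waymo paper:

```python
import numpy as np

# Toy scaling curve: loss = a * compute**(-alpha). Coefficients are
# illustrative only -- the point is the straight line on log-log axes.
a, alpha = 10.0, 0.05
compute = np.logspace(18, 24, num=7)        # training FLOPs, 1e18 .. 1e24

for c in compute:
    loss = a * c ** (-alpha)
    # log(loss) = log(a) - alpha * log(c), i.e. linear in log(compute)
    print(f"{c:.0e} FLOPs -> loss {loss:.3f}")
```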
corrywang.bsky.social
1/ I don’t think people have realized how much this new Waymo scaling laws paper is basically an admission that “Waymo was wrong, Tesla was right”

Hopefully this becomes a call to action internally within Waymo
cpaxton.bsky.social
According to new research by Waymo, self-driving car neural nets perform better following power-law scaling. More data and compute = better performance. waymo.com/blog/2025/06...
corrywang.bsky.social
When I started my career, nobody cared about semis. The sector didn't grow, Moore's law was dead, and everybody just wanted to talk about SaaS stocks

How the times change
corrywang.bsky.social
Of course, the big thesis for NVIDIA in 2016 was that "virtual reality is finally going to take off"

Meanwhile, AI accelerator revenues were forecasted to hit a whole... $1B in 2018. (Today, that business is runrating at $160B annualized)
corrywang.bsky.social
When Goldman Sachs initiated on NVIDIA in 2016, there was general amazement that any semiconductor company could actually grow revenues sustainably
corrywang.bsky.social
(5/5) Anyways, only one way to find out. I’ll be working on compute at Anthropic to help scale the next generation of models

When I was at Bernstein, our NVIDIA analyst used to repeat an old Jensen quote: “This will either be great, or terrible.”

Let’s hope great
corrywang.bsky.social
(4/5) Are AI scaling laws going to be the Moore’s Law of the 21st century?

I think it’s already pretty clear that they’re the most important thing happening in tech. But I also think there’s an off-chance they’re also the most important thing happening in society… in general
corrywang.bsky.social
(2/5) I learned an incredible amount in my 5 years at Google - about tech strategy. About building effective teams. About transformer model math

But probably the most important thing I learned was the magic of a straight line on a log chart: x.com/corry_wang/s...
corrywang.bsky.social
Life update: I joined Anthropic at the start of the month!

(1/5)
corrywang.bsky.social
More than one friend has asked me in the last few months… so here’s my rule of thumb:

1 ChatGPT query costs, like, 5 H100 seconds

An H100 consumes roughly the same electricity as the average American house

So 1 ChatGPT query = turning on the lights in your house for 5 seconds
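For anyone who wants to sanity-check that rule of thumb, here's the arithmetic, assuming an H100 draws on the order of 700 W under load (its rated board power, in the same ballpark as the roughly 1 kW average draw of a US household):

```python
# Energy behind the "5 H100-seconds per query" rule of thumb.
# Assumption: an H100 draws on the order of 700 W under load.
h100_watts = 700
seconds_per_query = 5

joules = h100_watts * seconds_per_query       # 3500 J
watt_hours = joules / 3600                    # ~1 Wh
print(f"{joules} J ≈ {watt_hours:.2f} Wh per query")
```

At roughly 1 Wh per query, that works out to a few seconds of a typical household's total electricity draw, which is the order of magnitude the post is gesturing at.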
andymasley.bsky.social
I wrote a cheat sheet version of my Using ChatGPT is not bad for the environment post. It's pared down and more focused on simple responses to common objections, without a long intro or my background environmental philosophy. andymasley.substack.com/p/a-cheat-sh...
A cheat sheet for conversations about ChatGPT and the environment
Arm yourself with knowledge
andymasley.substack.com
corrywang.bsky.social
The top 2 Korean battery makers (LG and Samsung) have now burned nearly $10B of cash since mid-2022, just as CATL has flipped to $8B+ in annualized profits

Another 5 years of this, and the ex-Chinese battery industry is going to cease to exist