Sam Harsimony
harsimony.bsky.social
I write about opportunities in science, space, and policy here: https://splittinginfinity.substack.com/
What will you do with 1,000 Wikipedias' worth of knowledge stored on your laptop? You could have thousands of recipes, textbooks on every subject, advice for every possible car repair, travel ideas for every city.

Might only need internet access when you want to gossip.
December 3, 2025 at 9:56 PM
Yeah, there's always going to be an advantage to larger models (i.e. VCD), assuming you have enough data to assign every bit in your model.

But more params mean more cost! And in a lot of applications, I predict people will choose cheap/local/fast AI.
December 3, 2025 at 12:46 AM
Other notes:

The params needed to achieve a given performance level halve every ~4 months:
arxiv.org/abs/2412.04315

So we'll run modern frontier models on our laptops in 2029?? With way lower latency.

Running a 70B-param model today would take two 3090s. Costs about $2K. So 700B -> $20K?
Densing Law of LLMs
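A quick back-of-envelope for the extrapolation above (the halving time and model sizes are the post's rough numbers, not figures checked against the paper):

```python
import math

# If the parameter count needed for a fixed performance level halves
# every ~4 months, how long until a 700B-class frontier model shrinks
# to a 70B-class budget (roughly what two 3090s can serve)?
frontier_params = 700e9   # assumed frontier model size
laptop_params = 70e9      # assumed laptop-servable size
halving_months = 4        # ~4-month halving time quoted above

halvings = math.log2(frontier_params / laptop_params)  # log2(10) ≈ 3.32
months = halvings * halving_months                     # ≈ 13.3 months

print(f"{halvings:.2f} halvings -> ~{months:.0f} months")
```

So at this rate, today's frontier capability reaches consumer hardware in roughly a year, and the multi-year gap to laptops in the post follows from starting even larger.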
December 2, 2025 at 6:25 PM
Super interesting. Has anyone tried to use MHD drives on air that's been turned into a plasma?
December 2, 2025 at 5:26 PM
That's a good habit. I think it's virtuous to work towards a more walled-off and less divisive internet.

Good-faith replies are more virtuous in some sense, but not sustainable for me.

Angry replies are a step backwards, and I wish people would avoid them.
November 26, 2025 at 3:51 PM
A counterpoint to this: Google (and others?) haven't given up on parameter scaling yet

bsky.app/profile/hars...
Given Tim's 10T estimate and this guess of 7.5T, I may have been wrong that Gemini is <3T params.

The fact that everyone serves in 4-bit quant now really muddles things. Perhaps model size should be measured in memory rather than params.
oh, here he estimates 7.5T

x.com/scaling01/st...
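A minimal sketch of the memory framing, counting weights-only memory and using the sizes mentioned in this thread (the 7.5T figure is someone's estimate, not a confirmed size):

```python
def model_memory_gb(params: float, bits_per_param: int) -> float:
    """Memory for the weights alone, ignoring KV cache and activations."""
    return params * bits_per_param / 8 / 1e9

# 70B at 4-bit -> 35 GB, which is why it fits on two 24 GB 3090s
print(model_memory_gb(70e9, 4))

# The same model at 16-bit is 4x the footprint, so "params" alone
# is ambiguous: a 7.5T model at 4-bit serves in the same memory as
# a ~1.9T model at 16-bit.
print(model_memory_gb(7.5e12, 4))
print(model_memory_gb(1.875e12, 16))
```

Measuring in bytes rather than params makes quantized and unquantized models directly comparable.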
November 26, 2025 at 3:46 PM
This is why I subscribe to the anti-AI blocklist. It blocks the people who block Ted.

bsky.app/profile/segy...
Ted was chosen because he is relatively non-combative and extremely, extremely widely blocked. I can assume that if your list blocks Ted for his AI opinions, it's just pure hater stuff.
November 26, 2025 at 3:10 PM
I'd add that since urbanization and population-weighted density have increased, the typical home today is closer to a large city and enjoys its amenities. Hence the higher price.
November 26, 2025 at 4:35 AM
I guess we'll have to agree to disagree here. Re-reading, it seems framed by the appropriate amount of context:

"Probably some of these won’t replicate, and in a few years we’ll be left with a thinner and more believable profile of GLP-1 effects."
November 25, 2025 at 6:58 AM
I think it's fine to write posts like this!

I'd rather people publish Bold Theories than only utter what can be supported with a dozen footnotes.
November 24, 2025 at 11:33 PM