Feeding the basilisk
Large Language Models are a cornucopia for the curious
I do computer stuff but that doesn't define me
posts are not financial advice
Sorry, I don't automatically follow back, but might if we have a thoughtful exchange
Last year I got distracted about halfway through, just when it was getting interesting. But those problems are fun
Seriously tempted to drop this on an anti and see if they run with it
The move is: take a phenomenon whose essential character is its extension across a dimension, collapse that dimension, then express puzzlement that the character vanished. It's not deep—it's sampling error dressed up as metaphysics.
So they keep trying to create technology to remind them the milk in the fridge is past its use-by date.
The sort of stuff mom did when they were kids that they can't manage to do for themselves.
Sam Altman: "I cannot imagine figuring out how to raise a newborn without ChatGPT."
We even have an expression for that: "more than the sum of its parts"
They see a coding problem as an idiom for which they have a response
It's not just that language models are good at code, but that they're good at helping people translate their problems *into* code.
They both create a new opportunity surface and reduce barriers to entry there.
😬