https://twitter.com/impactology
Folks who have a granular awareness of the MECHANISM of their cognitive processes in their working memory during problem-solving activities
Makes me think of Valve reinventingorganizationswiki.com/en/cases/val...
Corporate design work necessitates predictability via SOPs, deliverables, design systems, "data driven design"
That's why you have designers who talk like product managers: not because they want to, but because they have to.
If AI operates on tokens, people must operate on meanings between tokens
Post-token post-structuralism
Post-token design
Mix and connect things so far apart that no probability distribution would connect them.
Think outside the token & think between the token
en.wikipedia.org/wiki/Concept...
I think it's the standup comedian; a joke is funny because it's unexpected
Non-ergodic work: work where outcomes aren't predictable averages, where your returns depend on rare events, work that exists in anti-statistical space
Work that violates statistical axioms so fundamentally that modeling it destroys its value
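A toy way to see non-ergodicity (this is my own illustrative sketch, borrowing the classic multiplicative-gamble example, not something from the thread): a bet whose ensemble average grows every round, while the typical individual trajectory shrinks. The multipliers 1.5 and 0.6 are arbitrary assumptions chosen to make the gap obvious.

```python
import random

random.seed(0)

# Multiplicative gamble: heads multiplies wealth by 1.5, tails by 0.6.
# Ensemble average per flip: 0.5 * 1.5 + 0.5 * 0.6 = 1.05 (grows 5% a flip).
# Time-average growth per flip: (1.5 * 0.6) ** 0.5 ≈ 0.949 (shrinks).

def play(flips: int) -> float:
    wealth = 1.0
    for _ in range(flips):
        wealth *= 1.5 if random.random() < 0.5 else 0.6
    return wealth

trials = sorted(play(100) for _ in range(10_000))
ensemble_mean = sum(trials) / len(trials)
typical = trials[len(trials) // 2]  # median trajectory

print(f"ensemble mean:  {ensemble_mean:.4f}")  # pulled up by rare windfalls
print(f"typical (median) outcome: {typical:.6f}")  # close to ruin
```

The average over many players looks great because a few rare trajectories explode; almost every individual trajectory loses nearly everything. "Returns depend on rare events" is exactly this gap between the ensemble statistic and the lived sequence.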
That which has plenty of examples of what recurs and co-occurs, and that can be generalized across many examples
So a question to ask in a job that is being LLMed: what kinds of work problems, techniques, and situations hardly recur and are not generalizable?
It's always asking: what is the most coherent next token? Every prompt is statistical scaffolding
What is easy to model statistically? That which can be quantified & correlated
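The "most coherent next token" idea can be sketched with a toy bigram model (my own stand-in, assuming a tiny made-up corpus; real LLMs use learned neural distributions, but the statistical shape is the same): count what follows what, then pick the highest-probability continuation.

```python
from collections import Counter, defaultdict

# Assumed toy corpus, purely for illustration.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count next-token frequencies: a crude probability model of co-occurrence.
bigrams: defaultdict[str, Counter] = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_token(prev: str) -> tuple[str, dict[str, float]]:
    counts = bigrams[prev]
    total = sum(counts.values())
    # Probability distribution over possible next tokens.
    dist = {tok: c / total for tok, c in counts.items()}
    # Greedy pick: the statistically "most coherent" continuation.
    return max(dist, key=dist.get), dist

tok, dist = next_token("the")
print(tok, dist)  # "cat" wins because it follows "the" most often
```

Anything this model has seen recur (quantified, correlated) it predicts easily; a pairing with no co-occurrence history gets near-zero probability, which is precisely the "between the tokens" space the thread is pointing at.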