Alex Chadyuk
@chadyuk.bsky.social
GPT Models are Brilliant Storytellers.

In this episode of 'This Is AGI', Alex Chadyuk explores the difference between episodic and semantic memory, and explains why today’s generative AI excels at narrative recall but lacks true world models.
January 5, 2026 at 8:28 PM
CEO approving deploy of AI agent after demo
January 4, 2026 at 4:42 PM
We take objects, classes, and relationships for granted, but they are conventions, not truths carved into reality. Let's explore why these conventions matter, and why guiding AGI to adopt them may be the difference between an intelligible AI world model and one we cannot understand at all.
December 1, 2025 at 2:53 PM
If we are allowed to teach a super-intelligent being only one thing, it should be an appreciation of complexity. Then it will not try to kill us because we look different, pray to a different god, consume too much energy, or are simply not clever enough. youtu.be/RfOmArsSBHc
October 15, 2025 at 1:27 AM
As a species, we have no evolutionary siblings. The genus Homo has only one living member: Homo sapiens. The genus Panthera has five. When we are succeeded by the next most intelligent species, an embodied AGI, things may not work out too well for us.

'This Is AGI’ podcast: listen every Monday morning.
October 8, 2025 at 12:13 AM
Hallucinating LLMs are a critical step towards artificial general intelligence (AGI). We should not try to fix them but instead build more complex agents that will channel the LLMs’ runaway creativity into self-perpetuating cycles of knowledge discovery.
October 5, 2025 at 6:08 PM
Ban ice cream, because it is associated with street violence? No: hot weather is associated with both street violence and ice cream demand.

Infection during pregnancy is associated with both autism risk and Tylenol use. Same pattern. Ban ice cream!

www.nature.com/articles/mp2...
Prenatal fever and autism risk - Molecular Psychiatry
September 29, 2025 at 7:08 PM
Sam Altman can say: “You can choose to deploy 5GW of compute to cure cancer or you can choose to offer free education to everybody on Earth” as a moral justification for burning another 10GW of compute, while leaving him under no obligation to either cure cancer or provide education to anybody.
September 29, 2025 at 7:01 PM
Hallucinating LLMs are a critical step towards artificial general intelligence (AGI). We should not try to fix them but instead build more complex agents that will channel the LLMs’ runaway creativity into self-perpetuating cycles of knowledge discovery.

This Is AGI. Listen every Monday morning.
September 26, 2025 at 5:42 PM
Ban ice cream, because it is associated with street violence? No: hot weather is associated with both street violence and ice cream demand.

Infection during pregnancy is associated with both autism risk and Tylenol use. Same pattern. Ban ice cream!
September 24, 2025 at 1:47 PM
'This is AGI' explores the meaning, science, and future of artificial general intelligence. Making complex ideas accessible, we uncover how AGI will reshape technology, society, and everyday life.

Listen every Monday morning on your favourite podcast platform.

www.youtube.com/playlist?lis...
This is AGI - YouTube
Artificial General Intelligence and everything else
September 24, 2025 at 1:19 PM