I like computers and Korean and computers-and-Korean and high school CS education.
Georgia Tech → 연세대학교 → 東京工業大学.
https://theoreticallygoodwithcomputers.com/
And some less technical stuff like #Korean, #Esperanto, and #trains (mostly in Japan, just due to proximity).
Since you are here, here are a few of my favorite posts.
This is a reminder that @cornelltech.bsky.social runs a Red Team Clinic that provides a *free* safety consultation to nonprofits / public sector orgs that are developing a public-facing AI tool and want to stress-test it for possible abuse vectors.
Applications welcome on a rolling basis:
We are looking for people to help us pioneer the next generation of AI—building from Japan to the world.
Join us: sakana.ai/careers
Find more information at juliacon.org/2026/cfp/ and submit your proposal by February 28th at pretalx.com/juliacon-202...
#julia @julialang.org
There have been discussions about it for *years*, but it still hasn't landed.
Here's a cool write-up from an undergrad who took it on as an intern project:
Merge with us at the Tokenization Discord.
I'm printing this out to put on my wall.
permalink: wizardzines.com/comics/oh-sh...
from our zine "Oh shit, git!": wizardzines.com/zines/oh-shi...
www.koreaherald.com/article/1066...
A few minutes into the ride, Han Duck-soo walked out of the VIP car to take a phone call.
Anyway, he is now about to spend 20+ years in prison.
www.yna.co.kr/view/AKR2026...
Inspire tomorrow’s developers with your open source project. Application period January 19 - February 3, 2026.
➡️ Read all the details in our GSoC Blog here: goo.gle/gsoc-2026-me...
Standard LLMs force a rigid linear structure on context, treating physical proximity as relevance. Cognitive Load Theory suggests this is inefficient—models waste capacity managing noise instead of reasoning.
arxiv.org/abs/2512.14391
*a popular Japanese rhythm arcade game; think musical electronic whack-a-mole
en.wikipedia.org/wiki/Jubeat?...
✅ Open weights (4B, 12B, 27B)
✅ 55 languages + 100s more in training data
✅ Multimodal capabilities (image + text)
Blog: blog.google/innovation-a...
Paper: arxiv.org/pdf/2601.09012
Model: huggingface.co/collections/...
Cookbook: colab.research.google.com/github/googl...
ChatGPT, Gemini, and Claude Opus got it first try and Claude Sonnet got it after very slight prodding.
Pretty impressive, as they were given very little context.