Information Sciences and English, UIUC. Distant Horizons (Chicago, 2019). tedunderwood.com
- avatar colors that show whether we're mutuals
- exportable bookmarks with custom folders
- feed of trending papers and articles
- safety alerts when a post goes viral
- researcher profiles with topics, affiliations, featured papers
While building The ATOM Project and other tools to measure the open ecosystem at Interconnects, we are often frustrated with using downloads as a primary metric.
“I want to trap hungry 19c ghosts in jars to help us with historical research” ✅
“Please read our holiday card; we got a hungry ghost to write it this year” ❌
"My girlfriend is a hungry ghost I trapped in a jar"? No. Deranged.
I really hope the open source community can keep pace.
Submit a 2–4 page paper to the CHI workshop I am co-organising! (deadline Feb 12) “Science and Technology for Augmenting Reading”
chi-star-workshop.github.io
We found embeddings like RoPE aid training but bottleneck long-sequence generalization. Our solution’s simple: treat them as a temporary training scaffold, not a permanent necessity.
We found that if you simply delete them after pretraining and recalibrate for <1% of the original budget, you unlock massive context windows. Smarter, not harder.
arxiv.org/abs/2512.12167
pub.sakana.ai/DroPE
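The idea in the thread, removing rotary position embeddings after pretraining so attention becomes position-free, can be illustrated with a minimal sketch. This is not the paper's implementation; `rope` and `attention` here are toy NumPy functions using the standard rotary formulas, and "dropping" RoPE is simply skipping the rotation (NoPE):

```python
import numpy as np

def rope(x, base=10000.0):
    """Apply rotary position embedding to x of shape (seq, dim)."""
    seq, dim = x.shape
    half = dim // 2
    freqs = base ** (-np.arange(half) / half)          # per-pair frequencies
    angles = np.arange(seq)[:, None] * freqs[None, :]  # (seq, half) rotation angles
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    # Rotate each (x1_i, x2_i) pair by its position-dependent angle.
    return np.concatenate([x1 * cos - x2 * sin, x1 * sin + x2 * cos], axis=-1)

def attention(q, k, v, use_rope=True):
    """Scaled dot-product attention; use_rope=False mimics dropping RoPE."""
    if use_rope:
        q, k = rope(q), rope(k)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)
    return weights @ v
```

Because the rotation only mixes phase, removing it leaves the pretrained query/key magnitudes intact, which is intuitively why a short recalibration (rather than full retraining) can suffice.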
They organized the theme: yellow, with streamers, driving across town & up to the Lowe’s parking lot
Whenever the model solves a big problem I want to stop working and go tell someone (human) "look how good this turned out!" With a human collaborator, we'd just tell each other.
i don’t mean psychotic episodes it triggers in some or the delusions etc.
i feel like i get some kind of… hangover of uncanny. like i ate plastic
The products in 2+ years will feel approx instantaneous relative to today.
Huge congrats to SHANY DROR for her effort and this incredible achievement.
📄 www.science.org/doi/10.1126/...
Spread it around: