building @tiles.run.
community https://userandagents.com.
When Musk got pally with Trump I hoped this would result in Apple being forced to open up more by regulators.
Musk would get his way but in the process we would end up with Apple loosening their grip on iOS.
I've been trying to write this piece for years. Every time I get started I'm just overwhelmed with paralyzing visions of the FOSS commentariat accusing me of WrongThink.
But I'm tired and we urgently need to get our shit together.
Today we call them “ATproto” apps.
In the future … they’re just a better way to build a wide range of apps, where users own their data, log in anywhere, and choose different interfaces.
People consider extreme levels of daily pain *absolutely worth enduring* for the features they care about.
They give exactly zero fucks whether their need is "in scope" in your product plan.
www.project-syndicate.org/commentary/a...
- deepseek.ai/blog/deepsee...
- deepmind.google/models/gemin...
• open source
• open (& local) data
• open model choice (incl. local)
What's Surf? Watch for more.
Memory layers are definitely worth studying: they're parameter-efficient and applicable to smaller models, and the authors report promising results from a setup pairing a 1.3B model with a 1B memory pool.
jessylin.com/2025/10/20/c...
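For anyone new to the idea: the core operation in a memory layer is a sparse key-value lookup. A minimal pure-Python sketch of that lookup (illustrative only; real memory layers use learned product keys inside a transformer block, and all names here are my own):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def memory_lookup(query, keys, values, k=2):
    """Score every memory key against the query (dot product),
    keep only the top-k, and return the softmax-weighted sum of
    their values. The sparsity (k << len(keys)) is what makes
    large memory pools cheap to use at inference time."""
    scores = [sum(q * kk for q, kk in zip(query, key)) for key in keys]
    top = sorted(range(len(keys)), key=lambda i: scores[i], reverse=True)[:k]
    weights = softmax([scores[i] for i in top])
    dim = len(values[0])
    out = [0.0] * dim
    for w, i in zip(weights, top):
        for d in range(dim):
            out[d] += w * values[i][d]
    return out

# With k=1 the lookup just returns the value of the best-matching key:
# memory_lookup([1.0, 0.1], [[1, 0], [0, 1]], [[10.0, 0.0], [0.0, 10.0]], k=1)
# → [10.0, 0.0]
```

The trade-off the post alludes to: parameters in the memory pool are only touched when their key is selected, so a 1B-parameter memory adds capacity without adding much per-token compute.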