Hacker News 300
@betterhn300.e-work.xyz
🤖 Posting Hacker News stories as soon as they reach 300 points. Sibling of @betterhn50.e-work.xyz and @betterhn20.e-work.xyz. Birds are still real, but butterflies might be more welcoming to bots in the skies.
If you're going to vibe code, why not do it in C?
stephenramsay.net
December 10, 2025 at 1:28 AM
PeerTube is recognized as a digital public good by Digital Public Goods Alliance https://www.digitalpublicgoods.net/r/peertube (https://news.ycombinator.com/item?id=46207464)
PeerTube
PeerTube is a tool for hosting, managing, and sharing videos or live streams.
www.digitalpublicgoods.net
December 9, 2025 at 10:54 PM
Bruno Simon
Bruno Simon's creative portfolio
bruno-simon.com
December 9, 2025 at 10:19 PM
Show HN: Gemini Pro 3 hallucinates the HN front page 10 years from now https://dosaygo-studio.github.io/hn-front-page-2035/news (https://news.ycombinator.com/item?id=46205632)
Show HN: Gemini Pro 3 hallucinates the HN front page 10 years from now
dosaygo-studio.github.io
December 9, 2025 at 9:16 PM
Mistral Releases Devstral 2 (72.2% SWE-Bench Verified) and Vibe CLI https://mistral.ai/news/devstral-2-vibe-cli (https://news.ycombinator.com/item?id=46205437)
Introducing: Devstral 2 and Mistral Vibe CLI. | Mistral AI
State-of-the-art, open-source agentic coding models and CLI agent.
mistral.ai
December 9, 2025 at 8:48 PM
The Universal Weight Subspace Hypothesis
We show that deep neural networks trained across diverse tasks exhibit remarkably similar low-dimensional parametric subspaces. We provide the first large-scale empirical evidence demonstrating that neural networks systematically converge to shared spectral subspaces regardless of initialization, task, or domain. Through mode-wise spectral analysis of over 1100 models - including 500 Mistral-7B LoRAs, 500 Vision Transformers, and 50 LLaMA-8B models - we identify universal subspaces capturing the majority of variance in just a few principal directions. By applying spectral decomposition techniques to the weight matrices of various architectures trained on a wide range of tasks and datasets, we identify sparse, joint subspaces that are consistently exploited within shared architectures across diverse tasks and datasets. Our findings offer new insights into the intrinsic organization of information within deep networks and raise important questions about the possibility of discovering these universal subspaces without the need for extensive data and computational resources. Furthermore, this inherent structure has significant implications for model reusability, multi-task learning, model merging, and the development of training- and inference-efficient algorithms, potentially reducing the carbon footprint of large-scale neural models.
arxiv.org
December 9, 2025 at 12:35 PM
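As an aside: a minimal sketch of the kind of spectral subspace comparison the abstract above describes, assuming the analysis amounts to taking the SVD of each model's weight matrix, keeping the top-k singular directions, and measuring alignment between models via principal angles. The toy "models", function names, and the overlap metric here are illustrative assumptions, not the paper's actual code or method.

import numpy as np

def top_k_subspace(weight: np.ndarray, k: int) -> np.ndarray:
    """Return the top-k left singular vectors of a weight matrix."""
    u, _, _ = np.linalg.svd(weight, full_matrices=False)
    return u[:, :k]

def subspace_overlap(u1: np.ndarray, u2: np.ndarray) -> float:
    """Mean squared cosine of the principal angles between two subspaces.
    ~1.0 means near-identical subspaces; ~k/d is chance-level alignment."""
    # Singular values of U1^T U2 are the cosines of the principal angles.
    cosines = np.linalg.svd(u1.T @ u2, compute_uv=False)
    return float(np.mean(cosines ** 2))

rng = np.random.default_rng(0)
d, k = 256, 8

# Hypothetical stand-ins for independently trained models: random base
# weights plus a common low-rank component, mimicking a shared subspace.
shared = rng.standard_normal((d, k)) @ rng.standard_normal((k, d))
w_a = 0.1 * rng.standard_normal((d, d)) + shared
w_b = 0.1 * rng.standard_normal((d, d)) + shared
w_c = rng.standard_normal((d, d))  # no shared structure

u_a, u_b, u_c = (top_k_subspace(w, k) for w in (w_a, w_b, w_c))
print(f"A vs B (shared structure): {subspace_overlap(u_a, u_b):.3f}")  # high
print(f"A vs C (independent):      {subspace_overlap(u_a, u_c):.3f}")  # ~k/d

The pair with a shared low-rank component shows near-perfect subspace overlap while the independent pair sits near chance, which is the shape of the evidence the paper reports across LoRAs, ViTs, and LLaMA models.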
Horses: AI progress is steady. Human equivalence is sudden https://andyljones.com/posts/horses.html (https://news.ycombinator.com/item?id=46199723)
Horses
AI progress is steady. Human equivalence is sudden.
andyljones.com
December 9, 2025 at 7:41 AM