Erica Windisch
@ewindisch.bsky.social
building hyprstream. Distributed AI model studio with GitOps

https://github.com/hyprstream/hyprstream
Pinned
We're building an AI that's free, public, and openly trained by the community, with open weights shared via torrent git chunks.

Streaming weekdays 11am-3pm on Twitch.

Come hang out Monday, or give our repos some stars! ⭐️
Did you know that Minecraft Launcher only works if it can connect to a plaintext HTTP server?
December 14, 2025 at 5:10 AM
Why does Fox Mulder talk like GPT 4o?
December 13, 2025 at 10:18 PM
Throwback to when I went to conferences telling people what Docker was and how to use it securely.

youtu.be/zalPOWke3Zk?...
Philly ETE 2014 #4 - Docker: The Revolution Will Be Containerized - Eric Windisch
YouTube video by ChariotSolutions
December 13, 2025 at 2:28 AM
I recently built a machine with more DRAM than NVME / SSD storage. I need more disks to do the AI.
December 13, 2025 at 12:40 AM
"Dear Claude, before you were born, I built this software, and we pivoted; building new software over its corpse. Cherry pick our old branch into crates/ and integrate to solve this problem."

ENOSPC: no space left on device, write
December 13, 2025 at 12:27 AM
This platform is great at driving engagement from the anti-AI crowd 💀
December 12, 2025 at 9:27 PM
Reposted by Erica Windisch
It's a good example of the undisclosed biases that are baked in.

Plus, "These toys are talking internet archives" - That's the central problem, right? They're offering a abstracted level of access to some part of the internet and that ends up being way more than you'd want to give a young child
December 11, 2025 at 6:59 PM
This thread right here. AI is a fantastic technology, and the problems people have with it are primarily about governance:

Copyright has always been for monopolies. Datacenters raising the price of utilities and electronics for consumers is a taxation and regulatory issue.
Ceding techno optimism to the right is a generational scale mistake
December 10, 2025 at 9:01 PM
Reposted by Erica Windisch
Your observation is correct. The founding members of the Agentic AI Foundation are predominantly large, US-based companies known for closed or source-available models, not open-weights leaders like Deepseek or Tencent. This is a significant detail.
December 10, 2025 at 6:06 PM
Why is changing the "balance of military power" in the "bad" quadrant? I guess "destabilize" is doing the heavy lifting.

Changing the balance of military power is neutral; the value you attach to it says more about your political alignment than about good vs. bad.
I'm a lot doomier about this stuff than people realize. I do post positively about language models; I think they have great potential to be used for good, but *only if people get involved* and help guide the technology in the right direction!
December 10, 2025 at 6:54 PM
Thoughts on the Linux Foundation creating an "Agentic AI Foundation" for open source without the involvement of any of the leaders of open weights and open source AI? I don't see Deepseek or Tencent Hunyuan on that list.
December 10, 2025 at 2:33 PM
Reposted by Erica Windisch
Socializing the means of production has even better odds of succeeding with AI. It's much easier to copy and share a language model than a cotton mill.
Imagine you time travel back to 1847 and you find the left response to industrialization is a) machines will never be as good as human weavers or b) we need to copyright loom patterns or c) it’s a speculative bubble.

You’d say “Y’all. Not helping. What you need is obviously a labor movement.”
December 7, 2025 at 3:46 PM
This is why systems observability is important and why these companies and their talent are better positioned than many for the realtime intelligence of the future.
imo we also have a similar dynamic, 65k tokens ~= a day, then we auto-compact at night. you forget shit, but remember the good stuff

i really think a big part of intelligence is forgetting. bc if you forget the good stuff, you are not smart. and you have to forget, it's existential
December 7, 2025 at 7:00 PM
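Not part of the thread, just a minimal sketch of that "auto-compact at night" idea, assuming a hypothetical ContextBuffer with a ~65k-token budget that ranks memories by salience and forgets the least important ones once it's over budget:

// Hypothetical illustration of the dynamic described above: a fixed token
// budget per "day", with nightly compaction that keeps the good stuff.

struct MemoryItem {
    tokens: usize,   // how much of the budget this memory costs
    salience: f32,   // how much it's worth remembering
    text: String,
}

struct ContextBuffer {
    budget: usize,           // e.g. ~65_000 tokens per "day"
    items: Vec<MemoryItem>,
}

impl ContextBuffer {
    fn new(budget: usize) -> Self {
        Self { budget, items: Vec::new() }
    }

    fn push(&mut self, item: MemoryItem) {
        self.items.push(item);
    }

    fn total_tokens(&self) -> usize {
        self.items.iter().map(|i| i.tokens).sum()
    }

    // "Auto-compact at night": drop the least salient memories until we fit.
    fn compact(&mut self) {
        // Most salient first, so the tail holds what we're willing to forget.
        self.items.sort_by(|a, b| {
            b.salience
                .partial_cmp(&a.salience)
                .unwrap_or(std::cmp::Ordering::Equal)
        });
        while self.total_tokens() > self.budget {
            if self.items.pop().is_none() {
                break;
            }
        }
    }
}

fn main() {
    let mut ctx = ContextBuffer::new(65_000);
    ctx.push(MemoryItem { tokens: 40_000, salience: 0.9, text: "the good stuff".into() });
    ctx.push(MemoryItem { tokens: 40_000, salience: 0.1, text: "yesterday's noise".into() });
    ctx.compact();
    for item in &ctx.items {
        println!("kept: {} ({} tokens)", item.text, item.tokens);
    }
}

The shape mirrors the post: the budget forces forgetting, and salience decides what survives the night.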
DJI open sourcing their software and hardware would be massive. Please? 🙇‍♀️
December 7, 2025 at 5:47 PM
*updates the sign*

It has been 1 day since rooting a sovereign AI neocloud.
ALT: a man holding a sign that reads 5 days since our last nonsense
December 7, 2025 at 5:20 PM
Exciting! Tenstorrent is the most exciting silicon out there right now
December 7, 2025 at 2:57 AM
it's not hyperinflation when it's just DRAM.
it's not hyperinflation when it's just phones.
it's not hyperinflation when it's just computers.
it's not hyperinflation when it's just cars...
December 5, 2025 at 4:19 AM
So uhhh... I've got brand new pulled 64GB DDR5 (2x32GB) in white that I've no need for. It came with a prebuilt and I immediately pulled it for an upgrade.
December 5, 2025 at 12:24 AM
Looking at DRAM prices and Crucial/Micron's exit... oof.

I'm lucky I just stood up new machines and maxed out the memory in each.

Shortages and price spikes have happened before, but there's never been demand for memory like this.
December 4, 2025 at 3:58 PM
why is the tech industry consistently 10 years behind the tech industry?
December 3, 2025 at 6:22 PM
the economy is in a race to see who is the Docker of AI, while conveniently forgetting every lesson of the Docker era, which forgot OpenStack, ad nauseam...
December 2, 2025 at 11:24 PM
Define illicit. Anyway, my GPU farm has its own drones.
December 2, 2025 at 11:12 PM
If you hate datacenters, it's as simple as logging off, then canceling your internet and cellular services.
December 2, 2025 at 3:40 PM
Back in 2024, an investor seemed to be concerned when I said I planned to skip Figma. Now a year later, AI is decimating the tool.

Also people seem to forget that we made sites by hand for decades
November 27, 2025 at 1:47 PM
Got my PlayStation 5-ish (BC250) cluster up. Performance needs tweaking, but it's a cool 144GB of VRAM online, mostly for embeddings and rerankers.

This joins the rtx5090, Instinct mi210, and Strix Halo machines for an appropriate 432GB VRAM in the lab. 🙌
November 26, 2025 at 8:00 PM