david barsky
davidbarsky.com
@davidbarsky.com
i like cooking and reading books. my day job is working on ersc.io; before that, it was rust-analyzer. he/they is fine.
holy moly
😳😳😳
February 2, 2026 at 12:29 AM
i am willing to become a pickup truck guy, but only for this car
January 30, 2026 at 7:25 PM
my dumb ass cat has had a mass in her eardrum that ate through her skull, but she had no symptoms, and we had no idea it was there until dentistry discovered it by accident???
January 23, 2026 at 3:07 AM
Reposted by david barsky
We are excited to announce that we can successfully use Rust's standard library from the GPU. This has never been done before.

www.vectorware.com/blog/rust-st...

Supporting Rust's standard library enables existing Rust code to work on the GPU and makes GPU programming feel normal.
Rust's standard library on the GPU
GPU code can now use Rust's standard library. We share the implementation approach and what this unlocks for GPU programming.
www.vectorware.com
January 20, 2026 at 3:39 PM
i also think that inline completions à la “human writes the function signature, llm writes the body as a flavor of stochastic term search” are an evolutionary dead end. i want either fully agentic editing supported by LSP or next edit predictions alongside LSP
for intellectual honesty’s sake, i found llms to be underwhelming for my work until mid-2025 (i called their output “slop”). i tried out the latest versions of gemini and claude at the time, which still didn’t generate valid rust code but were pleasantly human-refinable
It happened again!
January 9, 2026 at 8:11 PM
Reposted by david barsky
oh, and we got a fun new logo!
We're actively looking to grow our team at @ersc.io again! If you are interested or know someone who might be, I'd love to chat.

We are looking for distributed storage folk as well as frontend/UX.
January 8, 2026 at 10:28 PM
thanks meta ai!
December 25, 2025 at 4:42 PM
i'm not the biggest fan of copilot, but i respect the fact that one of their internal models is called "metis". they've got some james c. scott sickos over in devdiv
December 25, 2025 at 12:05 AM
the isaac chotiner of distributed systems jepsen.io/analyses/nat...
Jepsen: NATS 2.12.1
jepsen.io
December 9, 2025 at 5:27 PM
alright, ai ben garrison is pretty darn good
November 30, 2025 at 11:05 PM
Reposted by david barsky
And sure enough, this brought the 8 second expansion time down to about 100ms! A bit more info can be found in the PR description here github.com/rust-lang/ru.... This should improve the general speed of rust-analyzer in the majority of projects, given how pervasive these kinds of macros are in Rust.
proc-macro-srv: Reimplement token trees via immutable trees by Veykril · Pull Request #21097 · rust-lang/rust-analyzer
For macros, rust has the concept of TokenTrees and TokenStreams which are basically a tree-like form of the underlying tokenized input with (){}[] delimiters forming Groups (internal nodes) and all...
github.com
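For anyone who hasn't poked at this part of rust-analyzer: here is a minimal, illustrative sketch of the token tree shape the PR description is talking about, with Arc-shared groups so subtrees can be reused without copying. These are not the actual types from the PR, just the general structure of leaves and delimited groups.

```rust
// Illustrative only: the rough shape of a TokenTree/Group structure with
// immutable, Arc-shared children. Not rust-analyzer's real implementation.
use std::sync::Arc;

#[derive(Clone, Debug)]
enum Delimiter {
    Paren,   // ( ... )
    Brace,   // { ... }
    Bracket, // [ ... ]
}

#[derive(Clone, Debug)]
enum TokenTree {
    // A single token: identifier, literal, punctuation, etc.
    Leaf(String),
    // An internal node: a delimited group of child trees. Because the children
    // sit behind an Arc, cloning a tree or sharing a subtree across macro
    // expansions never copies the underlying tokens.
    Group {
        delimiter: Delimiter,
        children: Arc<[TokenTree]>,
    },
}

fn main() {
    // Roughly `foo(bar)` as a token stream.
    let call = TokenTree::Group {
        delimiter: Delimiter::Paren,
        children: Arc::from(vec![TokenTree::Leaf("bar".into())]),
    };
    let stream = vec![TokenTree::Leaf("foo".into()), call];
    println!("{stream:?}");
}
```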
November 22, 2025 at 4:20 PM
okay i can work with this for a week or two
November 19, 2025 at 12:17 AM
the barber shaved off too much of my beard the other day and every time i look in the mirror, i get a jumpscare 😭

i think i gotta do moustache-only until my beard grows back
November 17, 2025 at 10:44 PM
OH: sharon van etten is like more shoe-gazey big thief
November 16, 2025 at 3:08 AM
so, uh, asking for a friend: is using pulumi a good idea these days? i like the programming model a lot (and _really_ like the AWS CDK), but it’s strange how many *ex*-pulumi people i know of
November 11, 2025 at 10:30 PM
claude, i assure you, i am probably one of the worst people to run this upsell on
November 6, 2025 at 1:16 AM
i hit this issue in the last two weeks! i’m glad the oxide folks wrote this up
My colleague Dave Pacheco wrote up a great description of a new (to us?) Rust async pathology, "futurelock", another extremely sharp edge to watch out for, with no particular guardrails. Of course, we'll be talking about it with @bcantrill.bsky.social on Monday's Oxide and Friends
609 - Futurelock / RFD / Oxide
rfd.shared.oxide.computer
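for anyone who hasn't read the RFD yet: a minimal sketch of the general futurelock shape, assuming tokio with the full feature set. this isn't Oxide's exact reproduction, just the pattern of a future acquiring a lock, losing a select! race, and then never being polled again while something else waits on that lock.

```rust
// a minimal sketch of the futurelock pattern (assuming tokio); this program
// deadlocks on purpose. not Oxide's exact reproduction from the RFD.
use std::{sync::Arc, time::Duration};
use tokio::sync::Mutex;

async fn takes_lock(m: Arc<Mutex<u64>>) {
    let mut guard = m.lock().await; // this future acquires the lock...
    tokio::time::sleep(Duration::from_millis(50)).await;
    *guard += 1; // ...and only releases it once polled to completion
}

#[tokio::main]
async fn main() {
    let m = Arc::new(Mutex::new(0));
    let mut fut = Box::pin(takes_lock(m.clone()));

    tokio::select! {
        _ = &mut fut => {}
        // this branch wins the race, so `fut` is left sitting in main's
        // stack frame holding the lock, and nothing ever polls it again
        _ = tokio::time::sleep(Duration::from_millis(1)) => {
            // futurelock: we wait forever on a lock owned by a future
            // that will never be polled again
            let _guard = m.lock().await;
            println!("never reached");
        }
    }
}
```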
October 31, 2025 at 10:13 PM
so this is how anthropic will get me to spend $200/month, huh
October 31, 2025 at 10:05 PM
i think one of the more surreal moments in the last few weeks was seeing @steveklabnik.com's announcement get featured by @quinnypig.com in Last Week in AWS. like, i started that jj startup! we're not even on AWS at the moment!
October 30, 2025 at 6:59 PM
Reposted by david barsky
rust-analyzer now fully uses the new trait solver! 🎉🎉 rust-analyzer.github.io/thisweek/202...
Changelog #299
Commit: 049767e · Release: 2025-10-27 (v0.3.2658)
rust-analyzer.github.io
October 27, 2025 at 11:55 AM
Reposted by david barsky
TL;DR: I'm going to be leaving @oxide.computer next month, which I'm very sad about. But it's to join @ersc.io, which I'm very excited about!
October 22, 2025 at 5:26 PM
Reposted by david barsky
I see a future in jj
Blog post: I see a future in jj by Steve Klabnik
steveklabnik.com
October 22, 2025 at 5:22 PM
the thing is, rust can have really fast compile times, but the pervasive use of macros and build scripts obscures that fact. impure proc macros and build scripts are the free parking of the rust ecosystem: benefiting the few at the cost of the many
at this point, i think i prefer `go generate` over build dot rs scripts. the latter are too annoying to parallelize and i think codegen is rare enough that being able to run it as an explicit step is worth it for faster/more reliable IDEs
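to make the “explicit step” concrete, here's a minimal sketch of the xtask-style codegen pattern that a bunch of rust projects (rust-analyzer included) use: a tiny workspace binary you run on purpose, with the generated file checked in so a plain `cargo build` never has to re-run it. the paths and generated content below are made up for illustration.

```rust
// xtask/src/main.rs: a sketch of codegen as an explicit, on-demand step.
// run deliberately with `cargo run -p xtask -- codegen`; the output file is
// checked in, so ordinary builds never execute this. paths are illustrative.
use std::{env, fs, path::Path, process::ExitCode};

fn main() -> ExitCode {
    match env::args().nth(1).as_deref() {
        Some("codegen") => {
            // a real project might render a grammar, schema, or syntax kinds
            // here; this just emits a constant to stay self-contained.
            let generated = "// @generated by `cargo run -p xtask -- codegen`\n\
                             pub const ANSWER: u32 = 42;\n";
            let out = Path::new("crates/core/src/generated.rs");
            match fs::write(out, generated) {
                Ok(()) => {
                    println!("wrote {}", out.display());
                    ExitCode::SUCCESS
                }
                Err(err) => {
                    eprintln!("failed to write {}: {err}", out.display());
                    ExitCode::FAILURE
                }
            }
        }
        _ => {
            eprintln!("usage: cargo run -p xtask -- codegen");
            ExitCode::FAILURE
        }
    }
}
```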
October 18, 2025 at 6:56 PM
no kings, only queens
October 18, 2025 at 5:04 PM