Apoorv Khandelwal
@apoorvkh.com
820 followers 200 following 16 posts
cs phd student at brown https://apoorvkh.com
apoorvkh.com
Will be at ACL this week! #ACL2025 #ACL2025NLP

Presenting Tian Yun’s paper on abstract reasoners at CoNLL on Thursday.

I’ve been investigating how LLMs internally compose functions lately. Happy to chat about that (among other things) and hang out in Vienna!
apoorvkh.com
Curious how many papers were assigned to each reviewer on average! Review quality seems better than usual from my small sample. Wondering if that correlates with a lower reviewer load? E.g., I only received 2 papers to review.
Reposted by Apoorv Khandelwal
eugenevinitsky.bsky.social
Testing models on USAMO problems immediately after they were posted yields surprisingly bad performance. Suggests there's much more training on the test set than expected.
arxiv.org/abs/2503.219...
Scores of R1, Flash-Thinking, Claude 3.7, QwQ, o1-pro, and o3-mini on USAMO 2025: all less than 5% of the max score.
Reposted by Apoorv Khandelwal
sashamtl.bsky.social
Just read that AI’s energy consumption in data centers is nothing to be worried about because most of the hyperscale datacenters running AI are "powered by renewable energy or low-carbon nuclear power."

Let's debunk that, shall we?
Reposted by Apoorv Khandelwal
nsaphra.bsky.social
If you're in the northeastern US and you're submitting a paper to COLM on March 27, you should absolutely be sending its abstract to New England NLP on March 28.
New England NLP Meeting Series
nenlp.github.io
apoorvkh.com
+ No system pre-reqs, multi-stage PyTorch workflows in one script, CLI integrations, catching system failures as exceptions, SLURM support, better logging, and so much more!

Additional fine-tuning examples in our docs with:
@pytorch.org, Deepspeed, @lightningai.bsky.social, HF Accelerate
apoorvkh.com
A cool side-effect: fine-tune any LLM (from @huggingface transformers) on any text dataset *with multiple nodes* in just *one command*.

torchrun.xyz/examples/tra...
apoorvkh.com
It's a replacement for CLI tools like "torchrun".

Most basic usage: specify some (SSH-enabled) machines you want to parallelize your code on. Then launch a function onto that configuration.

All from inside your Python script!
apoorvkh.com
We made a library (torchrunx) to make multi-GPU / multi-node PyTorch easier, more robust, and more modular! 🧵

github.com/apoorvkh/tor...
Docs: torchrun.xyz

`(uv) pip install torchrunx` today!

(w/ the very talented Peter Curtin, Brown CS '25)
GitHub - apoorvkh/torchrunx: Easily run PyTorch on multiple GPUs & machines
github.com
Reposted by Apoorv Khandelwal
lambdaviking.bsky.social
✨How does the depth of a transformer affect its reasoning capabilities? New preprint by myself and @Ashish_S_AI shows that a little depth goes a long way to increase transformers’ expressive power

We take this as encouraging for further research on looped transformers!🧵
Paper: A Little Depth Goes a Long Way: The Expressive Power of Log-Depth Transformers
Reposted by Apoorv Khandelwal
soniakmurthy.bsky.social
(1/9) Excited to share my recent work on "Alignment reduces LM's conceptual diversity" with @tomerullman.bsky.social and @jennhu.bsky.social, to appear at #NAACL2025! 🐟

We want models that match our values...but could this hurt their diversity of thought?
Preprint: arxiv.org/abs/2411.04427
apoorvkh.com
I started a blog! First post is everything I know about setting up (fast, reproducible, error-proof) Python project environments using the latest tools. These methods have saved me a lot of grief. Also a short guide to CUDA in appendix :)

blog.apoorvkh.com/posts/projec...
Managing Project Dependencies
blog.apoorvkh.com
apoorvkh.com
I think typing my code and using a linter (ruff) + static type checker (pyright) saves me a lot of grief.
Reposted by Apoorv Khandelwal
jamestompkin.bsky.social
Can GANs compete in 2025? In 'The GAN is dead; long live the GAN! A Modern GAN Baseline', we show that a minimalist GAN w/o any tricks can match the performance of EDM with half the size and one-step generation - github.com/brownvc/r3gan - work of Nick Huang, @skylion.bsky.social, Volodymyr Kuleshov
Reposted by Apoorv Khandelwal
nsaphra.bsky.social
Let he who hath not \usepackage[subtle]{savetrees}
Reposted by Apoorv Khandelwal
dorialexander.bsky.social
“They said it could not be done.” We’re releasing Pleias 1.0, the first suite of models trained on open data (either permissively licensed or uncopyrighted): Pleias-3b, Pleias-1b and Pleias-350m, all based on the two-trillion-token set from Common Corpus.
apoorvkh.com
I’m an ex-Paperpile user and have been liking Zotero lately! Free storage from the university helps.
Reposted by Apoorv Khandelwal
benlipkin.bsky.social
Lots of folks talking about scaling LLM inference over this last year

Internally, I’ve been developing and using a library that makes this extremely easy, and I decided to open-source it
Meet the decoding library: github.com/benlipkin/de...

1/7
GitHub - benlipkin/decoding: Composable inference algorithms with LLMs and programmable logic
github.com
apoorvkh.com
“Turn” a decoder into an encoder with LLM2Vec (github.com/McGill-NLP/l...). Seen at COLM 2024 :)

If you want the naive, training-free / model-agnostic approach: their related work section says it is most common to use the final token’s last hidden state.
GitHub - McGill-NLP/llm2vec: Code for 'LLM2Vec: Large Language Models Are Secretly Powerful Text Encoders'
github.com
Reposted by Apoorv Khandelwal
joestacey.bsky.social
Okay, genius idea to improve the quality of #nlp #arr reviews: literally give gold stars to the best reviewers, visible on OpenReview next to your anonymous ID during the review process.

Here’s why it would work, and why you should RT this fab idea:
apoorvkh.com
Thanks and great! Hope you are likewise doing well!
apoorvkh.com
Would be great to join, thanks!