Mike
@grabbou.com
Objective-C was great 💔 CTO and founder at Callstack, React Native core contributor
If you haven't done it yet, you should also definitely subscribe to the weekly newsletter by @sebastienlorber.com. It's packed with an insane amount of information from both React and React Native! thisweekinreact.com
This Week In React | This Week In React
A weekly newsletter to stay up-to-date with React & React-Native
thisweekinreact.com
September 22, 2025 at 3:00 PM
You can read each issue here on our website www.callstack.com/newsletter, or subscribe!
React Native Newsletter | Expert Insights & Ecosystem Updates | Callstack
Stay on top of the React & React Native ecosystem with our developer newsletter. Get exclusive insights, trends, and updates from core contributors.
www.callstack.com
September 22, 2025 at 3:00 PM
Try it out today with `@react-native-ai/apple` and the AI SDK!
September 19, 2025 at 2:32 PM
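A rough usage sketch of the idea (stubbed: the real exports live in `@react-native-ai/apple` and the AI SDK packages, and the names below are placeholders, not the actual API): an AI-SDK-style `generateText` call accepts any compatible model provider, so an on-device Apple model slots into the same call shape as a remote one.

```typescript
// Sketch only: `appleModel` stands in for whatever @react-native-ai/apple
// exports; `generateText` is stubbed here so the call shape is visible.
type GenerateArgs = {
  model: (prompt: string) => Promise<string>;
  prompt: string;
};

// Minimal stand-in for an AI-SDK-style generateText call
async function generateText({ model, prompt }: GenerateArgs): Promise<{ text: string }> {
  return { text: await model(prompt) };
}

// Stub of an on-device model provider
const appleModel = async (prompt: string) => `on-device answer to: ${prompt}`;

generateText({ model: appleModel, prompt: "Hello" }).then(({ text }) =>
  console.log(text) // "on-device answer to: Hello"
);
```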
With React Native AI, we ship precompiled libraries for the most popular models, so this setup step is reduced to zero.
September 19, 2025 at 2:32 PM
What are prebuilds in this context? (Not React Native this time!)

MLC needs its runtime and model libraries compiled before use. Until now, that meant extra manual steps before you could run a model locally.
September 19, 2025 at 2:32 PM
Get started today. Check out our repository github.com/callstackin...
GitHub - callstackincubator/ai: On-device LLM execution in React Native with Vercel AI SDK compatibility
On-device LLM execution in React Native with Vercel AI SDK compatibility - callstackincubator/ai
github.com
September 18, 2025 at 1:48 PM
The MLC provider is part of a broader suite of our tools for running AI on mobile devices. Check them all out here and star the repo!
🔗 github.com/callstackin...
September 17, 2025 at 2:15 PM
It's an interesting alternative: you can mix both providers, e.g. run Apple on-device first while your remote model downloads in the background.
x.com/grabbou/sta...
September 17, 2025 at 2:15 PM
Unlike the Apple provider, the MLC provider doesn’t rely on system-wide models. It downloads weights for each model you use, which can take time.
September 17, 2025 at 2:15 PM
What is going on here?
- MLC is an optimized runtime for running different models on iOS, Android, and desktop GPUs
- We download the model weights onto the device
- Finally, we load it into the engine and set up everything needed for inference
September 17, 2025 at 2:15 PM
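The steps above can be sketched as a download-then-load pipeline (all names here are illustrative stubs; the real library wires this up behind its AI SDK interface, and the model id is just an example):

```typescript
// Hypothetical sketch of the flow: download weights, then load the engine.
type Engine = { model: string; loaded: boolean };

// Step 2: fetch the weights for a given model id (stubbed here --
// the real implementation downloads them onto the device).
async function downloadWeights(modelId: string): Promise<string> {
  return `/data/models/${modelId}`; // path where the weights would land
}

// Step 3: load the weights into the engine so inference can start.
async function loadIntoEngine(weightsPath: string): Promise<Engine> {
  return { model: weightsPath, loaded: true };
}

async function prepare(modelId: string): Promise<Engine> {
  const weights = await downloadWeights(modelId);
  return loadIntoEngine(weights);
}

prepare("example-3B-model").then((engine) => console.log(engine.loaded)); // true
```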
We cover the common causes and solutions:
- Reverse ADB port
- Update network configs
- Reset emulator data

Step-by-step here: www.callstack.com/blog/debugg...
September 16, 2025 at 3:00 PM
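For the first item on that list, the usual fix is ADB's reverse port forwarding so the emulator or device can reach Metro on the host (port 8081 is Metro's default; adjust if yours differs):

```shell
# Forward the device's port 8081 back to the host, where Metro listens
adb reverse tcp:8081 tcp:8081

# List active reverse forwards to confirm it took effect
adb reverse --list
```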
Here's the repo: github.com/callstackin...
callstackincubator/ai
On-device LLM execution in React Native with Vercel AI SDK compatibility - callstackincubator/ai
github.com
September 15, 2025 at 3:00 PM
Thank you Dan for all your work on Bluesky and React Native! I enjoyed our discussions on decentralization. You made things much easier for all of us and moved React Native even further! Excited for what's next for you, and enjoy your family time. I'll try to catch you next time I'm in London!
February 4, 2025 at 9:43 PM