Gal Zahavi
@galzahavi.bsky.social
Shaping the dialogue between developers and AI, right from the terminal | Gemini CLI @ Google | #AI
Everything stays in context, making your workflow smoother and more powerful.

If you want to read more about it, check out this blog post: developers.googleblog.com/en/say-hello...
Say hello to a new level of interactivity in Gemini CLI - Google Developers Blog
October 15, 2025 at 5:19 PM
You need more than one knife in the kitchen! So awesome that the CLI is the one you grab every day!
August 27, 2025 at 7:49 PM
It also provides a reliable point to restart from if any issues arise, especially when using an LLM, ensuring you can quickly get back on track.
August 26, 2025 at 5:44 AM
Breaking down large tasks into smaller, manageable subtasks is a great way to stay organized. I recommend committing after completing each subtask and staging your changes frequently. This keeps finished code cleanly separated from what's still in progress.
August 26, 2025 at 5:43 AM
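As an illustration of that commit-per-subtask habit, here is a minimal Python sketch; the helper name and commit message are placeholders, and it assumes git is on your PATH and you're inside a repository with staged-or-unstaged changes:

```python
import subprocess

def commit_subtask(message: str) -> None:
    """Stage all current changes and commit them as one finished subtask."""
    subprocess.run(["git", "add", "-A"], check=True)
    subprocess.run(["git", "commit", "-m", message], check=True)

# Run after each subtask you (or the agent) finish, so completed work
# is separated from whatever is still in progress.
commit_subtask("Add input validation for the upload form")
```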
Thanks for the feedback! If you haven't already, please file a bug or feedback report on the Gemini CLI repo. This is the best way to ensure the team gets this feedback.
August 19, 2025 at 4:36 AM
That looping behavior is incredibly frustrating. You can set maxSessionTurns in your settings, which stops the session after N turns. Additionally, we recently shipped an update with a dedicated loop check; it's designed to recognize when the model is stuck in a repetitive cycle and automatically stop the process.
August 8, 2025 at 1:04 PM
That loop is incredibly frustrating. You can set maxSessionTurns in your settings to help; this safety net automatically stops token-wasting requests after N turns. Also, clearing the context when starting a new task or subtask can improve response quality, since context rot has a big impact.
August 8, 2025 at 12:56 PM
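As an illustration of where that maxSessionTurns setting lives, here is a minimal Python sketch; it assumes the user-level Gemini CLI settings file at ~/.gemini/settings.json, and the limit of 50 is only an example value:

```python
import json
from pathlib import Path

# Assumed user-level settings file; a project-level .gemini/settings.json
# could be patched the same way.
settings_path = Path.home() / ".gemini" / "settings.json"

# Load the existing settings (or start fresh), cap the session at 50 turns,
# and write the file back. Pick whatever limit fits your workflow.
settings = json.loads(settings_path.read_text()) if settings_path.exists() else {}
settings["maxSessionTurns"] = 50
settings_path.parent.mkdir(parents=True, exist_ok=True)
settings_path.write_text(json.dumps(settings, indent=2) + "\n")
```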