Question: "This is a message from user."
Bot answer: {"answer": something something…}
Happening on all bots!
Example 2:
{ "thought": "The user wants a social media post (FB & IG) for a laundry service pick-up point (XXXXX). The content is specific: Christmas wishes and opening hours for today (until 16:00).
EDIT.
Seems the problem is happening on Gemini 3 Pro and Gemini 3 Flash; no bugs on Grok 4.1 Fast.
EDIT 2. Seems this is an issue caused by Actions in the Gemini 3 Preview models, according to the Google AI forums. If possible, update the models to non-preview ones (if they are available). For anyone else trying to fix their answers: simply disable all Actions/Tools on your bot and the answers return to normal.
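If you can't disable Actions/Tools, a stopgap some people use is a small client-side filter: when the bot leaks its raw structured reply (as in the examples above), pull the usable text back out before showing it to the user. This is a minimal sketch, not part of Pickaxe itself, and the `extract_answer` helper and the `"answer"`/`"thought"` keys are assumptions based on the leaked output shown in this thread:

```python
import json

def extract_answer(raw_reply: str) -> str:
    """Fallback for the Gemini 3 Preview leak: if the bot returns its raw
    JSON (e.g. {"thought": ..., "answer": ...}) instead of plain text,
    extract the usable field; otherwise return the reply unchanged."""
    text = raw_reply.strip()
    if not text.startswith("{"):
        return text  # normal plain-text reply, nothing to fix
    try:
        payload = json.loads(text)
    except json.JSONDecodeError:
        return text  # truncated or malformed JSON: pass through as-is
    if isinstance(payload, dict):
        # prefer the final answer; fall back to the leaked "thought"
        for key in ("answer", "thought"):
            if isinstance(payload.get(key), str):
                return payload[key]
    return text

# a leaked structured reply yields just the answer text
print(extract_answer('{"thought": "plan the post", "answer": "Merry Christmas!"}'))
# a normal reply passes through untouched
print(extract_answer("Merry Christmas!"))
```

This only papers over the symptom, so treat it as a temporary patch until the non-preview models land.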
I’m making a pickaxe that outputs raw HTML code with Gemini 3 Flash as the underlying model. It’s getting maxed out (output token exhaustion) without the max-out error that the full API model returns, meaning this is likely the preview model.
When an AI generates long code blocks (especially HTML with extensive internal CSS), it often hits the “Max Tokens” limit of the specific model (e.g., GPT-5.1 or Claude 4.5 Sonnet). It doesn’t “error out”; it simply stops mid-sentence when it runs out of “room” to talk.
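You can usually tell this silent cutoff apart from a real failure by inspecting the finish reason on the response. The sketch below uses hypothetical dict-shaped responses modeled loosely on the Gemini API's `finishReason` field; the exact field names and enum values depend on your SDK and version, so treat this as an illustration of the check, not a drop-in implementation:

```python
def is_truncated(response: dict) -> bool:
    """True when the model stopped because it ran out of output tokens.
    Token exhaustion is reported as a finish reason, not as an error,
    which is why long HTML just stops mid-tag with no exception raised."""
    candidates = response.get("candidates", [])
    return any(c.get("finishReason") == "MAX_TOKENS" for c in candidates)

# Illustrative responses (shapes assumed, not real API output):
complete = {"candidates": [{"finishReason": "STOP", "content": "<html>...</html>"}]}
cut_off = {"candidates": [{"finishReason": "MAX_TOKENS", "content": "<html><style>body{"}]}

print(is_truncated(complete))  # False
print(is_truncated(cut_off))   # True
```

When this check fires, the usual remedies are raising the max output tokens (if the model allows it) or asking the model to continue from where it stopped.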
Thanks for taking the time to dig into this and for sharing the Google AI forum thread. I completely get the frustration here. Newer models like Gemini 3 are exciting, but early releases do sometimes come with rough edges like this, especially around Actions and Tools.
From what we’re seeing, this is coming from the Gemini 3 Preview models themselves rather than anything specific to your Pickaxe setup. Unfortunately that means there isn’t much we can reliably fix on our side right now, beyond flagging it upstream.
For the moment, the best workaround is to switch to a non-preview model or a different provider until Google resolves the issue. We’re actively keeping an eye on the Google AI forum and will share updates as soon as there’s something concrete to act on.