Models repeat their message in testing

When I test my prompts, the models tend to repeat their messages twice. Is there a way to combat this, or is this just a thing that happens in the builder?

Hi @hurmuli! Are you using Grok 4.20 when this happens, or a different model?

It’s happening on all models I’ve tested, for some reason.

@lindsay_support

Note that this issue is pretty serious, since it’s doubling the price of each message. Not sure how to combat it: prompting the agents doesn’t seem to help, and changing the model has no effect either. It’s also doing this doubling in Studio, so it’s not just in the builder :frowning:

Here are some more chat IDs where the message is repeated:

50713593-1944-4efd-bd79-94afea2f542b

4BV90TUC8U959ZCH4IYY

2A37P0DABJWXH6TXCLKZ

Thanks so much for sharing the chat IDs. We can see the duplicate assistant turns in those transcripts and have engineering debugging it now, so you don’t need to change any prompts or swap models on your side. I’ll come back to this thread as soon as we have a fix in place. If you notice it only happening after a certain tool fires or when a specific feature (memory, web, etc.) is enabled, feel free to DM me and I’ll add it to the repro notes.

@hurmuli Hi, can you please try again and see if it’s fixed for you?


No repeating at the moment, but I will send a message if the issue arises again. Thank you for fixing it so quickly.


@stephenasuncion / @luna_support / @lindsay_support

It seems the issue has returned.

Session ID: LBUG9DK1JH3WBYTVKXJU

@stephenasuncion / @luna_support / @lindsay_support

Session ID: b1c175f4-2261-41ac-af54-d287ae297756

@stephenasuncion / @luna_support / @lindsay_support

Another one: O481RQKCD0NAXXJZWSE4

Note that I’ve now tested this on different models, and they all keep repeating in the same way, even with an explicit prompt telling them to stop.

Hi,

I am looking into it.


@hurmuli
Thanks for sharing the additional Session IDs; that really helps.

I’ve created a new ticket (PRD-774) and we’re taking a deeper look into this one. I know this had come up before and was expected to be resolved, so I completely understand how frustrating it is to see it reappear. This time, we’ll dig further to make sure we get to the root cause and fix it properly.

Really appreciate you taking the time to document and share all these details; it makes a big difference in helping us move faster and more accurately. I’ll keep a close eye on this and keep you posted.


@hurmuli hi Toni!

Just deployed another fix to production, shoutout to Stephen for doing the legwork, but please let us know if this persists :cook:

As always, thank you and talk soon my friend!


Seems the issue is fixed again, big thanks to the whole team. :slight_smile:
