Chat response very delayed and no dots or icon to show it is working

Hi, my chats are very delayed in responding when a user asks a question. I would be OK with that if there were a way to show them that something was actually happening. It just goes blank after the send, and then about 20 seconds later a response shows up. The user has no idea that it's responding because there are no typing dots or anything at all, so it just looks like it's broken. Is there a setting that can show ChatGPT is thinking or typing after they hit send, so they know it's working and just slow?

Hi @clevra,
I took a quick look at a few of the Pickaxe setups in your studio, and the slow response time seems to come from how the prompts are structured. Many of your tools have very heavy Model Reminders that repeat or even conflict with the main role prompt. When the model has to untangle overlapping instructions, it slows down and sometimes looks frozen to the user.

A helpful thing to remember is that the Model Reminder is not the place to define the role.

- The Role Prompt is where you set the identity, behavior, tone, and main rules.
- The Model Reminder should only include small nudges like "keep answers short" or "follow the structure."

When big role instructions live in both places, the model hesitates.
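As a rough sketch of that split (the assistant name and wording below are invented for illustration, not from your studio):

```text
Role Prompt (the full definition lives here):
  You are a friendly onboarding assistant for Acme Support.
  Answer in plain language, point users to the relevant help
  article, and hand billing questions off to a human agent.

Model Reminder (small nudges only):
  Keep answers under three sentences.
  Follow the numbered structure.
```

Notice the reminder never restates who the assistant is; it only reinforces formatting rules already implied by the Role Prompt, so the two can't conflict.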

I’d suggest starting small. Clear out the complex text from the Model Reminder, keep the full role definition in the Role Prompt, and build one piece at a time. This makes it much easier to spot what is causing delays and helps keep the model focused.

Once you simplify things, you should see much smoother responses. If the issue continues, feel free to reach out at info@pickaxeproject.com and we’ll help you take a closer look.

Hi Abhi, thanks so much for your detailed response. It worked perfectly even with the Model Reminder on ChatGPT 4o, but stopped working when I switched to ChatGPT 5, which is why I assumed it was a bug. I will revert them all back to the older model and will also remove the Model Reminder.


Glad to hear it worked. 🙂
GPT-5.1 is a bit slower these days, even inside native ChatGPT 😄, so what you're seeing is pretty normal. It's mostly a supply-and-demand thing: when everyone piles onto the newest model at once, it tends to take its time…

You might want to try a faster model for everyday use or turn off reasoning if your tool doesn’t need it. That usually speeds things up right away.
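If you ever call the model through the OpenAI API directly rather than through the Pickaxe UI, the same two knobs show up as per-request parameters. This is only a sketch under assumptions: the exact model names and the accepted `reasoning_effort` values change over time, so check the current API docs before relying on them.

```python
def build_request(prompt: str, fast: bool = True) -> dict:
    """Assemble a chat request body that favors speed over deep reasoning.

    Model names and reasoning_effort values here are assumptions;
    verify them against the current OpenAI model list.
    """
    return {
        # Smaller models respond noticeably faster for everyday use.
        "model": "gpt-5-mini" if fast else "gpt-5",
        # Dialing reasoning effort down skips most of the "thinking" delay.
        "reasoning_effort": "minimal" if fast else "medium",
        "messages": [{"role": "user", "content": prompt}],
    }
```

The idea is the same as in the Pickaxe settings: pick the smallest model that handles your task, and only pay for reasoning when the tool actually needs it.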

If anything still feels off after simplifying the prompt, I’m always happy to take another look.