@nathaniel and @admin_mike - Is it possible (or could it be made possible) to set up memory so that it is NOT collected but IS used? Here is the scenario: we want to populate user memories programmatically with data we have captured for our users externally (via the API). We DON'T want the user's conversations to update these memories, but we DO want the memories to be used in conversations.
For additional context, we build brand profiles with clients, and we would want to prepopulate them and use our Make.com processes to keep them updated. This lets us ensure the profiles contain all the essential information for the pickaxes. It would also let us work around the 10-memory limit, since we can put several pieces of data about the user into a single memory: their name, title, company name, location, etc. (I think this should work). We have some 20-30 pieces of profile information that we want to leverage with memory.
How can I call or use the user memory stored in role or model reminders so that the AI remembers it at the start of a conversation? Whatever role or model reminder prompt I write, it doesn't work; the AI keeps asking me to provide those details again instead of recalling them automatically.
If you'd like an additional "Use only" option without collection, we recommend posting it under Feature Requests so other users can vote and discuss it.
You can create user memories programmatically using the Studio API:
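As a rough sketch of what that could look like: the snippet below packs several profile fields into a single memory string (the workaround mentioned above for the 10-memory limit) and shows where an API call would go. The endpoint URL, header names, and payload field names here are assumptions for illustration only; check the Studio API documentation for the actual contract.

```python
import json

# Hypothetical endpoint -- replace with the real Studio API URL from the docs.
STUDIO_API_URL = "https://api.example.com/v1/user-memories"

def build_memory_payload(user_id, profile):
    """Pack several profile fields into one memory entry so 20-30 pieces
    of profile data fit within the 10-memory limit."""
    memory_text = "; ".join(f"{key}: {value}" for key, value in profile.items())
    # Field names ("userId", "memory") are assumptions, not the documented schema.
    return {"userId": user_id, "memory": memory_text}

payload = build_memory_payload(
    "user_123",
    {"name": "Ada", "title": "CMO", "company": "Acme", "location": "Austin"},
)
print(json.dumps(payload))

# To actually create or update the memory, you would POST the payload,
# e.g. with the requests library:
#   requests.post(STUDIO_API_URL, json=payload,
#                 headers={"Authorization": "Bearer <YOUR_API_KEY>"})
```

A Make.com HTTP module could send the same JSON body on a schedule, which would keep the prepopulated profiles in sync with your external data.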
For now, you can try prompting the model not to update specific memory fields. With careful prompting, you may be able to achieve the intended behavior, and you can also edit user memories later to refine them further.
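For example, a role-prompt instruction along these lines may discourage the model from overwriting the prepopulated fields (exact wording will need experimentation; this is a sketch, not guaranteed behavior):

```text
The user's brand profile (name, title, company, location, etc.) is
already stored in user memory. Treat these fields as read-only
reference data: use them in your answers, but never update, rewrite,
or ask the user to re-confirm them.
```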
Thank you for reaching out. User memory is not stored in the role prompt or model reminder. To make the AI model recall information automatically, you'll need to create user memories in the Users tab of your Studio under User Memories.
If you meant something different or are referring to another part of your setup, please clarify so we can assist you more accurately.