For this group, I am thinking that one of the most valuable things would be a kind of stored form or history for parents, so they don’t need to re-enter their kids’ entire histories every time they go looking for something: home care, financial assistance, therapists, financial planning, etc.
Would User Memory hold facts like “they have a 19-year-old autistic son, David,” or would that be better kept in a database?
SOC 2 provides a level of credibility for this market, right, in terms of general security?
But HIPAA? I see there is another open question on that.
I am proposing to start this app on Pickaxe to take advantage of the User Memory and Knowledgebase features.
Hi @kenlyle, this is a good question! Ultimately, the answer depends on your users’ level of comfort, as I imagine parents have varying degrees of concern over privacy. Below is a short summary of answers to privacy-related questions; hopefully this is helpful to you, and it’s all information that you can safely pass on to your users!
Pickaxe has robust security protocols to protect user data. The data you upload to Pickaxe is kept on our servers within AWS, and is encrypted in transit and at rest. The specifics of our privacy policy are laid out in our Terms of Service, and we have released a short video detailing many of the specifics, but in short: Pickaxe will never train models on the data you upload to the knowledge base. We are not currently training any models on Pickaxe user data, though we do reserve the right to train models on user interactions with Forms or Chatbots. However, if you are a Pro customer, you are automatically opted out of this policy, and user interactions cannot be used for training purposes.
For details of the security currently offered by our platform, feel free to view the Pickaxe Trust Center, set up with help from our partners at Vanta, which provides an in-depth look at infrastructure, organization, and product security, as well as our internal security procedures. Pickaxe is now SOC 2 compliant, which is a strong signal of trust in our security protocols and should help users and end users alike feel confident keeping their data with us. Pickaxe is also officially GDPR compliant, which involves implementing measures such as data encryption, access control, and user consent management. We are also happy to work with users on establishing HIPAA compliance (though this would require a BAA with at least one of our model providers). If users have further questions about security specifics, they can always reach out to us at info@pickaxeproject.com.
This is an excellent idea. I’ll share a bit from my own experience building a mental health Pickaxe for an organization working with kids with special needs. From a compliance perspective, it really depends on the country, the organization involved, and how conservative they choose to be.
HIPAA is a US law, so it primarily applies within the United States. I’ve built mental health tools on Pickaxe for a US-based licensed entity, and they were comfortable with Pickaxe’s security setup. The key factor was how the service was framed.
In our case, the tool was positioned as a support and guidance layer, not a clinical system or medical record. We avoided storing identifiable health records in AI memory and treated the chatbot as an assistive interface rather than a replacement for providers. That distinction made a big difference in how compliance was evaluated.
For anything that looks like long-term records or sensitive histories, I’d recommend keeping that in a separate system and letting Pickaxe focus on interaction, navigation, and personalization. That approach keeps things flexible while staying on the safer side.
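To make the separation concrete, here is a minimal sketch of the pattern described above: sensitive family records live in a separate, access-controlled store, and the chatbot layer only ever receives a pseudonymous token plus coarse, non-identifying context. All names here (`RecordStore`, `context_for_chat`, the field names) are hypothetical illustrations, not part of the Pickaxe API.

```python
import uuid

class RecordStore:
    """Stands in for a separate records system, outside the AI layer."""

    def __init__(self):
        self._records = {}  # token -> full family profile

    def add_family(self, profile):
        # Key records by a random pseudonymous token, never by name.
        token = str(uuid.uuid4())
        self._records[token] = profile
        return token

    def context_for_chat(self, token):
        """Return only coarse, non-identifying fields for the chatbot."""
        profile = self._records[token]
        return {
            # Bucket the exact age into a band instead of exposing it.
            "dependent_age_band": "18-21" if 18 <= profile["age"] <= 21 else "other",
            # Service needs drive navigation and personalization.
            "needs": profile["needs"],
        }

# The full record (with name and exact age) stays in the store...
store = RecordStore()
token = store.add_family(
    {"name": "David", "age": 19, "needs": ["home care", "therapy"]}
)

# ...while the chatbot sees only the stripped-down context.
chat_context = store.context_for_chat(token)
```

The design choice is that the AI layer can still personalize (suggest home-care and therapy resources for a young adult) without ever holding an identifiable health record, which is the framing that made compliance review easier in the case described above.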