DPAI-02

Is any institutional data retained in AI processing?

Explanation

This question is asking whether your AI systems store or keep any data from the educational institution during or after processing. 'Institutional data' refers to any information owned by or pertaining to the institution, which could include student records, research data, administrative information, or other sensitive data.

This question appears in security assessments because AI systems often need to process data to function, but retaining that data creates security and privacy risks. Educational institutions are particularly concerned because they handle sensitive information protected by regulations like FERPA (Family Educational Rights and Privacy Act) and may hold research data with intellectual property implications.

When AI systems retain institutional data, it creates several risks:

1. Data breach exposure if the AI vendor is compromised
2. Potential misuse of data for training other AI models without permission
3. Compliance violations if data retention exceeds authorized periods
4. Privacy concerns if personal information is stored longer than necessary

To best answer this question, you should:

- Be specific about what data is retained (if any) and for how long
- Explain the purpose of any data retention
- Describe security controls protecting retained data
- Reference any data deletion processes or policies
- Mention if you have a 'zero retention' policy where applicable
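If it helps to organise your answer, the sketch below shows one hypothetical way a vendor might record its AI data-retention position as structured data, so the questionnaire response can be checked against an actual, documented policy. The field names and example values are illustrative assumptions, not a prescribed or standard format.

```python
# Hypothetical, illustrative retention-policy record; field names and values
# are assumptions for illustration, not a required schema.
AI_DATA_RETENTION_POLICY = {
    "retains_institutional_data": True,
    "retained_items": [
        {"data": "query logs and metadata", "purpose": "troubleshooting", "max_days": 30},
        {"data": "error samples", "purpose": "resolving complex issues", "max_days": 90},
    ],
    "used_for_model_training": False,
    "security_controls": ["encryption at rest", "role-based access control"],
    "deletion_on_request": True,
}
```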

Example Responses

Example Response 1

No, our AI solution does not retain any institutional data after processing. We operate on a zero-retention model where all data is processed in memory and immediately discarded once the processing task is complete. No institutional data is stored, cached, or used to train our AI models. We have implemented technical controls to ensure data cannot be inadvertently persisted, including memory-wiping procedures after each processing session and regular third-party audits of our data handling practices.
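For illustration only, here is a minimal sketch of what an in-memory, zero-retention processing path might look like. The function name and the model_client object are hypothetical placeholders, not a description of any particular vendor's implementation.

```python
# Minimal sketch of a zero-retention processing pattern. The model_client
# object and its complete() method are hypothetical placeholders.

def process_in_memory(institutional_text: str, model_client) -> str:
    """Process institutional data entirely in memory and return only the result.

    The raw input is never written to disk, logged, or queued for training;
    the local reference is dropped as soon as the output has been produced.
    """
    try:
        return model_client.complete(prompt=institutional_text)
    finally:
        # Avoid any application-level persistence of the raw input. (Full
        # memory wiping would require lower-level controls; this only ensures
        # nothing is stored or cached by this code path.)
        del institutional_text
```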

Example Response 2

Yes, our AI system temporarily retains certain institutional data for specific operational purposes. Specifically, we retain query logs and associated metadata for 30 days to support troubleshooting and service improvement. Additionally, error samples may be retained for up to 90 days when needed to resolve complex issues. However, we do not retain full documents or sensitive personal information, and we do not use institutional data to train our AI models. All retained data is encrypted at rest, access-controlled through role-based permissions, and automatically purged after the retention period. Customers can request immediate deletion of any retained data at any time through our admin portal.
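As a rough illustration of the automatic purge described in this response (30-day query logs, 90-day error samples), here is a minimal sketch assuming retained records live in database tables with a created_at timestamp. The table and column names are hypothetical assumptions.

```python
# Minimal sketch of an automated retention purge. Table and column names
# are hypothetical assumptions for illustration.
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_PERIODS = {
    "query_logs": timedelta(days=30),     # troubleshooting / service improvement
    "error_samples": timedelta(days=90),  # resolving complex issues
}

def purge_expired(conn: sqlite3.Connection) -> None:
    """Delete any retained records that have exceeded their retention period."""
    now = datetime.now(timezone.utc)
    for table, period in RETENTION_PERIODS.items():
        cutoff = (now - period).isoformat()
        conn.execute(f"DELETE FROM {table} WHERE created_at < ?", (cutoff,))
    conn.commit()
```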

Example Response 3

We have not implemented specific controls regarding institutional data retention in our AI processing. While our system does cache processed data to improve performance, we have not established formal retention periods or automatic deletion processes for this data. Our engineering team can manually delete cached data upon request, but we do not currently track what institutional data might be retained in our systems or for how long. We're working to improve our data governance practices in this area and expect to implement more robust controls in our next major release.

Context

Tab: Privacy
Category: Privacy and AI
