DRPV-13

Do you or your subprocessors use or disclose "anonymized," "de-identified," or otherwise masked data for any purpose other than those identified in the agreement with an institution (e.g., sharing with ad networks or data brokers, marketing, creation of profiles, analytics unrelated to services provided to institution)?

Explanation

This question is asking whether your organization or any subprocessors (third parties you work with) use data that has been stripped of identifying information ('anonymized' or 'de-identified') for purposes beyond what was explicitly agreed upon with the institution. In data privacy contexts, even when personal identifiers are removed from data, there are concerns about how this transformed data might be used. The question specifically asks about potentially problematic secondary uses like:

- Sharing with advertising networks or data brokers
- Marketing activities
- Creating user profiles
- Analytics unrelated to the contracted services

This question is being asked because:

1. Privacy regulations (like GDPR, CCPA) often restrict how data can be used, even when anonymized
2. There's growing recognition that truly anonymous data is difficult to achieve, as data can often be re-identified through combination with other datasets
3. Institutions want to ensure their data isn't being monetized or repurposed without their knowledge
4. Secondary use of data represents a potential privacy risk to the institution's users/customers

When answering this question, you should:

- Be transparent about any secondary uses of anonymized data
- Clearly explain your data handling practices and those of your subprocessors
- Reference your data processing agreements and privacy policies
- If you do use anonymized data for secondary purposes, explain the safeguards in place
- If you don't use anonymized data for secondary purposes, state this clearly

Example Responses

Example Response 1

No, neither our organization nor our subprocessors use or disclose anonymized, de-identified, or masked data for any purpose beyond those explicitly defined in our service agreements. All data processing activities, including those involving anonymized data, are strictly limited to providing and improving the contracted services. We maintain comprehensive data processing agreements with all subprocessors that explicitly prohibit any secondary use of customer data, regardless of its form. Our internal policies require that all data processing activities be tied directly to service delivery, and we conduct regular audits to ensure compliance with these policies.

Example Response 2

Yes, we do use anonymized data for certain secondary purposes, but only with explicit disclosure and consent from our institutional customers. Our Master Service Agreement includes an optional data addendum that allows institutions to opt in to specific secondary uses of anonymized data for industry benchmarking and research purposes. These activities are clearly defined in our agreements, and institutions can opt out at any time. We never share this anonymized data with ad networks or data brokers, nor do we use it for marketing purposes. All subprocessors are contractually bound to these same limitations, and we maintain a comprehensive inventory of all data uses that is available to our customers upon request.

Example Response 3

We currently do not have controls in place to prevent secondary use of anonymized data. While our primary systems and processes focus on delivering our core services, our analytics team does use de-identified data to develop market insights that we occasionally share with partners and use to inform our marketing strategies. Additionally, some of our subprocessors may use aggregated data from multiple customers for their own product development purposes. We recognize this is an area for improvement in our privacy program, and we are currently developing more robust policies and contractual safeguards to better control how anonymized data is used throughout our supply chain. In the meantime, we're happy to discuss specific concerns and potential mitigations on a case-by-case basis.

Context

Tab: Privacy
Category: Data Privacy

ResponseHub is the product I wish I had when I was a CTO

Previously I was co-founder and CTO of Progression, a VC-backed HR-tech startup used by some of the biggest names in tech.

As our sales grew, security questionnaires quickly became one of my biggest pain points. They were confusing, hard to delegate and arrived like London buses - three at a time!

I'm building ResponseHub so that other teams don't have to go through this. Leave the security questionnaires to us so you can get back to closing deals, shipping product and building your team.

Signature
Neil Cameron
Founder, ResponseHub