Frequently asked questions
Do we have to disclose our algorithm in a DSAR?
No. Data protection law does not require organisations to reveal source code, proprietary algorithms, or detailed weightings. However, you must provide meaningful information about how the output was generated and what it means for the individual.
Are AI-generated scores or risk ratings really personal data?
Yes, if they relate to an identifiable individual. Even if the person did not provide the information directly, scores, classifications, and predictions generated about them are likely to be personal data.
What does ‘meaningful information’ mean in practice?
It means explaining, in clear language, what data was used, how the system generally works, and what the outcome represents. The explanation should allow the individual to understand and, where relevant, challenge the result.
Do we only need to explain the AI-generated outputs if the decision was fully automated?
No. Even where a human is involved in the final decision, AI-generated outputs that relate to an individual may still need to be disclosed in a DSAR. Additional safeguards apply where decisions are based solely on automated processing and have significant effects.