DPIA When introducing AI

DPIAs and AI

Under Article 35(1) of the GDPR, organisations are required to carry out a data protection impact assessment (DPIA) where a type of processing is likely to result in a high risk to the rights and freedoms of individuals.

Article 35(3) of the GDPR sets out three types of processing which will always require a DPIA:

  • systematic and extensive evaluation of personal aspects relating to individuals which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the individual or similarly significantly affect the individual;
  • large scale processing of special category data; or
  • systematic monitoring of a publicly accessible area on a large scale.
Organisations should undertake a DPIA at the early stages of any project involving AI, and it should feature the following:
  1. A systematic description of the processing activity: data flows and the stages at which AI and automated decisions may affect individuals should be outlined. The ICO suggests that, due to the complexity of AI systems, organisations maintain two versions of the DPIA: the first, a thorough technical description for specialist audiences; the second, a high-level description of the processing and an explanation of how the personal data inputs relate to the outputs affecting individuals. The DPIA should also set out the roles and obligations of data controllers and processors. If the AI system is outsourced to an external provider, the organisation should assess whether the parties are joint controllers and, if so, collaborate in the DPIA process as appropriate.
  2. Assessment of necessity and proportionality: the use of AI for processing personal data needs to be based on a legitimate purpose. Organisations need to demonstrate that the processing of personal data by an AI system is a proportionate activity. Here, organisations should undertake a balancing exercise between the interests of the organisation and the rights and freedoms of individuals. In particular, organisations need to consider any detriment to individuals that could follow from bias or inaccuracy in the algorithms and data sets being used.
  3. Identifying risks to rights and freedoms: organisations should consider other relevant legal frameworks beyond data protection. For example, AI may result in discrimination based upon historical patterns in data, which could fall foul of equality legislation.
  4. Measures to address the risks: data protection officers and other information governance professionals should be involved in AI projects as early as possible so that risks can be identified and addressed early in the AI system life-cycle. The DPIA should include any safeguards put in place to mitigate the identified risks and should document the residual level of risk posed by the processing. If the residual risks remain high, the organisation must consult the ICO before processing begins (prior consultation under Article 36 of the GDPR).
  5. DPIA – a ‘living’ document: while a DPIA must be carried out before the processing of personal data begins, it should be treated as a ‘living’ document, subject to regular review and to re-assessment if the nature, scope, context or purpose of the processing changes.

Comment

As AI becomes increasingly prevalent, regulators are continuing to perform a balancing act, ensuring compliance with data protection laws without stifling innovation. The interaction between AI and the GDPR engages a number of complex legal issues. It comes as no surprise that the ICO listed AI as one of its three strategic priorities.

The ICO blog provides useful guidance for organisations when conducting a DPIA for projects involving AI. The ICO plans to publish its final consultation paper on the AI Auditing Framework no later than January 2020. Keep an eye on this blog for news on the final consultation paper!

Relentless Privacy and Compliance Services Ltd provides organisations with local and global data privacy consultancy, ensuring your organisation remains compliant wherever your operational data processing takes place.

The Relentless GDPR 24/7 Platform automates the DPIA as part of your GDPR strategy.
