The Impact of the EU’s New Data Protection Regulation on AI

The EU’s new data privacy rules, the General Data Protection Regulation (GDPR), will have a negative impact on the development and use of artificial intelligence (AI) in Europe, putting EU firms at a competitive disadvantage compared with their competitors in North America and Asia. The GDPR’s AI-limiting provisions do little to protect consumers, and may, in some cases, even harm them. The EU should reform the GDPR so that these rules do not tie down its digital economy in the coming years.


Comments

In reply to a comment by Anonymous (not verified)

Submitted by Daniel Castro on Sat, 22/09/2018 - 16:38

Thank you for this thoughtful comment. Respectfully, however, I disagree with your conclusion that the GDPR does not impose a significant obligation on organizations to be able to conduct manual (i.e., human) review of certain solely automated decisions.

As described in the WP29 guidance, "The controller cannot avoid the Article 22 provisions by fabricating human involvement. For example, if someone routinely applies automatically generated profiles to individuals without any actual influence on the result, this would still be a decision based solely on automated processing. To qualify as human involvement, the controller must ensure that any oversight of the decision is meaningful, rather than just a token gesture. It should be carried out by someone who has the authority and competence to change the decision. As part of the analysis, they should consider all the relevant data."

If your point is merely that a particular review must occur only after a complaint, then I agree with you (and indeed, while your quote is from a section heading in the summary of the report, the actual text of the report describes this point in more detail). However, the burden of human review is substantial and requires upfront consideration on the part of businesses. Moreover, the potential for a significant number of manual-review requests necessarily limits businesses from automating certain processes, and this limit is not tied to the accuracy of those decisions. A data subject has no incentive to refrain from seeking human review of an automated decision that is appropriate but adverse (e.g., denying a loan to someone who is already overextended on credit). Without some countervailing measure, there is no backstop to prevent excessive requests for human review.

In evaluating any regulation, it is important to look not only at its intent but also at how it can lead to unintended consequences or even be abused. In this regard, the GDPR falls short.

Submitted by Anonymous (not verified) on Sat, 22/09/2018 - 13:52

User account was deleted