Extract

...

Introduction

In recent months, researchers,[1] government bodies,[2] and the media[3] have claimed that a ‘right to explanation’ of decisions made by automated and artificially intelligent algorithmic systems is legally mandated by the forthcoming European Union General Data Protection Regulation 2016/679 (GDPR).[4] The right to explanation is viewed as a promising mechanism in the broader pursuit, by government and industry, of accountability and transparency in algorithms, artificial intelligence, robotics, and other automated systems.[5] Automated systems can have many unintended and unexpected effects.[6] Public assessment of the extent and source of these problems is often difficult,[7] owing to the use of complex and opaque algorithmic mechanisms.[8] The alleged right to explanation would require data controllers to explain how such mechanisms reach decisions. Significant hype has mounted over the empowering effects of such a legally enforceable right for data subjects, and over its potential to disrupt data-intensive industries, which would be forced to explain how complex and perhaps inscrutable automated methods work in practice.
