THE PRIMITIVES OF LEGAL PROTECTION AGAINST DATA TOTALITARIANISMS


THE PRIMITIVES OF LEGAL PROTECTION AGAINST DATA TOTALITARIANISMS
Mireille Hildebrandt
Research Professor at Vrije Universiteit Brussel (Law)
Part-time Full Professor at Radboud University Nijmegen (CS)
Turing Institute, GDPR & Beyond, 23 March 2018


[Image slide: source SAP, https://blogs.sap.com/2014/08/20/social-intelligence-using-sap-hana/]

Competitive advantage for companies & Europe
- Better data protection, better ML
- Micro-targeting may not work at all (sounds like science, but may be garbage)
- It "works" if decisions are based on it
- The "free services" business model has reconfigured the markets
- Political opinion as consumer preference disrupts the democratic playing field

What's next?
1. Political economy of data-driven platforms
2. Data totalitarianisms
3. Machine learning
4. Legal primitives and the GDPR

Political economy of data-driven platforms
Platforms:
- technical architectures built on top of the internet and the www
- corporate assemblages that incorporate horizontal and vertical competitors
- sharing economy
- AI platforms
- cross-contextual data linking, cross-contextual targeting
- hyperconnectivity and computational backend [digital unconscious]

Political economy of data-driven platforms
Monopolies and monopsonies:
1. Consumers pay an excessive price for "free" services: market failure (external costs)
2. Sharing economy: lowers the price of services for consumers (Uber, AirBnB), or does it? It lowers earnings for small providers (Uber); avoiding external costs raises housing prices and disturbs the fabric of neighbourhoods
3. Concentration of buyer power for labour again results in failure of the labour market (Amazon)

Political economy of data-driven platforms
The new machinic behaviourism:
- machinic neoplatonism (McQuillan)
- ML assumes that a mathematical target function underlies human behaviours
- Zuckerberg hoped to figure out the "fundamental mathematical law underlying human social relationships that governs the balance of who and what we all care about"

Data Totalitarianisms
1. Pseudo-omniscient data-driven architectures
2. Seamless data ecosystems [digital unconscious]

Data Totalitarianisms
Platforms (Greek agora, Roman forum) are spaces for:
1. buying and selling (market place, private sphere)
2. political discourse (parliament, public sphere)

Data Totalitarianisms
The difference between tyranny and totalitarianism:
- Tyranny privatises the public sphere
- Totalitarianism destroys the private sphere

Machine Learning
Mitchell: a computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E
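
To make Mitchell's E/T/P definition concrete, here is a minimal sketch (my own illustration, assuming Python with scikit-learn; the dataset, model and metric are arbitrary stand-ins): performance P at task T, measured on held-out data, improves as experience E grows.

```python
# Illustrative sketch of Mitchell's definition: performance P at task T improves with experience E.
# Assumes scikit-learn; dataset, model and metric are assumed stand-ins, not from the slides.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

for n in (20, 100, 500, 1000):                            # growing experience E
    model = LogisticRegression(max_iter=1000).fit(X_train[:n], y_train[:n])
    p = accuracy_score(y_test, model.predict(X_test))     # performance measure P on task T
    print(f"E = {n:4d} training examples -> P (accuracy) = {p:.3f}")
```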

Machine Learning
- The politics are in who gets to determine E, T and P
- The ethics are in how they are determined
- Law concerns the contestability

Machine Learning
ML is often parasitic on human domain expertise (the notion of "ground truth") [the performance metric will depend on it]

Machine Learning
Ground truth for managers: [image slide]

Machine Learning
- training, validation, testing: out of sample?
- ground truth, labelling: contestable?
- "outperforming": based on what performance metric, which ground truth?
- optimization, predictive accuracy: mathematical or real?
- COMPRESSION: a mathematical vector-function that defines the dataset in terms of the task as defined
- unsupervised learning: it learns to compress/define
- any big data can be defined/compressed in many (e.g. spurious) ways
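
A small sketch of the point that "outperforming" depends on the chosen performance metric and ground truth (an assumed toy example, not from the slides): on imbalanced labels, a model that always predicts the majority class scores high on accuracy and zero on recall.

```python
# Illustrative, assumed example: the same classifier looks strong or useless depending on
# which performance metric is reported against the labelled "ground truth".
import numpy as np
from sklearn.metrics import accuracy_score, recall_score

rng = np.random.default_rng(0)
y_true = (rng.random(1000) < 0.05).astype(int)   # imbalanced ground truth: ~5% positives
y_majority = np.zeros_like(y_true)                # model that always predicts the majority class

print("accuracy:", accuracy_score(y_true, y_majority))   # ~0.95, looks impressive
print("recall:  ", recall_score(y_true, y_majority))     # 0.0, misses every positive case
```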

Machine Learning
Foundational indeterminacy is inherent in ML:
- Mathematical: Gödel's incompleteness theorem (algorithms cannot be trained on future data)
- Machine learning: Wolpert's No Free Lunch theorem (algorithms cannot be trained on future data)
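
A toy illustration of this indeterminacy, under assumptions of my own (eight noisy observations, two polynomial fits): both models summarise the same data, yet they give incompatible answers for a point the data never covered, so no single "computation" of the data is forced on us.

```python
# Assumed example: two models of the same observations agree on the past but
# disagree about unseen ("future") inputs.
import numpy as np

rng = np.random.default_rng(1)
x_train = np.linspace(0, 1, 8)
y_train = np.sin(2 * np.pi * x_train) + 0.1 * rng.standard_normal(8)

smooth = np.polyfit(x_train, y_train, deg=3)     # one compression of the data
flexible = np.polyfit(x_train, y_train, deg=6)   # another compression of the same data

x_new = 1.25  # a point outside the range of the training data
print("degree-3 model predicts:", round(float(np.polyval(smooth, x_new)), 2))
print("degree-6 model predicts:", round(float(np.polyval(flexible, x_new)), 2))
```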

Machine Learning
Indeterminacy implies we can always be computed in different ways:
- any particular calculation must be contestable
- we need to resist computational overdetermination
- it becomes important to ensure actionable transparency: contestability and foreseeability
- to serve an actionable right to object: against processing of personal data, against machine decisions

What's next?
1. Political economy of data-driven platforms
2. Data totalitarianisms
3. Machine learning
4. Legal primitives and the GDPR

What's next?
4. Legal primitives and the GDPR
- Consent
- Data minimisation and purpose limitation
- Automated decisions & profile transparency

Taking consent seriously
- Art. 6.1(a): consent of the data subject "for one or more specific purposes"
- Art. 5.1(c): personal data must be "adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed"

Taking consent seriously
Art. 7.1: Where processing is based on consent, the controller shall be able to demonstrate that the data subject has consented to processing of his or her personal data.
Art. 7.2: If the data subject's consent is given in the context of a written declaration which also concerns other matters:
- the request for consent shall be presented in a manner which is clearly distinguishable from the other matters,
- in an intelligible and easily accessible form,
- using clear and plain language.
Any part of such a declaration which constitutes an infringement of this Regulation shall not be binding.

Taking consent seriously
Art. 7.3: The data subject shall have the right to withdraw his or her consent at any time.
- The withdrawal of consent shall not affect the lawfulness of processing based on consent before its withdrawal.
- Prior to giving consent, the data subject shall be informed thereof.
- It shall be as easy to withdraw as to give consent.

Taking consent seriously
Art. 7.4: When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.

Taking consent seriously
Recital 43: (...) Consent is presumed not to be freely given if (...) the performance of a contract, including the provision of a service, is dependent on the consent despite such consent not being necessary for such performance.
NB: check the negotiations on the EP draft ePrivacy Regulation, notably Art. 8.1a ePR

European Parliament draft of the ePrivacy Regulation
Art. 8.1a: No user shall be denied access to any information society service or functionality, regardless of whether this service is remunerated or not, on grounds that he or she has not given his or her consent under Article 8(1)(b) to the processing of personal information and/or the use of processing or storage capabilities of his or her terminal equipment that is not necessary for the provision of that service or functionality.

Data Minimisation & Purpose Limitation
Processing (broadly defined) of personal data (broadly defined) always requires:
- a legal ground (consent, contract, vital interest of the data subject, statutory obligation, public task, legitimate interest of the data controller)
- an explicit, specified purpose
Processing must be necessary for the ground AND for the purpose

Data Minimisation & Purpose Limitation
Purpose is the central data protection principle:
- Whoever determines the purpose (de facto) is the controller
- The controller is responsible and liable for violations (burden of proof)
- Processing is only allowed if necessary for the specified purpose (or a compatible purpose)

Data Minimisation & Purpose Limitation
Van der Lei's First Law of Informatics:
- data shall be used only for the purpose for which they were collected
Corollary:
- if no purpose was defined, they should not be used

Data Minimisation & Purpose Limitation
Contrary to what some authors claim:
- ML is not possible without specifying a task
- Purpose limitation is part of the methodological integrity of ML
- A task could, e.g., be "experimentation"; purpose specification requires: experimentation to achieve what?
- Often: to detect patterns in user behaviour, to improve the user interface (A/B testing), to influence user behaviour (A/B testing)

Automated decisions and profile transparency
Art. 4(4) GDPR: "profiling" means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements

Automated decisions and profile transparency
Art. 15.1(h) GDPR provides a legal right to obtain information about: "the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject."
- Art. 13.2(f) and Art. 14.2(g) obligate the controller to provide this information
- Art. 12.1 requires that the information is provided "in a concise, transparent, intelligible and easily accessible form, using clear and plain language"

Automated decisions and profile transparency
Art. 22.1: The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

Automated decisions and profile transparency
Recital 71 adds: "(...) such as automatic refusal of an online credit application or e-recruiting practices without any human intervention. Such processing includes 'profiling' that consists of any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyse or predict aspects concerning the data subject's performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her."

Automated decisions and profile transparency
The Art. 29 WP considers that if someone routinely applies automatically generated profiles to individuals without any actual influence on the result, this would still be a decision based solely on automated processing [calling this "fabrication of human involvement"]

Automated decisions and profile transparency
Art. 22.2: Paragraph 1 shall not apply if the decision:
(a) is necessary for entering into, or performance of, a contract between the data subject and a data controller;
(b) is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or
(c) is based on the data subject's explicit consent.

Automated decisions and profile transparency
Art. 22.3: In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall implement suitable measures to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.

Automated decisions and profile transparency
Art. 22.4: Decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless point (a) or (g) of Article 9(2) applies and suitable measures to safeguard the data subject's rights and freedoms and legitimate interests are in place.

Automated decisions and profile transparency
Recital 71 adds: "In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision" (my emphasis).

Automated decisions and profile transparency
Recital 63 adds: "That right should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property and in particular the copyright protecting the software. However, the result of those considerations should not be a refusal to provide all information to the data subject."

Automated decisions and profile transparency
1. Don't confuse profile transparency or explanations with legal justification of the decision itself!
- the justification could be freedom to contract [which is not unlimited]
- in the case of government agencies, the legality principle requires attribution of competences
- in the case of a criminal charge, high standards apply, e.g. the presumption of innocence

Automated decisions and profile transparency
2. Don't be overly impressed by the claim that there is necessarily a trade-off between the predictive accuracy and the interpretability of the system
- this depends on an uncontroversial ground truth; otherwise accuracy cannot be established

Automated decisions and profile transparency
3. Discriminate between exploratory and confirmatory research:
- performance claims should be based on confirmatory research
- this implies research into causalities
- note: the more training data, the more spurious patterns (see the sketch below)
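
A small sketch of how exploratory searching turns up spurious patterns (an assumed example with purely random data, not from the slides): scanning many candidate features always yields an apparently strong predictor, which then evaporates on confirmatory data.

```python
# Assumed example: random "features" show seemingly strong patterns if you search long enough;
# an exploratory find like this does not survive confirmation on fresh data.
import numpy as np

rng = np.random.default_rng(42)
n, p = 200, 2000                          # 200 cases, 2000 candidate features, all pure noise
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)                # outcome unrelated to any feature

corr = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(p)])
best = int(np.argmax(np.abs(corr)))
print(f"best 'predictor' found by exploratory search: feature {best}, r = {corr[best]:.2f}")

X_new = rng.standard_normal((n, p))       # fresh data for confirmatory research
y_new = rng.standard_normal(n)
r_new = np.corrcoef(X_new[:, best], y_new)[0, 1]
print(f"same feature on confirmatory data:            r = {r_new:.2f}")   # ~0, spurious
```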

From Agnostic to Agonistic ML
Hofman, Sharma and Watts on exploratory and confirmatory research design:
Exploratory ML: researchers are free to study different tasks, fit multiple models, try various exclusion rules, and test on multiple performance metrics. When reporting their findings, however, they should:
- transparently declare their full sequence of design choices, to avoid creating a false impression of having confirmed a hypothesis rather than simply having generated one,
- report performance in terms of multiple metrics, to avoid creating a false appearance of accuracy.

From Agnostic to Agonistic ML
Hofman, Sharma and Watts on exploratory and confirmatory research design:
Confirmatory ML: researchers should be required to preregister their research designs, including data preprocessing choices, model specifications, evaluation metrics, and out-of-sample predictions, in a public forum such as the Open Science Framework (https://osf.io).

Automated decisions and profile transparency
4. "Meaningful information about the logic of processing":
- ex ante (information on research design, on relevant output, e.g. classifiers)? research design info, think of DPAs
- ex post (determination of the parameters that informed an individual decision)? LIME, counterfactuals, built-in transparency

Automated decisions and profile transparency
Wachter, Mittelstadt and Russell propose "counterfactual explanations":
- counterfactuals "describe a dependency on the external facts that lead to that decision" without the need to convey the internal state or logic of an algorithm
- what would need to change in order to receive a desired result in the future, based on the current decision-making model
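
A minimal sketch of what a counterfactual explanation computes, using an assumed toy linear scoring rule (my own illustration, not an implementation from Wachter, Mittelstadt and Russell): the smallest change to a single feature that would have flipped the decision, stated to the data subject without exposing the model's internal weights.

```python
# Toy counterfactual explanation for a linear decision rule (assumed example only):
# score(x) = w . x + b, decision is "approve" when score >= 0.
import numpy as np

w = np.array([0.03, -0.4, 1.5])           # assumed weights for [income_k, open_loans, years_employed]
b = -2.0
x = np.array([40.0, 3.0, 0.5])            # the applicant who was refused

def score(v):
    return float(w @ v) + b

assert score(x) < 0                        # refused under the current decision-making model

# For each single feature, the change needed to reach the decision boundary is -score(x) / w_i.
names = ["income (k)", "open loans", "years employed"]
for i, name in enumerate(names):
    if w[i] != 0:
        delta = -score(x) / w[i]
        print(f"counterfactual: if {name} changed by {delta:+.2f}, the application would be approved")
```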

Automated decisions and profile transparency
The need to change concerns all sides (not just the data subject):
- it includes a redistribution of actionability and responsibility amongst developers, profilers and those profiled

From Agnostic to Agonistic ML
The need to change concerns all sides (not just the data subject):
- it includes a redistribution of actionability and responsibility amongst developers, profilers and those profiled

From Agnostic to Agonistic ML
Mouffe's agonistic democratic theory:
1. Representative (treating votes like preferences, or "equal concern and respect")
2. Deliberative (public reason, rational consensus)
3. Participatory (those who suffer the consequences)
Rip's agonistic CTA:
- difference between science in the lab and science in society
- civilizing the scientists (Stengers)
- robust design: only if relevant voices are heard

Check my recent papers on SSRN, notably:
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3081776
And my most recent book:
https://www.elgaronline.com/view/9781849808767.xml
Further papers at:
https://works.bepress.com/mireille_hildebrandt/
