Activities of Jan Philipp ALBRECHT related to 2016/2225(INI)
Plenary speeches (1)
Fundamental rights implications of big data (short presentation) DE
Shadow reports (1)
REPORT on fundamental rights implications of big data: privacy, data protection, non-discrimination, security and law-enforcement
Amendments (19)
Amendment 11 #
Motion for a resolution
Recital A a (new)
A a. whereas certain big data use cases involve the training of artificial intelligence applications such as neural networks and statistical models in order to predict certain events and behaviours; whereas the training data is often of questionable quality and not neutral;
Amendment 15 #
Motion for a resolution
Recital B
B. whereas the progress of communication technologies and the ubiquitous use of electronic devices, monitoring gadgets, social media, web interactions and networks, including devices which communicate information without human interference, have led to the development of massive, ever-growing data sets which, through advanced processing techniques and analytics, may provide unprecedented insight into human behaviour and our societies;
Amendment 32 #
Motion for a resolution
Recital D a (new)
D a. whereas media reports have revealed the scale of personalised and targeted communication in political campaigns, not least in the context of elections in the United States for several years now, and more recently in the context of the UK "Brexit" referendum;
Amendment 58 #
Motion for a resolution
Recital G
G. whereas the proliferation of data processing and analytics, the multitude of actors involved in collecting, retaining, processing and sharing data and the combination of large data sets containing personal data from a variety of sources, retained for unlimited amounts of time, have all created great uncertainty for both citizens and businesses over the specific requirements for compliance with general data protection legislation;
Amendment 69 #
Motion for a resolution
Paragraph 1
1. Emphasises that information revealed by big data analysis is only as reliable as the underlying data permits, and that strong scientific and ethical standards are therefore needed for judging the results of such analysis and its predictive analysis;
Amendment 70 #
Motion for a resolution
Paragraph 1 a (new)
1 a. Emphasises that even if such scientific and ethical standards are met, predictive analysis based on big data can only offer a statistical probability and can by no means predict individual behaviour;
Amendment 71 #
Motion for a resolution
Paragraph 1 b (new)
1 b. Points out that sensitive information about persons can be inferred from non-sensitive data, which blurs the line between sensitive and non-sensitive data;
Amendment 82 #
Motion for a resolution
Paragraph 2 a (new)
2 a. Stresses that big data may not only result in infringements of the fundamental rights of individuals, but also in differential treatment of, and indirect discrimination against, groups of people with similar characteristics;
Amendment 94 #
Motion for a resolution
Paragraph 3
3. Points out that Union law for the protection of privacy and personal data, as well as the rights to equality and non-discrimination, are applicable to data processing even when that processing is preceded by pseudonymisation techniques, and by anonymisation techniques, insofar as there are risks of re-identification, or, in any case, when use of non-personal data might impact on individuals' private lives or other rights and freedoms;
Amendment 99 #
Motion for a resolution
Paragraph 3 a (new)
3 a. Recalls that under the GDPR, the further processing of personal data for statistical purposes may only result in aggregate data which cannot be re-applied to individuals;
Amendment 101 #
Motion for a resolution
Paragraph 4
4. Takes the view that transparency, fairness, accountability and control over personal data are core values from which specific rights and obligations are derived, and which should guide the action of corporations, public authorities and other actors that use data to frame their decision-making procedures; emphasises the need for much greater transparency and for algorithmic accountability with regard to data processing and analytics by businesses; recalls that the GDPR already foresees a right to be informed about the logic involved in data processing;
Amendment 123 #
Motion for a resolution
Paragraph 5 a (new)
5 a. Underlines that the essence of big data should be to achieve comparable correlations with as little personal data as possible; stresses in that regard that science, business and public communities should focus on research and innovation in the area of anonymisation;
Amendment 124 #
Motion for a resolution
Paragraph 5 b (new)
5 b. Recognises that while the application of pseudonymisation, anonymisation or encryption to personal data can reduce the risks to the data subjects concerned when personal data are used in big data applications or cloud computing, any processing of sensitive data should take into account the risks of future abuses of these measures; recalls that anonymisation is an irreversible process by which personal data can no longer be used to identify or single out a natural person;
Amendment 150 #
Motion for a resolution
Paragraph 8
8. Acknowledges that data loss and theft, infection by malware, unauthorised access to data and unlawful surveillance are some of the most pressing risks associated with contemporary data processing activities, such as big data techniques, especially in the context of the "Internet of things"; believes that tackling such threats requires genuine and concerted cooperation between the private sector, governments, law enforcement authorities and independent supervisory authorities, as well as additional legal measures such as software liability;
Amendment 166 #
Motion for a resolution
Paragraph 9
9. Calls on the Commission, the Member States and the data protection authorities to identify and minimise algorithmic discrimination and bias, including price discrimination, and to develop a strong and common ethics framework for the processing of personal data and automated decision-making;
Amendment 173 #
Motion for a resolution
Paragraph 9 a (new)
9 a. Calls on the Commission, the Member States and the data protection authorities to specifically evaluate the need for not only algorithmic transparency, but also for transparency about possible biases in the training data used to make inferences based on big data;
Amendment 181 #
Motion for a resolution
Paragraph 10
10. Calls on all law enforcement actors that use data processing and analytics to ensure appropriate human intervention throughout the various stages of the processing and analysis of data, especially when decisions may carry high risks for individuals;
Amendment 193 #
Motion for a resolution
Paragraph 11 a (new)
11 a. Points out that certain models of predictive policing are more privacy-friendly than others, e.g. where probabilistic predictions are made about places or events and not about individual persons; is concerned that almost all predictive policing tools are proprietary software, which limits transparency and accountability;
Amendment 209 #
Motion for a resolution
Paragraph 13
13. Warns that, owing to the intrusiveness of decisions and measures taken by law enforcement authorities in citizens' lives and rights, maximum caution is necessary to avoid unlawful discrimination and the targeting of certain population groups, especially marginalised groups and ethnic and racial minorities, but also individuals who by coincidence have certain characteristics;