Data Ethics: Legal and Regulatory Aspects of Data Ethics

Miss Mabelle Bayard was the daughter of Thomas F. Bayard, a United States Senator from Delaware and former candidate for President. When Mr. Samuel Warren married Miss Mabelle Bayard in Washington on January 25, 1883, he was probably not aware of the long-lasting attention the press would devote to him and his wife’s family in the years to come. Indeed, such a prominent family supplied plenty of stories to the scandalmongering newspapers of the time, greedy for idle gossip. The constant attention the press paid to the couple bothered Mr. Warren to the point that he began looking for a legal safeguard for his family’s private life.

Mr. Warren was a lawyer, and for this reason he was probably aware that the right to property was regarded as the main source of safeguards against intrusion into a person’s private life, an understanding supported by well-established jurisprudence. This kind of protection was condensed in the proverbial expression “A man’s house is his castle”. Attributed to Sir William Blackstone (1723–1780), one of the fathers of the English Enlightenment, for more than a century it stood as the bulwark of people’s enjoyment of a private sphere. This meant that a private sphere was recognized only as long as the threshold of private property was not physically overstepped.

While this legal principle held for a long time, it started to lose ground when new technologies made intrusion into a person’s private sphere possible. The second half of the 19th century was indeed a period of great technological advancement and innovation. Inventions such as the microphone and the camera made it possible to pry into people’s intimate lives with no need to trespass on anyone’s physical property. This led Mr. Warren and his colleague Louis Brandeis to look for new safeguards to protect people’s “right to be let alone”. In 1890 they published an article in the Harvard Law Review that laid the foundation of the long tradition of privacy and data protection doctrine. Their article, carried by elegant rhetoric, set out “to consider whether the existing law affords a principle which can properly be invoked to protect the privacy of the individual” [1]. They understood that when major technological advancements occur, well-established principles, even if affirmed by a robust set of judicial decisions, may soon become obsolete.

Since the core objective of Virt-EU is to identify ethical and social values and make them operational through the development of the Privacy, Ethical and Social Impact Assessment (PESIA) model, we found it pivotal to carry out a legal analysis of the European regulatory framework on data protection. This effort was undertaken so that the PESIA could be developed consistently with existing regulation and principles. For this very reason, the Polytechnic University of Turin and Open Rights Group were vested with a task similar to the one accomplished by Warren and Brandeis in their seminal article: i) looking at the limits of the existing regulatory framework and ii) ascertaining which ethical and social issues in data processing are taken into account by DPAs, the Article 29 Working Party, the European Court of Human Rights, the European Court of Justice and privacy practices.

The evolution of data protection regulation

Warren and Brandeis were fully aware that when major technological changes occur, the recognition of new rights is needed, since technological innovation runs faster than legislators’ capacity to hold technologies accountable for the effects they have on society. With this in mind, we performed the first task mentioned above by exploring the evolution of the European regulatory framework in relation to the spread of new technologies.

The challenges raised from the late 1960s onwards by governments and big corporations creating large databases, in which personal information could be aggregated, retrieved and connected on a wide scale, prompted the first legal responses: the legal dimension acted as a mere instrument to express and harmonise these concerns, shaping data processing to tackle the risks of new forms of discrimination and societal control.

This led the notion of data protection to be cast around the idea of control over information, and the first data protection regulations thus gave individuals a sort of counter-control over the data collected about them. Legislators pursued this goal by increasing the level of transparency of data processing and by safeguarding the right of access to information. Yet with the advent of personal computers in the mid-80s, new forms of marketing based on customer profiling and extensive data collection prompted legislators to focus on the economic value of personal information. As a result, citizens claimed greater negotiating power vis-à-vis businesses exploiting their personal data. Directive 95/46/EC was the answer to this demand, introducing the “notice and consent” model.

Today we are experiencing an advance in the analysis of large amounts of data collected from multiple sources, facilitated by the development of cloud computing and big data analytics. These technologies make it possible to monitor social behaviour, infer patterns of behaviour and apply such patterns to individuals in order to predict their actions and take decisions that affect them, which might lead to discriminatory practices.

In this context, it is moreover essential to recognize data protection as a fundamental right (Art. 8 of the EU Charter of Fundamental Rights), creating a barrier against the commodification and reification of personal information. This is probably the legal framework’s main contribution, in general terms, to harmonising societal values and the use of data in the development of new devices.

The challenges brought by the IoT and why the law alone does not suffice to address them

Returning to the story of Mr. Warren: what particularly bothered him were the meticulous descriptions the newspapers gave of the parties his wife hosted, the colour of her dress, the people who attended. This was the kind of information he would rather have kept private. Still, Mr. Warren knew the purposes for which this information was exploited, and so he had no reason to fear an intrusion beyond those situated events.

In this regard, the IoT is a game changer. Since individuals voluntarily choose to introduce smart objects into their homes, one might reasonably argue that they are acquainted with the hidden practices these objects perform: how the data gathered are analysed, the purposes of the processing, the kind of technology involved, the categories of data processed and the possible imbalance of power between data subject and data controller. These are the issues that should matter to citizens nowadays.

Yet the ubiquitous and invisible nature of IoT devices makes it difficult to understand to what extent the hidden practices they perform affect people’s daily lives. In her book Automating Inequality, Virginia Eubanks observes that

“many of the devices that collect our information and monitor our actions are inscrutable, invisible pieces of code. They are embedded in social media interactions, flow through applications for government services, envelop every product we try or buy. They are so deeply woven into the fabric of social life that, most of the time, we don’t even notice we are being watched and analysed.” [2]

This is because the IoT is everywhere: we wear it (Body Area Network, e.g. wearable devices), we welcome it into our homes (Local Area Network, e.g. smart home appliances), and we inhabit cities where every step we take can potentially be registered and analysed at a fast and growing pace (Very Wide Area Network, e.g. the smart city). So, what if the seamless flow of information across these different dimensions were merged and analysed with no restrictions of any kind? Such a possibility is rendered even more troublesome by machine learning algorithms, which reduce human intervention in the processing of personal data and raise new societal concerns about decision-making processes. For example, an insurance company may deny coverage because of an error in the processing of a client’s data or, even worse, set its prices based on the social group we belong to.

In this regard, the law alone does not suffice to grant citizens adequate safeguards: while it protects common values such as privacy, personal identity and dignity, it falls short of addressing the ethical and societal issues ignited by the advent of the so-called “algorithmic society”. It works well in protecting the individual, but it lacks the capacity to tackle decisions based on opaque algorithms that end up being biased against certain social groups.

Thus, how can we go beyond these limits? Should we look, as Mr. Warren did, to courts’ rulings, waiting for a solid body of jurisprudence from which new rights can be deduced? Or should we rely on strong regulation that prevents the aforementioned issues from occurring?

European Union’s approach towards data protection

In this part of the project, Politecnico di Torino and Open Rights Group analysed the regulatory mix present in Europe (i.e. data protection legislation, judicial decisions, guidelines, charters of values, best practices and standards) to understand whether it takes into account the social and moral values threatened by rapid technological development. The picture that emerged shows that while the cornerstone balancing of interests achieved in Directive 95/46/EC tried to satisfy data subjects’ demand to be in control of their data (recognising a prominent role for individual consent and imposing information duties on data controllers), over the years the “notice and consent” mechanism has shown its limits in providing an effective safeguard for moral and social values.

This insight was further confirmed by an empirical analysis based on the 2015 Eurobarometer on Data Protection and the 2016 Flash Eurobarometer on e-Privacy. According to these surveys, the notice and consent mechanism has fallen short of giving people sufficient control over their data: only a minority of data subjects fully read privacy policies, and people remain largely unaware of the kinds of data collected and analysed about them.

This is an important element to be considered in the following stages of our investigation (deliverable 4.1). We will conduct an in-depth analysis to assess whether the GDPR offers new solutions to bolster data subjects’ interests; in particular, our aim is to ascertain whether the upcoming Regulation affords protection with regard to the collective dimension of data processing and the risks it brings. In this vein, we will examine how the data protection framework will be forged by the complex interplay between the requirements of the GDPR and the future decisions issued by the European courts (the European Court of Human Rights and the European Court of Justice), national data protection authorities and the Article 29 Working Party. A mixed approach that integrates the GDPR with a body of judicial decisions is thus the solution we consider most apt to grant citizens legal safeguards against the unrestricted use of data.

First conclusions and further investigation

Based on these findings, it seems that the regulatory mix on data protection developed in Europe since 1995 is only theoretically able to take into account the social and legal implications of data uses. Values are often implicitly considered by the different components of the regulatory mix, but there is a lack of tools that can make those values explicit and operationalise them.

This does not mean that the regulatory mix ignores the importance of ethical and social values, but it does imply that it struggles to put them into practice in a clear and direct manner. In this regard, the PESIA will represent a concrete tool in the hands of developers and designers for assessing the risks at stake in data-intensive practices.


[1] Warren, Samuel D., and Louis D. Brandeis. “The Right to Privacy.” Harvard Law Review 4, no. 5 (1890): 193–220. doi:10.2307/1321160.
[2] Eubanks, Virginia. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. St. Martin’s Press, 2018.