“Managing Risk in the Digital Society” was the central theme of the 13th Internet, Law and Politics conference, held on 29 and 30 June in
Barcelona. Virt-EU researchers took part in an ad-hoc panel to present some preliminary research outputs.
The panel focused on the development of the Privacy, Ethical and Social Impact Assessment (PESIA) framework. PESIA is indeed Virt-EU’s core
endeavour, aiming to introduce an ex-ante evaluation of ethical aspects into the practices of the developer communities designing future IoT devices and applications.
Article 35 of the new GDPR obliges the data controller “to carry out an assessment of the impact of the envisaged processing operations” through the so-called Data Protection Impact Assessment. While this approach is meant to gauge the risks linked to the processing of personal data in terms of substantive threats, it seems to be lacking from a procedural viewpoint. We therefore asked Alessandro Mantelero, principal investigator at Politecnico di Torino, what the novelty of this approach is and how it will interact with the Data Protection Impact Assessment provision featured in the new General Data Protection Regulation.
What is the PESIA model?
The Privacy, Ethical and Social Impact Assessment represents a new way of assessing the risks related to the use of personal information. It builds on previous experiences, such as the Privacy Impact Assessment.
The PESIA wants to address a wider range of issues, taking into account the ethical dimension and the societal consequences of the use of data
Privacy Impact Assessment procedures are already adopted in different countries, but they are mainly focused on privacy issues. In this sense, the PESIA wants to address a wider range of issues, taking into account the ethical dimension and the societal consequences of the use of data, mainly when data, in some contexts (for example, when we use big data analytics), are used to take decisions that may affect individuals and groups, with potential impacts in terms of discrimination that go beyond the traditional issues we address when we talk about privacy and data protection.
What are the main differences between the PESIA model and the data protection impact assessment of the General Data Protection Regulation (GDPR) adopted by the European Union?
In the GDPR, the assessment is mainly focused on the risks in terms of data security and the unlawful use of data.
The PESIA that we suggest adopts a different approach, because it is focused on the publicity of the entire assessment, giving users the chance to know the risks related to the potential uses of their personal information
Moreover, the impact assessment described in the GDPR focuses on the individual dimension of data protection. It is not public, because all the documents related to the impact assessment remain inside the entity that carried it out (the private or public body that plays the role of data controller). Finally, in the GDPR this assessment does not adopt a mandatory, participatory model, so the different stakeholders are not necessarily involved in the assessment process.
The PESIA that we suggest adopts a different approach, because it is focused on the publicity of the entire assessment, giving users the chance to know the risks related to the potential uses of their personal information. Moreover, we want to create an impact assessment based on the engagement of the different potential stakeholders. In this sense, the PESIA adopts a participatory process in order to identify the different issues that may be relevant in terms of the societal consequences of the use of data. For this reason, the model we want to define differs from the PIA and also from the DPIA as described in the GDPR.
What are the main challenges in developing the PESIA model?
Although we are at the initial stage of the project, we can say that the major challenge of this model of impact assessment is that, in taking into account the societal consequences of the use of data and ethical values, we deal with variables that differ from one context to another.
When we talk about values, when we talk about societal values, these elements necessarily change from one context to another
So when we talk about data security, or about data protection in terms of the legal protection of personal information, we are able to define general standards that are the same in all countries and all social contexts. But when we talk about values, when we talk about societal values, these elements necessarily change from one context to another. In this sense, we have two goals: the first is to define a general framework that represents a sort of baseline for the impact assessment; the second is to provide this framework with the flexibility to take into account the specific issues, forms and aspects of the ethical and societal values that different communities may express. We imagine adopting solutions such as, for instance, an ethics committee to give voice to the people who may be affected by the use of personal information.