ONGOING

PhD projects

Lorenzo Dalla Corte


SPOW – Safeguarding data Protection in an Open data World

We are in the wake of a revolution in urbanism – a shift from data-informed to data-driven, networked urbanism. An ever-increasing deluge of data is being collected, analyzed, and used to fuel what has been termed the “smart city”: an extended network of sensors, coupled with big data analytics, gathers and processes large amounts of data, making it possible to manage and control the urban ecosystem. The instrumentation and datafication of the built environment warrant a cautious approach and call for clear-cut values in the design of the infrastructure on which smart cities will be based. On the one hand, the data gathered by and through the smart city environment can revolutionize urbanism and enable a plethora of positive consequences. On the other, the array of sensors and the extensive data processing that define smart city technologies raise issues that need to be tackled from the outset of their development. Privacy and data protection are naturally threatened by the deluge of data gathered in the smart city environment. The evolution of large-scale smart environments has the potential to shift the normality of urban dwelling from a paradigm in which anonymity is the norm and identification the exception to one in which inhabitants are identified by default and anonymous by exception. The SPOW project, carried out together with TU Delft’s Open Data Knowledge Centre, aims at a balanced co-design between open data, the smart city’s development, and its inhabitants’ rights to privacy and data protection.

Funding

Dutch STW Maps4Society program, project no. 13718

Duration

Sept. 2015 – Sept. 2019

People

Eleni Kosta, promotor

Bastiaan van Loenen (TU Delft), supervisor, PI

Lorenzo Dalla Corte, PhD candidate

Sascha van Schendel


Transparency of Risk Profiling by national Law Enforcement Agencies under criminal law & data protection legislation: examining through the lens of explainability

The increased use of Big Data analytics to extract information and patterns from large datasets, and to construct predictions, adds to the importance of data and its authoritative role in decision making. Especially in sectors such as law enforcement, Big Data analytics can change how processes work and how decisions are made. In the law enforcement sector, decisions have a very serious impact on the human rights of suspects or other citizens in the case at hand. In the course of the general policing task, the fundamental rights of individuals or groups can also be affected by the use of Big Data analytics, such as risk profiling and predictive policing. A specific issue is the opacity of these processes towards affected individuals and the general public, which creates a lack of awareness as well as problems with the exercise of procedural human rights, such as the right to an effective remedy.
The research targets the specific practice of risk profiling of citizens in the Big Data era by national law enforcement actors and analyzes the relevant transparency safeguards and requirements under the frameworks of criminal law and data protection legislation, at both the EU and the Dutch level. One of the important legal instruments here is the European Police Directive and its implementation in national law. Provisions on transparency are examined through the lens of the explanations provided to data subjects and other persons affected by these risk profiles.

Funding

The Tilburg Graduate Law School

Duration

January 2017 – December 2020

People

Supervisor: Prof. Eleni Kosta
Co-supervisor: Prof. Bert-Jaap Koops
Daily supervisor: dr. Colette Cuijpers

Magda Brewczynska


The puzzle of personal data collected by private companies and processed by law enforcement authorities – filling the regulatory gap

The European data protection framework consists of two separate regimes with different thresholds for the protection of personal data. The first, governed by the General Data Protection Regulation (GDPR), applies to data processing operations carried out in both the private and the public sector, with the exception of activities that serve law enforcement purposes. Processing for those purposes, whenever performed by competent authorities, falls under the second regime, established by the Directive on the processing of personal data by Police and Criminal Justice Authorities (Police Directive).

This seemingly straightforward dualistic system is, however, challenged by increasingly common practices where private entities become engaged in providing information to law enforcement authorities, for instance, by entering into structured collaborative agreements and establishing Public-Private Partnerships (PPPs).

Magda’s PhD project aims to provide an exhaustive analysis of the legal uncertainties that arise when personal data are shared within PPPs, and to develop the safeguards that need to be put in place to protect individuals’ rights to privacy and data protection.

Funding

Tilburg Graduate Law School (TGLS)

First stream of funding (eerste geldstroom, allocated by the university), with a PhD contract as a PhD candidate.

Duration

September 2018 – September 2022

Maša Galič


Conceptualising privacy in public space – How should the law regulate pervasive and suspicionless surveillance in the public spaces of contemporary smart cities and living labs?

This research concerns privacy and big data-based security in public spaces. It is connected to the ongoing Living Lab “Stratumseind 2.0” project on the prominent Stratumseind nightlife street in Eindhoven. The Stratumseind project is a public-private partnership established principally to increase security and thereby renovate the public area, bars, and restaurants. Innovative solutions based on lighting, social media, and gaming technologies are being deployed and tested in order to meet these goals. Through these solutions a great amount of data becomes available for mining, leading to serious privacy concerns and, when the data are used for decision making, to risks of discrimination, unfair treatment, exclusion, stigmatisation, de-individualisation, loss of autonomy, and confrontation with unwanted information. These issues are further complicated by the fact that all of this takes place in public space, where the need for privacy protection is likely not adequately covered by existing legal frameworks. This research will delve into these and similar issues, identifying which actions and practices actually threaten or harm human rights and freedoms (and people’s lives in general), and will consequently develop a more adequate normative framework in this context.

Funding

50% TLS/UVT, 50% Dutch Institute for Technology, Safety and Security (DITSS)

Irene Kamara


Standardising the protection of personal data in the Internet of Things era: a European perspective in an interconnected world

Irene’s PhD project explores the interaction of soft law instruments and human rights by examining the case of technical standards in the regulation of the right to protection of personal data in the European Union. The research aims to introduce a framework of principles and safeguards for data protection technical standards. To that end, the project studies in depth the evolution of standardisation in the field of data protection and analyses variables such as the policy and regulatory appraisal of technical standards as an instrument supporting the aims of EU secondary legislation, the legitimacy of standards-development processes, and the human rights nature of the right to protection of personal data.

Funding

Funding granted by the Tilburg Graduate Law School (data science track)

Joint doctorate supported by Tilburg University (Tilburg Institute for Law, Technology, and Society) and the Vrije Universiteit Brussel (Research Group on Law, Science, Technology, and Society)

Duration

October 2016 – February 2021

People

Supervisors: prof. dr. Paul De Hert (Vrije Universiteit Brussel), prof. dr. Eleni Kosta (Tilburg University), and prof. dr. C. Stuurman (Tilburg University)

Damian Clifford


The legal limits to the monetisation of online emotions

Damian is a doctoral researcher at the KU Leuven Centre for IT & IP Law under the supervision of Prof. Dr. Peggy Valcke (KU Leuven, Centre for IT & IP Law) and Prof. Dr. Eleni Kosta (Tilburg University, Tilburg Institute for Law and Technology). Damian’s research is funded by the Flemish Research Council (FWO), and his FWO Aspirant fellowship runs from October 2015 to October 2019. In short, Damian’s thesis, entitled ‘The legal limits to the monetisation of online emotions’, explores the potential for emerging technologies capable of detecting emotion in real time to undermine the decision-making capacity of citizen-consumers.

Our online activity is constantly monitored by commercial entities. Given the rise of smart technologies, our time spent online is set to increase, allowing for almost 24/7 tracking. Based on this information, marketers create detailed consumer profiles indicating preferences, behaviour, and even emotions. This is referred to as profiling, and it allows commercial actors to sway consumers towards commercial transactions on the basis of tailor-made offerings (e.g. a person with a predisposition to purchase due to scarcity would be targeted with a personalised advertisement stating “limited supply”, whereas a person profiled as a follower of trends would be targeted by a “best-selling” advertising campaign). Such practices can hold benefits for consumers (who lose less time on irrelevant offerings), but they also raise significant legal and ethical concerns.

With this in mind, the research will provide theoretical insights into the division between personal autonomy and the economic power of commercial entities. The law currently relevant to profiled “market manipulation” or “nudging” is contained in a variety of legal instruments. The PhD thus aims to provide a transversal analysis of the citizen-consumer protection safeguards found in privacy and data protection law and in consumer protection law, in order to identify gaps and shortcomings and, on that basis, to suggest solutions.