The Article 29 Working Party’s new set of guidelines deals with profiling and automated decision-making. These practices have taken off recently due to the enormous productivity benefits they offer, but there was also a need for more extensive regulation.
Profiling is, according to the GDPR, any automated form of processing carried out on personal data that analyses or predicts certain aspects of a natural person or their behaviour.
Profiling is employed to make predictions about people, such as their health, personal relationships, economic situations, or likely interests, based on data patterns obtained from various sources and individuals. However, even without predictive purposes, any classification based on individuals’ characteristics (age, sex etc.) is considered profiling.
These procedures are often statistical in nature. Marketers often employ profiling to make their ads more relevant. Data brokers, for example, create profiles from data collated from various sources, then sell them to marketers segmented by likely interests.
Automated decision-making, by contrast, involves no human intervention: it is a purely automatic, computer-based process. It often overlaps with profiling, but does not have to.
It can be based on one’s personal profile (such as credit score), previously collected data or data provided directly by the person, such as in a questionnaire (e.g. a loan application).
The WP29 lists speeding fines based on speed cameras as an example of automated decision-making. Such a practice is not profiling, but would be if the amount of the fine were based on the individual’s previous driving habits.
Prohibitions on Automated Decision-making
As per Article 22 of the GDPR, meaningful decisions that are ‘based solely’ on fully automated decision-making are generally prohibited; i.e. the individual has the right not to be subject to such treatment.
The automated nature cannot be ‘disguised’ by token human involvement if the bulk of the decision-making is still automated and if the human cannot influence the outcome of the decision-making.
Note the word ‘meaningful’ in the first paragraph of the section. This provision essentially only applies to decisions that would significantly affect the individual. Trivial decisions can still be automatically reached.
The definitions are not all that clear, though.
To qualify for Article 22 protection, decisions should have ‘legal effects’ or ‘significantly affect’ the individual. It is presumed that legal effects include social benefits, border entries, guaranteed personal freedoms, voting rights, etc.
Even if there are no palpable ‘legal’ effects on the individual, the WP29 opines that similarly significant effects, such as discrimination or exclusion, can also trigger the protection. To qualify, such decisions must appreciably influence the individual’s circumstances, behaviour, or choices. For example, automatic refusals of online credit applications likely meet the threshold.
Online marketing could have a significant impact in certain cases. Most often, it will not, since the decisions are trivial. However, depending on the intrusiveness of profiling and granularity of ads, the impact could be significant for certain groups of individuals.
For example, vulnerable adults could be targeted with certain ads (such as for gambling) based on their profiles, which is detrimental to them.
Differential pricing could also produce a significant effect, if, as a result of such practice, the individual would be unable to purchase a service or an item they would otherwise be able to.
The prohibition described in Article 22(1) does not apply in these circumstances:
- Necessary for the performance of a contract
In these cases, automated processing could reduce the potential for human error and ensure consistency, as well as improve efficiency and processing speed. However, it is upon the data controller to prove such processing really is necessary, and whether there are less intrusive processing methods available. If there are, then those must be used.
- Authorisation by EU or Member State law
This exception should be rare. Its potential use is outlined in Recital 71, which mentions purposes such as fraud and tax-evasion monitoring and prevention.
- Explicit consent
The GDPR does not define ‘explicit consent’, but it likely sets a higher threshold than ordinary consent. Such consent should be specifically confirmed, rather than ‘bundled’ with several other data-use cases.
The issue of consent will be further addressed by the WP29 in separate consent guidelines.
Data subjects have a set of rights they can exercise in order to minimise the risks stemming from profiling and automated decision-making practices.
As usual, all other rights, such as the right to object and the right to be forgotten, remain in force.
The Right to Information
Data controllers must be especially transparent when performing profiling operations. They must notify the data subject that they are engaging in this activity, explain the logic and reasons for doing so, and outline the potential impacts of the processing.
As a data controller, you should provide informative, practical examples and provide meaningful, understandable information.
This is essential and should be done before any kind of processing takes place in order to ensure the consent is freely obtained.
Information can be presented in visual and interactive form as well, as long as it helps individuals understand the principles behind data processing.
The Right Not to Be Subject to Automated Decision-making
Even if there are no legal obstacles to automated decision-making, the individuals still have the right to demand human review and to contest the decision by expressing their points of view. Exercising this right should be simple and easy. The review must be thorough, not just token and cursory.
The WP29 also explained how the general data processing provisions apply to profiling and automated decision-making.
Data Protection Principles
Fairness, Lawfulness, Transparency
Transparency is fundamental to the GDPR. Data subjects often cannot see profiling taking place, so the nuances and complexities can sometimes be hard to grasp. That is why it is essential to be clear and forthcoming in explanations.
Processing must not result in discrimination or be unfair. For example, ethnic minorities could be targeted based on profiling and, as a result, be offered different services, especially by financial companies.
Consent is valid only for the purposes listed and explained to the data subject. For example, apps that use one’s location to find the nearest store or track one’s running progress must not reuse that data for marketing purposes unless the person is explicitly notified about it and consents.
Sometimes, the data can be used for other purposes, depending on their similarity and impact. It is always recommended, however, to obtain further consent.
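The purpose-limitation principle above can be illustrated with a minimal sketch of a per-purpose consent record. This is a hypothetical illustration, not part of the WP29 guidelines: the purpose names and the `may_process` helper are invented for the example, and a real system would also record when and how each consent was given.

```python
# Hypothetical consent ledger for one data subject: one flag per purpose,
# never bundled. Purpose names are illustrative only.
consents = {
    "nearest_store_lookup": True,   # granted when the feature was enabled
    "run_tracking": True,           # granted separately, not bundled
    "marketing": False,             # never granted: location data must not be reused here
}

def may_process(purpose: str) -> bool:
    """Allow processing only for purposes the data subject explicitly consented to.

    Unknown or unlisted purposes default to False: no consent means no processing.
    """
    return consents.get(purpose, False)

allowed = may_process("nearest_store_lookup")  # True: this purpose was consented to
blocked = may_process("marketing")             # False: reuse for marketing is barred
```

The key design choice is the default of `False` for unlisted purposes: a new use of the data requires a new, explicit consent entry rather than silently inheriting an old one.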
Profiling can lead to ‘data hunger’, where companies store vast amounts of data for the explicit purpose of more accurate profiling. Data that is unnecessary for the immediate processing purpose should not be retained, and should be deleted.
Data should at least be pseudonymised or preferably anonymised for use in a profiling database.
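One common pseudonymisation technique, offered here as a sketch rather than a prescribed method, is keyed hashing: direct identifiers are replaced with stable pseudonyms, and re-identification requires a secret key held separately from the profiling database. The key value and record fields below are purely illustrative.

```python
import hashlib
import hmac

# Illustrative only: in practice the key would come from a secure vault,
# stored apart from the profiling database (that separation is what makes
# this pseudonymisation rather than mere hashing).
SECRET_KEY = b"replace-with-a-key-from-a-secure-vault"

def pseudonymise(identifier: str) -> str:
    """Derive a stable pseudonym from a direct identifier using HMAC-SHA256."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The same input always maps to the same pseudonym, so profiles can still be
# linked across records without the raw identifier ever entering the database.
record = {"user": pseudonymise("alice@example.com"), "segment": "runners"}
```

Note that keyed pseudonyms remain personal data under the GDPR, since the key holder can re-identify individuals; full anonymisation would require breaking that link entirely.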
Incorrect input data leads to incorrect outcomes and computer-based decisions. Ensuring accuracy is therefore essential, since automated processing has no mechanism to correct faulty input along the way.
Transparency is essential here: the better the purposes of processing are explained to data subjects, the more likely it is that they will give out accurate and current data.
Storing substantial amounts of data on a single individual could prove to be a huge invasion of privacy that does not justify the potential gains for the data controller. Profiling based on such data can be so accurate that its potential for misuse is too great to be contained.
Data protection impact assessments are a requirement in most cases under the GDPR. The requirement applies not only to solely automated processing, but to all processing with automated elements.
Lawful Bases for Processing
As usual, consent is the crucial component for data processing. In the case of profiling, data controllers must ensure that consent is given freely and that the individual fully understands what they are consenting to.
Consent is not an appropriate basis when it is a precondition for using a service, or where there is a power imbalance, such as in an employer/employee relationship.
Legitimate interest can be used as a basis if the balancing exercise shows that there are greater benefits to the data controller than risks to the data subjects. This depends on the granularity of profiling, i.e. the level of detail and the amount of data. Potential impact on persons also must be evaluated, as well as whether proper safeguards are enacted.
Note that legitimate interest cannot serve as a basis for profiling that would otherwise be unlawful, as per Article 22(1).
Profiling based on special categories of (sensitive) data is generally unlawful, unless one of the requirements of Article 9(2), such as explicit consent, is satisfied.
Such processing must be conducted with extreme care and security measures. Data controllers must also be wary of accidentally discovered sensitive data, which can be gleaned from correlations that may appear when several datasets are combined.
Individuals must, of course, be notified if that occurs.
Recital 71 states that profiling with significant effects should not apply to children. The WP29 clarifies that, since the prohibition is not listed within the Articles themselves, it is not absolute. Still, the recommendation is to refrain from profiling based on children’s data.
However, in circumstances such as to protect their vital interests, such processing would be allowed, as long as all the appropriate safeguards are in place.
These protections are in place because children are a vulnerable group. They often do not fully understand the consequences of their actions, or their rights.
The WP29 essentially reiterates and puts a fine point on what we felt would be a requirement. Profiling should not be performed without adequate safeguards and explicit consent. Automated processing, likewise, is something individuals have the right to decline.
Data controllers must prepare for the increased transparency requirements by providing clear and informative notices. Their data security practices must also be up to par, which can be tested with regular audits and checks. This will ensure that profiling does not create unnecessary risks both for the company and its users.
You can download the entire report here.