The clock for the implementation of the General Data Protection Regulation (GDPR) is ticking. The new piece of legislation is set to replace the aged Data Protection Directive in May 2018, which leaves little time to ensure everything is in order. The changes are numerous, but the end goal is praiseworthy: to strengthen consumer rights and to make doing business easier for companies across the entire EU.
We hope that the GDPR will accomplish what it promises, but in order for it to do so, companies and organizations must adapt. One of the significant changes in the GDPR pertains to profiling, and it is worth digging into more deeply.
GDPR vs DPD
What does the Data Protection Directive say about profiling? Very little. ‘Automated decision taking’ is mentioned in passing, but ‘profiling’ itself is never concretely defined. This is understandable: 20 years ago, concerns about such pervasive data collection and automated decisions were obviously not as strong.
Today, however, we have an all-encompassing slew of data-collecting companies and a surge of big data, so the European Commission saw the need to take action. The scope of the GDPR is also wider: companies outside the EU are bound by its provisions whenever they process the data of users in the European Union.
What is Profiling Exactly?
Profiling is the use of an individual’s personal information to predict their tendencies and behaviour based on data currently at hand. Many businesses employ profiling to some extent, such as when serving ads or offering certain services. Users might see ads tailored for them based on the interests they submitted to the service, or they might be offered a premium banking service based on their income.
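To make the idea concrete, here is a minimal, purely illustrative sketch in Python. The field names, thresholds and offers are invented for this example and do not come from the GDPR or any real system.

```python
# Hypothetical profiling rule: predict which offer to show a user
# from personal data already on file.

def suggest_offer(profile: dict) -> str:
    """Pick an ad or service tier from a stored user profile."""
    if profile.get("income", 0) > 80_000:
        return "premium-banking-offer"      # premium service inferred from income
    if "travel" in profile.get("interests", []):
        return "travel-insurance-ad"        # ad tailored to a declared interest
    return "generic-ad"

print(suggest_offer({"income": 95_000, "interests": ["travel"]}))  # premium-banking-offer
```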
The Significance of Profiling
The sad fact is that many of us do not know we are being profiled at all. All kinds of data are used for profiling, and much of it we give away freely. Online, our web history and search data are used; our location can be tracked via our phones; and our payment history is available too, along with our buying habits if we use store loyalty cards. And that is without mentioning biometrics, where extremely sensitive personal data can be put to ill use.
Obviously, profiling today is mostly used to sell people things. If you advertise online at all, your company has certainly experienced its effectiveness. However, profiling can also be used for more sinister purposes and affect human lives in a very measurable way. Considering that most profiling decisions in the modern world are made by a computer, the intention of the GDPR is to prevent exactly those negative effects.
Benefits and Risks of Profiling
With good profiling systems, the extent of fraud could be reduced. Marketers can employ better market segmentation and tailor their services to specific user profiles. Lending decisions could become more prudent, with less risk of bad calls, and decisions of all kinds can be made more consistently.
However, fundamental human rights and freedoms can be breached. Access to health care could be denied, and sensitive personal data could be easily guessed from ‘ordinary’ data by a sophisticated system. Also, some people would rather not be profiled – and according to the GDPR, they have the right to demand that.
Profiling and the GDPR
Profiling is defined in the GDPR, more precisely in Article 4(4). It defines profiling as any form of automated data processing where personal data is used to analyse or predict certain aspects concerning a person. These can be personal preferences, interests or behaviour, but also more sensitive aspects such as work performance, health and economic situation.
The definition was originally set out to be much wider. Significant restrictions on all types of profiling had been planned, but the plan eventually fell through, for reasons ‘unknown’. Such legislation may yet be adopted in the future, but as of now, not all profiling is objectionable.
Significant Profiling
The GDPR is mainly concerned with what it deems ‘significant’ profiling. Such profiling would have a very pronounced effect on one’s life. The examples within the GDPR itself are scarce, but include recruiting practices and refusals of online credit applications.
However, it is plausible to include here any type of profiling that can cause individuals damage, loss or significant distress. Any type of discrimination is certainly forbidden, and decisions that significantly affect a person’s well-being or financial status fall into this category as well. Children are especially protected and should never be subject to automated profiling at all. An individual must be notified that profiling of their data is taking place; implementing this provision in practice could prove troublesome.
If you own a company that does this kind of processing, bear this in mind and make sure that your profiling practices are fair: your algorithms should not discriminate. If processing is done through a third party, make sure that they are compliant as well, and always be ready to address user complaints.
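As a rough sketch of what a fairness check might look like, the snippet below compares approval rates across groups and flags large gaps. The 0.8 ratio is the informal ‘four-fifths rule’ sometimes used as a heuristic; it is an assumption of this example, not a requirement of the GDPR.

```python
from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of (group, approved) pairs -> approval rate per group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def flag_disparity(decisions, threshold=0.8):
    """Return groups whose approval rate falls below `threshold` of the best group's rate."""
    rates = approval_rates(decisions)
    best = max(rates.values())
    return {g: r for g, r in rates.items() if best and r / best < threshold}

sample = [("A", True), ("A", True), ("A", False),
          ("B", False), ("B", False), ("B", True)]
print(flag_disparity(sample))  # {'B': 0.333...} -> worth a closer look
```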
Prohibitions
Individuals can opt not to be subject to an automated decision resulting from profiling, and they have the right to obtain human intervention: literally, a pair of human eyes must look at their data and make the decision. They also have the right to an explanation of a decision. The exact interpretation of this provision is ambiguous, though: it can mean that such processing cannot take place at all, or merely that the individual may demand human review of a decision after it has been made. Member states approach this issue differently, and it remains to be seen what the practice will be in the future.
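As a hedged illustration of how the right to human intervention might be wired into a decision pipeline, the sketch below holds an automated outcome for manual review whenever the data subject has objected. The class and field names are assumptions for this example, not terms taken from the GDPR.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    subject_id: str
    outcome: str                      # e.g. "credit-refused"
    automated: bool = True
    needs_human_review: bool = False

def apply_intervention_right(decision: Decision, subject_objected: bool) -> Decision:
    """If the individual objected, hold the automated outcome for a human reviewer."""
    if decision.automated and subject_objected:
        decision.needs_human_review = True
    return decision

d = apply_intervention_right(Decision("user-42", "credit-refused"), subject_objected=True)
print(d.needs_human_review)  # True -> a person must look at the data before the decision stands
```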
Not all automated decisions are covered by these protections. If a decision is necessary for the performance of a contract, based on the individual’s explicit consent, or authorised by law, these rights do not apply. In any case, individuals have the right to object to profiling, and it is up to the data controller (the company) to establish why such profiling is necessary. When it comes to direct marketing, data subjects have the right to object to such processing at any time, for any reason, and this right should be stated explicitly, so that the person it is addressed to can be assumed to have seen and read it.
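A small sketch of honouring the direct-marketing objection, assuming a simple in-memory store; the function names and storage here are purely illustrative.

```python
# Record objections and check them before any direct-marketing profiling runs.
marketing_objections: set[str] = set()

def object_to_marketing(subject_id: str) -> None:
    """Record that this person objected; no justification is required."""
    marketing_objections.add(subject_id)

def may_profile_for_marketing(subject_id: str) -> bool:
    return subject_id not in marketing_objections

object_to_marketing("user-42")
print(may_profile_for_marketing("user-42"))  # False -> such processing must stop
```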
Profiling can NEVER take place based on special categories of data without explicit consent or a substantial public interest. Special categories of data include ethnic origin, religious beliefs, health and sex life data, and trade union membership.
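One way this could look in practice (a sketch, assuming a flat record of named fields and a recorded consent flag) is to strip special-category fields from the profiling input unless explicit consent exists:

```python
# Special categories mirrored from the text; the field names are assumptions.
SPECIAL_CATEGORIES = {"ethnic_origin", "religious_beliefs", "health",
                      "sex_life", "trade_union_membership"}

def profiling_input(record: dict, explicit_consent: bool) -> dict:
    """Drop special-category fields unless the subject gave explicit consent."""
    if explicit_consent:
        return dict(record)
    return {k: v for k, v in record.items() if k not in SPECIAL_CATEGORIES}

record = {"age": 34, "health": "diabetic", "postcode": "75001"}
print(profiling_input(record, explicit_consent=False))  # {'age': 34, 'postcode': '75001'}
```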
Conclusion
In a nutshell, profiling is allowed when it is required for the performance of a contract. In these cases it can even be beneficial, and the limits the GDPR places upon the practice are very reasonable. If you are a member of an organisation doing profiling, put yourself in your users’ shoes: would you feel uneasy if such profiling were applied to you? If yes, perhaps it is best not to do it. If you are the user, always read the terms to which you are agreeing; you might inadvertently agree to more than you bargained for.
If you opt for profiling, ensure it is bias-free, fair and accurate. Always obtain explicit consent and notify users of the nature of such profiling. Non-compliance is costly: a fine of up to EUR 20 million or 4 percent of the company’s annual global turnover, whichever is greater, so it pays to ensure everything is in order. Any gains produced by legally shady profiling are dwarfed by the threat of such enormous fines.
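As a quick sanity check of that ceiling (a minimal sketch, with turnover assumed to be expressed in euros):

```python
def max_gdpr_fine(annual_global_turnover_eur: float) -> float:
    """Upper bound on the penalty: whichever is greater of EUR 20 million or 4% of turnover."""
    return max(20_000_000, 0.04 * annual_global_turnover_eur)

print(max_gdpr_fine(1_000_000_000))  # 40000000.0 -> EUR 40 million for a 1 bn EUR turnover
```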
For more information regarding profiling, consult our article featuring updated guidelines on profiling and automated decision-making.