Data may well be touted as the ‘new oil’ but how can you buy data sets without falling foul of data protection law?
August 2019 | EXPERT BRIEFING | DATA PRIVACY
financierworldwide.com
Where data sets contain information that can directly or indirectly identify living individuals, data set buyers need to comply with data protection laws. Data protection regulations may be seen as a threat to certain data-dependent business models, especially since the European General Data Protection Regulation (GDPR) became applicable on 25 May 2018. The GDPR introduced stricter rules and the potential for enhanced regulatory scrutiny and high fines. The GDPR applies to organisations established in the EU and also has extraterritorial effect, meaning that it applies to organisations outside of the European Economic Area (EEA) that carry out certain processing activities.
Accountability is key
Buying data sets from a third party does not remove the responsibility to comply with the GDPR. Indeed, the GDPR requires that individuals whose personal data is processed be given certain information regarding the use of their personal data – normally in the form of a privacy notice – whether the personal data is collected directly from them or has been obtained from a third party.
Buyers must carry out due diligence checks. In particular, buyers should ensure that the data subjects were provided with the relevant information by the organisation that originally collected their personal data. This includes notifying data subjects that their personal data would be transferred to, or accessed by, certain categories of organisations, i.e., the buyer’s type of business, or providing them with the identity of the buyer, which, in practice, will often not be realistic.
Relying on contractual commitments that the data have been obtained ‘in accordance with applicable data protection laws’ will not suffice. Obtaining due diligence evidence may be particularly challenging for ‘Big Data’, given the size of the data set, the way in which the personal data may have been collected and the variety of sources from which these data sets are composed.
Data protection impact assessments (DPIAs) are an important GDPR accountability tool, helping organisations identify the potential risks arising from intended processing activities and the measures needed to mitigate those risks. One scenario where a DPIA may be advisable, or even required depending on the circumstances, is where a buyer will be processing personal data acquired from third parties, particularly where Big Data analytics are involved.
Transparency
If the seller has not provided all the GDPR-required information to the data subjects, the buyer will need to provide such notice to the data subjects within strict timescales, i.e., within a reasonable period after obtaining the personal data and at the latest within one month. If the buyer plans to contact the individuals or to disclose their data to another recipient, the privacy notice will need to be provided before that communication or disclosure takes place.
There are exemptions to this information requirement, namely where the provision of such information proves impossible or would involve a disproportionate effort, in particular for processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes. Where relying on this exemption, an organisation needs to take appropriate measures to protect the rights and freedoms of individuals, such as making the necessary information publicly available. European regulators have taken enforcement action against organisations relying on this exemption, so any organisation wishing to rely on it must carefully assess its merits and document its decision.
The complexity of Big Data processing activities can create the so-called ‘black box’ effect, where it can be difficult to understand, and, in turn, explain to data subjects, the reasons for the decisions made. This ‘opacity’ is directly at odds with the GDPR transparency requirements, including the requirement to inform data subjects about the logic involved in any ‘automated decision making’ – i.e., decisions made by algorithms without human intervention – where such decisions will significantly affect them.
Is consent the ‘golden’ standard?
The GDPR sets out six potential general legal bases for processing personal data and one of these must be relied on. The legal bases include the consent of the data subject, necessity for the performance of a contract and necessity for the purposes of the controller’s or a third party’s legitimate interests, provided these are not overridden by the data subject’s rights. Where a data set contains ‘special category data’ – personal data which reveals racial or ethnic origin, political opinions, health data and sexual orientation, among other categories – an additional condition from a separate list of legal bases must also be in place.
If the seller has relied on consent, it is important for buyers to check how and when consent was obtained (consents should be reasonably recent), by whom it was obtained, the scope of the consent and that the consent meets GDPR standards. In particular, consents should be unambiguous, freely given, specific and informed. If the seller cannot provide this information, a buyer should not use the data set.
In Europe, if the contact details of individuals will be used to send electronic marketing, organisations need to ensure they comply with the e-privacy rules. Electronic marketing communications – e.g., by email or text – generally require the consent of the recipient, above and beyond the GDPR legal basis for processing. The ICO advises that organisations buying data sets for marketing purposes: “must make checks to satisfy [themselves] that any list is accurate and the details were collected fairly, and that the consent is specific and recent enough to cover your marketing”. The ‘soft opt-in’ exception for email or text marketing cannot apply to contacts on a bought-in list. Where consent is not required – for instance, for telephone marketing via live calls – buyers will have to screen the bought-in lists against publicly available suppression lists, such as the Telephone Preference Service (TPS) in the UK.
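By way of illustration only, the screening step might look something like the following sketch in Python. The file names, column layout and number format are hypothetical assumptions for the example, not a prescribed approach.

```python
import csv

def normalise(number: str) -> str:
    """Normalise a UK phone number to digits only, e.g. '+44 20 7946 0000' -> '02079460000'."""
    digits = "".join(ch for ch in number if ch.isdigit())
    if digits.startswith("44"):
        digits = "0" + digits[2:]
    return digits

def load_suppression_list(path: str) -> set:
    """Load suppressed numbers (e.g. a TPS extract) from a one-column CSV into a set."""
    with open(path, newline="") as f:
        return {normalise(row[0]) for row in csv.reader(f) if row}

def screen_contacts(contacts_path: str, suppression: set, output_path: str) -> None:
    """Keep only those bought-in contacts whose number is NOT on the suppression list."""
    with open(contacts_path, newline="") as src, open(output_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if normalise(row["phone"]) not in suppression:  # 'phone' is an assumed column name
                writer.writerow(row)

if __name__ == "__main__":
    suppressed = load_suppression_list("tps_extract.csv")  # hypothetical file names
    screen_contacts("bought_in_list.csv", suppressed, "callable_contacts.csv")
```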
Purpose: think carefully about what you use the data for
It is essential to know what purposes the seller’s privacy notice told data subjects their personal data would be used for. If data subjects were not informed that their personal data would be used for the buyer’s intended purpose, the buyer will need to notify individuals, unless the ‘new’ purpose is compatible with the initial use and within the reasonable expectations of the data subjects involved. The problem of ‘repurposing’ the use of personal data is exacerbated in a Big Data analytics context, as the use of algorithms may uncover correlations that are unrelated to the original purposes of the processing activity.
Where buyers will be using the data set for marketing, buyers must ensure the data subjects were notified that their details would be passed on for marketing purposes.
Combining data sets
If buyers combine their purchased data set with other data sets they hold, this could make individuals more identifiable (a ‘jigsaw effect’), make the data more intrusive, result in buyers holding special category data – which attracts extra data protection requirements – or reidentify personal data which had previously been ‘anonymised’.
Buyers should look carefully at the data they are buying and the effects of combining it, and ask themselves whether the processing activities they intend to carry out are ‘fair’ and ‘proportionate’. These activities also have to be in line with the data minimisation requirement, i.e., data must be limited to what is necessary for the processing purposes. In the UK, it is also a criminal offence to knowingly or recklessly reidentify personal data that has been de-identified, without the consent of the controller responsible for de-identifying it.
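As a rough illustration of how a buyer might assess the ‘jigsaw’ risk before combining data sets, the sketch below counts how many records share each combination of quasi-identifiers; a group of one means a record could be singled out once another source is joined in. The column names and threshold are assumptions for the example, not a legal standard.

```python
from collections import Counter
import csv

# Quasi-identifiers: fields that are not direct identifiers but, in combination,
# may single out an individual (assumed column names for this example).
QUASI_IDENTIFIERS = ["postcode_district", "year_of_birth", "gender"]

def smallest_group_size(path: str) -> int:
    """Return the size of the smallest group of records sharing the same
    quasi-identifier values (the 'k' in a simple k-anonymity check)."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = tuple(row[col] for col in QUASI_IDENTIFIERS)
            counts[key] += 1
    return min(counts.values()) if counts else 0

if __name__ == "__main__":
    k = smallest_group_size("purchased_data_set.csv")  # hypothetical file name
    print(f"Smallest group size (k): {k}")
    if k < 5:  # illustrative threshold only
        print("Some records may be singled out once data sets are combined.")
```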
Security
Organisations are required to have technical and organisational measures in place appropriate to the risk. This could include pseudonymising the data – so that it is not possible to identify data subjects without a key – and encryption.
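As a simple sketch of what pseudonymisation can look like in practice, the snippet below replaces a direct identifier with a keyed hash, so that re-linking the data requires access to a secret key held separately from the data set. This is one possible technique under the assumptions shown, not a GDPR-mandated method.

```python
import hmac
import hashlib
import secrets

# The pseudonymisation key must be stored separately from the data set with
# restricted access: whoever holds it can re-link records to individuals.
PSEUDONYMISATION_KEY = secrets.token_bytes(32)

def pseudonymise(identifier: str, key: bytes = PSEUDONYMISATION_KEY) -> str:
    """Replace a direct identifier (e.g. an email address) with a keyed hash."""
    return hmac.new(key, identifier.lower().encode(), hashlib.sha256).hexdigest()

# Illustrative record with assumed fields.
record = {"email": "jane.doe@example.com", "segment": "retail"}
record["email"] = pseudonymise(record["email"])  # direct identifier replaced
print(record)
```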
If personal data is not needed for the intended processing purpose, the relevant data set must be anonymised. Buyers will face fewer hurdles if they ask the seller to only provide them with truly anonymised, aggregate data. Where data is truly anonymous and there is no key allowing anyone to identify data subjects from it, it will not be personal data and buyers therefore will not need to comply with data protection law requirements.
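By way of illustration, asking the seller for aggregate rather than record-level data might produce output along the lines of this sketch, which reduces individual rows to counts per category and suppresses very small groups that could still allow individuals to be picked out. The column name, file name and threshold are assumptions for the example.

```python
from collections import Counter
import csv

def aggregate_by_region(path: str, min_count: int = 10) -> dict:
    """Aggregate record-level data into counts per region, dropping groups
    below an illustrative small-count suppression threshold."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["region"]] += 1  # 'region' is an assumed column name
    return {region: n for region, n in counts.items() if n >= min_count}

if __name__ == "__main__":
    print(aggregate_by_region("record_level_data.csv"))  # hypothetical file name
```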
Controller v processor
Organisations should identify their role when handling personal data. Where a third party is obtaining personal data on a buyer’s behalf, it will likely be acting as a data processor, and responsibility for complying with most of the requirements under the GDPR, including the transparency requirements, will fall on the buyer. Controllers have to instruct processors on how to comply with certain obligations, and must ensure that processors are contractually prevented from processing personal data they obtain on the controller’s behalf for their own purposes.
Conclusion
There are several data protection challenges and hurdles that pose a risk of regulatory action. The good news is that careful planning, embedding data protection from the start of projects (in line with the GDPR data protection by design requirement), collecting only the minimum personal data needed and carrying out due diligence will enable buyers to reap dividends from data processing while remaining GDPR compliant.
Nuria Pastor is a director and Camille Ebden is an associate at Fieldfisher. Ms Pastor can be contacted on +44 (0)20 7861 4565 or by email: nuria.pastor@fieldfisher.com. Ms Ebden can be contacted on +44 (0)20 7861 4014 or by email: camille.ebden@fieldfisher.com.
© Financier Worldwide