Data analytics tools are more widely available and accessible than ever before, and many organisations are examining the benefits they can deliver. But if you are thinking about data analytics for the first time, what should you consider to ensure you remain on the right side of compliance with the General Data Protection Regulation (GDPR)?
What are data analytics and what are the benefits?
Data analytics is an overarching term for the use of automated analytical and statistical tools to evaluate information, and to compare and contrast datasets. Organisations have found that these tools help them to understand patterns, trends and anomalies within the datasets they are analysing. The resulting intelligence can be used to improve business decision making through a clearer understanding of market and customer trends and behaviour.
Data analytics includes data mining, which uncovers patterns by sifting large amounts of data to obtain key information quickly. It can identify repetitive results and determine which information is relevant, helping to predict outcomes or support decisions. Business intelligence tools can collect and analyse data to run queries and generate visual reports to improve management reporting and develop strategic plans.
Organisations use these tools to reduce costs, refine product and service offerings, provide more targeted marketing, improve supply chains or just-in-time stock control, and predict what will happen in the market. Police forces in the UK are using these techniques to analyse crime trends to help them improve crime prevention and detection, deliver better safeguarding, and understand patterns of offending behaviour.
However, there are issues to consider in order to maintain compliance and to avoid analysis becoming intrusive or straying into unethical practices. Analysis can introduce discrimination, bias and complexity: some data analytics tools process enormous amounts of data, and the results can be complex and confusing in their presentation, particularly where trends and inferences are being drawn. If the analysis is used to make unreliable decisions about living individuals (data subjects), or if it isn’t sufficiently transparent, there is a risk of damaging the relationship with data subjects and decreasing trust and confidence in organisations.
What is the easiest way to safeguard personal data in analytics?
It may seem like a trite answer, but the easiest way to protect personal data is not to process it. If an organisation needs to understand trends or analyse its business but doesn’t need to attribute that intelligence back to the data subjects, then the safest way is to remove any identifiable information and work with anonymous data. Thereafter, any analysis of that anonymised data will not involve processing personal data, so there is no need to comply with data protection (DP) legislation.
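As a minimal illustration (assuming a hypothetical pandas DataFrame, with made-up column names and values), anonymising before analysis can be as simple as dropping direct identifiers and coarsening indirect ones:

```python
import pandas as pd

# Hypothetical transactions data containing direct identifiers
df = pd.DataFrame({
    "customer_name": ["A. Smith", "B. Jones", "C. Patel"],
    "email": ["a@example.com", "b@example.com", "c@example.com"],
    "postcode": ["LS1 4AB", "M1 2CD", "B3 5EF"],
    "basket_value": [42.50, 17.99, 63.20],
    "category": ["garden", "grocery", "electronics"],
})

# Remove the direct identifiers, and coarsen the full postcode (an indirect
# identifier) to its outward code so that individuals cannot be singled out
anonymous = df.drop(columns=["customer_name", "email"])
anonymous["postcode_area"] = anonymous.pop("postcode").str.split().str[0]

# Trend analysis on data that no longer relates to identifiable individuals
print(anonymous.groupby("category")["basket_value"].mean())
```

Bear in mind that real anonymisation is usually harder than this sketch suggests: combinations of indirect identifiers (age, location, rare attributes) can still single individuals out, which is one reason large or complex datasets may need the specialist help mentioned below.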
Let’s look at some DP themes which might be new to you if you haven’t considered data analytics before.
What is the difference between anonymisation and pseudonymisation?
There is no definition of anonymised data in the legislation, but essentially it is data that has undergone a process which permanently removes the identifiers, taking it outside the scope of the legislation. The legislation covers personal data, meaning data which relates to an identified or identifiable living individual, so if all identifying data is removed from a dataset, that data is no longer personal data. This sounds relatively simple, and it can be, but if it involves large and complex datasets or special category personal data you may need to obtain some specialist help. It is worth remembering that if using personal data isn’t necessary for your aims, anonymisation will allow you to comply with the data minimisation principle in the UK General Data Protection Regulation (UK GDPR) Article 5(1)(c), which requires data to be adequate, relevant and limited to what is necessary.
The Data Protection Act 2018 (DPA 2018) doesn’t define pseudonymisation for the DP regime as a whole, but UK GDPR Article 4(5) defines it as:
‘the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person.’
As such, pseudonymisation uses techniques such as masking or removing direct identifiers, or replacing them with a code or reference which can be used to convert the pseudonymised data back into personal data.
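By way of a rough sketch (the records, field names and token scheme below are all hypothetical), a tokenisation-style approach replaces each identifier with a random reference and stores the mapping separately:

```python
import secrets

# Hypothetical customer records containing a direct identifier
records = [
    {"email": "a.smith@example.com", "spend": 42.50},
    {"email": "b.jones@example.com", "spend": 17.99},
]

key_table = {}      # the 'additional information' -- hold it separately
pseudonymised = []

for record in records:
    token = secrets.token_hex(8)        # random reference, e.g. '9f2c...'
    key_table[token] = record["email"]  # mapping required for re-identification
    pseudonymised.append({"customer_ref": token, "spend": record["spend"]})

# Analysis proceeds on `pseudonymised`; re-identification is only possible
# via key_table, which must sit behind its own technical and organisational
# measures -- and the dataset remains personal data under the UK GDPR
```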
In summary, anonymisation means that individuals are not identifiable and cannot be re-identified by any means reasonably likely to be used. Anonymous information is not personal data and DP law does not apply.
Pseudonymisation means that individuals are not identifiable from the dataset itself but can be identified by referring to other information held separately. Pseudonymous data is therefore personal data and DP law applies. Recital 26 of the UK GDPR makes it clear that personal data which has undergone pseudonymisation remains in scope of the law. It is also worth noting that attempting to re-identify pseudonymised data without the permission of the controller is an offence.
Is encryption the same as anonymisation?
In short, no. Encryption is a function using a key to encode data so that only users with access to that key can read the information. Encryption can provide an appropriate security technique to protect against the unauthorised or unlawful processing of personal data, especially in cases where it is not possible to implement alternative measures.
However, encryption is not an anonymisation technique, and encrypted data is still subject to the UK GDPR: if you hold encrypted data and the key, you can re-identify individuals by decrypting the dataset. In that respect, encryption can be regarded as a pseudonymisation technique.
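To see why, here is an illustrative sketch using the widely used Python cryptography package (the email address is invented); anyone holding the key can reverse the transformation:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # whoever holds this key can re-identify the data
f = Fernet(key)

token = f.encrypt(b"a.smith@example.com")  # protected, but not anonymous
print(f.decrypt(token))                    # b'a.smith@example.com' -- reversible
```

The encrypted value protects confidentiality, but because the transformation is reversible the underlying data remains personal data.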
Can data analytics techniques be classed as profiling?
Yes, sometimes. Article 22 of the UK GDPR, which is part of the suite of data subject rights under the legislation, states that ‘the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.’
Let’s look at profiling first.
Article 4(4) of the UK GDPR defines profiling as automated processing of personal data to evaluate aspects of an individual, such as their personality, behaviour, interests and habits, in order to make predictions or decisions about them. So, if you intend to find something out about individuals’ preferences, predict their behaviour or make decisions about them, the chances are that you will be profiling. A common example is the analysis of a person’s shopping or purchasing habits in order to present more targeted marketing to them.
If you collect and analyse personal data on a large scale, use algorithms, artificial intelligence (AI) or machine learning, identify associations to build links between different behaviours and attributes, create profiles that you apply to individuals, or predict individuals’ behaviour based on their assigned profiles, then you will be profiling them, and Article 22 may apply if the resulting decisions are made solely by automated means and have a legal or similarly significant effect.
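For instance, here is a purely illustrative sketch (invented numbers, using scikit-learn) of the kind of segmentation exercise that amounts to profiling:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-customer features: [visits_per_month, avg_basket_value]
features = np.array([
    [2, 15.0],
    [8, 42.0],
    [9, 47.5],
    [1, 12.0],
])

# Cluster customers into behavioural segments; applying these segments to
# individuals to target or make decisions about them is profiling
segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(segments)  # e.g. [0 1 1 0] -- a profile assigned to each customer
```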
Now let’s look at automated decision-making.
Automated decision-making is the process of making a decision by automated means without any human involvement. This poses some difficulties because the process is often invisible to individuals, they might not expect their personal information to be used in this way, and they might not understand how the process works or how it can affect them.
However, the UK GDPR doesn’t prevent you from carrying out profiling or using automated systems to make decisions about individuals unless the processing meets the conditions in Article 22(1).
For profiling or automated decision-making to be restricted by Article 22 there needs to be:
- no human involvement; and
- a legal or similarly significant effect on the individual.
So, you need to decide whether the decision you are making has a legal or similarly significant effect on people, remembering that decisions with little impact generally could still have a significant effect on more vulnerable individuals, such as children. If you decide that it doesn’t have such an effect, you should record your rationale in your data protection impact assessment (DPIA) and be prepared to defend it. A DPIA is a very useful tool to record your thought process, and in some cases it is mandatory to conduct one, namely where the processing is likely to result in a high risk to the rights and freedoms of data subjects. To learn more about when a DPIA is necessary and how to perform one, read our blog on Conducting Data Protection Impact Assessments (DPIAs).
To avoid the first of the Article 22 conditions, you may introduce a human element to the process, whereby the analysis suggests decisions about individuals but a human reviews them and makes the final decision. For this to count, the human involvement must be meaningful rather than a rubber stamp: the reviewer needs the information, competence and authority to change the outcome.
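A minimal sketch of how such a review step might be structured (the function name and workflow are hypothetical):

```python
def review_suggestion(customer_ref: str, suggestion: str, score: float) -> str:
    """Put the model's suggested decision in front of a human reviewer.

    The reviewer sees the evidence and can accept, amend or reject the
    suggestion -- meaningful involvement, not a rubber stamp.
    """
    print(f"Customer {customer_ref}: model suggests '{suggestion}' (score {score:.2f})")
    choice = input("Accept, amend or reject [a/m/r]? ").strip().lower()
    if choice == "a":
        return suggestion
    if choice == "m":
        return input("Enter amended decision: ")
    return "rejected: refer for full manual assessment"
```

In practice you would also log the reviewer’s identity and rationale, both for accountability and to evidence that the decision was not solely automated.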
Finally, Article 22 might not apply if the processing is covered by one of the exceptions in Article 22(2) – for example, if it is necessary to enter into a contract or if you have explicit consent.
Do the research provisions apply to my data analysis?
Article 89 of the UK GDPR states that ‘processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes, shall be subject to appropriate safeguards… for the rights and freedoms of the data subject’.
These safeguards are generally technical and organisational measures (TOMs), and Article 89 specifically mentions measures to ensure respect for the principle of data minimisation. This may involve, where possible, anonymising or pseudonymising data.
Section 19 of the DPA 2018 adds to these safeguards by stating that research-related processing does not satisfy Article 89 if the processing is likely to cause people substantial damage, substantial distress, or is carried out for the purposes of measures or decisions about particular people, except for approved medical research.
So, if you are considering the research provisions, you need to have appropriate safeguards in place to protect the rights and freedoms of the people whose personal data you are processing. These should be TOMs which implement the principle of data minimisation and, where possible, you should carry out research using anonymised information so that it is not personal data and DP law does not apply. Where you can’t do that, you should consider pseudonymisation, but remember that pseudonymised data is still personal data and DP law applies. Finally, you should not rely on the research provisions if the processing is likely to cause someone substantial damage or distress.
What help is available?
The Information Commissioner’s Office (ICO) has developed a toolkit for organisations considering the use of data analytics, which covers the different DP regimes: Part 2 of the DPA 2018 for processing under the UK GDPR; Part 3 for law enforcement processing; and Part 4 for the three intelligence services. However, there is a lot to consider in this area, and many organisations using data analytics for the first time may need some specialist help with data protection aspects they haven’t encountered before.
The key considerations are:
Think about why you want to carry out data analytics.
This will help you balance the benefits you hope to gain against the effort required to achieve GDPR compliance and the risks to data subjects and your organisation.
Can you proceed with truly anonymised data?
If so, this removes the processing from the scope of DP legislation.
Conduct a DPIA.
The structure of a DPIA will help you assess risk, balance competing interests, and identify measures to implement the DP principles and safeguard rights. It is also a useful way to record your thought process as part of the obligation for accountability.
Establish a lawful basis.
If you already process personal data, you should have established your lawful bases for doing so, but will these cover any proposed analytics work, especially if you are introducing new datasets and processing? If not, should you go back to your data subjects, for example to obtain consent for the further processing?
Establish whether the processing is fair.
Would the data subjects – who may be your customers, members of the public or your employees – expect this type of processing and be comfortable with it?
Determine whether the processing complies with the purpose limitation principle.
UK GDPR Article 5(1)(b) states that data should be ‘collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes’, so if you are analysing data you have already processed for another purpose, check compliance with this principle.
Decide whether you will be making any solely automated decisions or profiling as a result of the analysis.
Will you be carrying out profiling or making automated decisions which have a legal or similarly significant effect on an individual, and if so, do any of the Article 22(2) exceptions apply, or can you introduce a meaningful human review?
Consider whether you can use the research provisions.
This will apply if you are conducting the analysis for archiving purposes in the public interest, scientific or historical research purposes, or statistical purposes.
Decide how you will prevent and monitor discrimination and bias.
Consider whether the analysis will introduce or exacerbate any bias or discrimination, and how you will carry out checks to make sure this doesn’t happen or how you will deal with it if it does. Record this in your DPIA. For public bodies, the public sector equality duty in section 149 of the Equality Act 2010 requires them to have due regard for equality considerations when exercising their functions, and depending on the purpose of the analysis, there may be a need to consider an equality impact assessment.
Decide how you will need to amend your privacy notice.
It is important that data subjects know how you use their data, and you may need to alert them to the updated notice.
Decide the security measures you will need to take.
This might include decisions about who has access to the data and to the results of the analysis, and whether personal data should be anonymised before a wider audience is given access to it.
Decide how you will ensure the rights of individuals.
This will include the provisions for subject access rights – how will you provide copies of the data which have been subject to analysis, and what information will you need to share with data subjects about the processing?
What are the potential upcoming changes?
At the time of publishing this blog, the Data Protection and Digital Information (No.2) Bill is at the committee stage in the House of Lords, with its progression halted for the duration of the recently announced General Election. The Bill proposes changes to the way data collected in the UK is governed, with the intention of making the regulation of data work better for those gathering and using it. It is the Government’s stated aim to ‘simplify the UK’s data protection framework’.
Clause 11 of the Bill would change Article 22 of the UK GDPR, broadening the range of contexts in which solely automated decision-making can be used by removing the general prohibition (and the extensive safeguards that accompany it) on fully automated decision-making that has a legal or ‘similarly significant’ effect on individuals. The reframed Article 22 would generally allow automated decision-making, with reduced safeguards.*
In written evidence at the House of Commons committee stage, the Equality and Human Rights Commission expressed concerns that the proposed changes do not offer sufficient safeguards to protect individuals from unfair or discriminatory outcomes of automated decision-making, because the data used to help make decisions may contain existing biases. The example it cites is a system used to monitor employee productivity that may make automated decisions which do not take account of the legal requirement to make reasonable adjustments in respect of an individual’s disability. The Commission is also concerned that humans making any final decision may have little understanding of how the technology has reached its conclusion, and may not have the expertise to question its output.
In short, the Bill may – or may not – change the data protection aspects of data analytics. If the Conservatives are re-elected on 4 July, the Bill will continue its passage through Parliament and may be enacted in the next Parliamentary session. However, if there is a change of Government, it is extremely unlikely to be passed without significant revision, and may be scrapped entirely. So, watch this space…
* For a comprehensive breakdown of how the Bill (in its current form) will change UK data protection law, see our blog on The Data Protection and Digital Information Bill No.2.
How URM can help
If your organisation would benefit from guidance and support in ensuring its use of data analytics tools is compliant with the Regulation, URM can provide GDPR consultancy informed by both in-depth knowledge and a wealth of experience. Leveraging the expertise gained over nearly two decades of helping organisations comply with DP legislation, URM can offer your organisation a wide range of GDPR consultancy services to help ensure your use of data analytics, and your processing practices in general, are fully aligned with the Regulation’s requirements. For example, our large team of GDPR consultants can conduct a gap analysis of your current processing activities against the GDPR’s requirements, allowing you to identify where and how you are not currently compliant. If you would like to receive ongoing support, we can offer a completely flexible virtual data protection officer (DPO) service, which will provide you with access to an entire team of DP practitioners, each with their own specialist area of expertise.
URM can also assist with the completion of more specific compliance activities, such as helping you produce a record of processing activities (ROPA), the cornerstone of any controller’s GDPR compliance effort, and with conducting DPIAs. Meanwhile, if your organisation receives data subject access requests (DSARs), we can support you to respond in full compliance with the Regulation by providing a GDPR DSAR redaction service.
If you would like to improve your own understanding of the GDPR and the UK DP regime in general, URM regularly delivers a range of DP training courses, all of which are led by a practising DP consultant. Our courses on conducting data transfer impact assessments (DTIAs) and DPIAs, and on responding to DSARs, will expand your professional skillset and teach you how to complete these key compliance activities. To gain an industry-recognised DP qualification, we regularly deliver a BCS Foundation Certificate in Data Protection (CDP) course, which will fully prepare you to take the BCS invigilated exam.