We have been told repeatedly that the only way to ease the Coronavirus lockdown restrictions safely is to have a workable contact tracing system in place: one that swiftly identifies those showing symptoms and alerts those who have been in contact with them, so that the virus does not again start to spread exponentially.
No one can dispute that a workable Test and Trace system is imperative and urgent. But any system must be designed and implemented in a way which protects our right to privacy.
Test and Trace and privacy laws
Our privacy rights must, by law, be at the core of any system design, not an afterthought. Article 35 of the GDPR requires that a Data Protection Impact Assessment (“DPIA”) is carried out where processing is “likely to result in a high risk to the rights and freedoms of natural persons”. This must be done before the processing takes place. A DPIA ensures that organisations examine what is being done with the data: whether it needs to be collected, how long it needs to be kept, who should be able to access it and what could go wrong. Risks are thereby pre-empted and mitigated at the preparatory stage.
Test and Trace is a complex system with a huge risk of privacy breaches. People hand over their data, including their date of birth, sex, NHS number, email address, telephone number and Covid-19 symptoms, together with the contact details of those they have recently been near. Not only the NHS but a number of private companies are involved in processing the data.
Worryingly, privacy does not appear to have been central to the overarching planning of Test and Trace or to the development of its discrete elements. Only under threat of judicial review by the Open Rights Group has the Government admitted that it conducted no overarching DPIA of the system before launching it on 28th May 2020. It accepts that such a DPIA was and is required, stating in its response of 15th July 2020 that one is “in the process of being finalised”.
The Government made much of the serious risk to life and health, arguing that it had to act at speed, that the impact had in practice been assessed as the programme went along, and that the obligation was merely a “procedural” one, while maintaining that the scheme is nonetheless GDPR compliant. The reality is that there was no overarching DPIA to consider and ensure GDPR compliance, and the DPIAs relating to separate parts of the system have seemingly been an add-on, tick-box exercise.
Test and Trace App
The ill-fated NHSX App was hurriedly trialled, without a DPIA, on the Isle of Wight in early May 2020. There have been widespread concerns about the Government’s plan to introduce a stand-alone NHS app with centralised data storage, and worries that the data would then be used for other purposes, such as law enforcement, or passed to third parties (such as US healthcare organisations).
The Government has not been up front about the thinking behind this App or the purpose of centralising data, and its responses to challenges have been confused and confusing. For example, at one point it was claimed that the data would be pseudonymised, and at another that it would be anonymised (quite different things in data protection law). The plan, however, was seemingly to integrate the App into the wider manual Test and Trace programme; and despite Public Health England’s insistence that the two data sets would be kept separate, if the App is linked to and structurally integral to the Test and Trace programme, it is hard to see how any real degree of separation is possible.
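The distinction matters. As an illustrative sketch only (the function name, key handling and sample identifier below are hypothetical, not taken from the NHSX design), pseudonymisation typically replaces an identifier with a token that anyone holding the key can still link back to the person, whereas anonymisation irreversibly severs that link:

```python
import hashlib
import hmac
import secrets

# Hypothetical key held by the data controller. Whoever holds it can
# link pseudonymised records back to real people, which is why the
# GDPR still treats pseudonymised data as personal data.
SECRET_KEY = secrets.token_bytes(32)

def pseudonymise(nhs_number: str) -> str:
    """Replace an identifier with a keyed hash (a pseudonym)."""
    return hmac.new(SECRET_KEY, nhs_number.encode(), hashlib.sha256).hexdigest()

# The same person always maps to the same token, so two datasets that
# share the key (or the raw tokens) can be joined on it. That
# linkability is exactly why 'pseudonymised' App data and manual Test
# and Trace data are hard to keep genuinely separate. True
# anonymisation would require destroying the key and aggregating the
# data so that no individual record can ever be re-linked.
token = pseudonymise("9434765919")  # sample-format NHS number, not a real person
assert token == pseudonymise("9434765919")  # deterministic, hence linkable
assert token != "9434765919"                # but the raw identifier is hidden
```

The point of the sketch is the linkability: as long as a key (or the tokens themselves) is shared between systems, the data sets can be re-joined, whatever label is attached to them.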
There have also been admissions that the App itself contained flaws leaving it vulnerable to cyber-attack. Worryingly, these were acknowledged at the outset but said to be the result of an active decision to get the trial version operating without delay.
Thankfully the NHSX App seems to be undergoing a fundamental re-design, but the manual Test and Trace system has fared little better. Under that system, contact tracers gather the contacts of Covid-19 patients and trace those people by phone or email to try to slow the spread of the disease.
Test and Trace data protection breaches
Serco successfully bid for a contract at the heart of the Test and Trace system (along with a number of other private companies), but had not even trained its recruits before a potentially serious data breach arose: the email addresses of 300 recruits were put in the CC rather than the BCC box when emailing them all about training. It was stated that Serco would not be reporting itself to the ICO, merely giving reassurance that this would not happen again.
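The mechanics of this kind of breach are simple: addresses placed in the To or Cc headers are delivered to, and visible to, every recipient, whereas Bcc recipients are passed only in the delivery envelope. A minimal Python sketch (the addresses are invented) shows the difference:

```python
from email.message import EmailMessage

# Invented addresses, standing in for the 300 recruits.
recipients = ["recruit1@example.com", "recruit2@example.com", "recruit3@example.com"]

# The breach: every recipient of this message can read every address in Cc.
leaky = EmailMessage()
leaky["Subject"] = "Training"
leaky["Cc"] = ", ".join(recipients)

# The safe pattern: keep the recipients out of the visible headers
# entirely and supply them only as the SMTP envelope, e.g. with
# smtplib:  smtp.send_message(safe, to_addrs=recipients)
# (smtplib's send_message strips any Bcc header before transmission.)
safe = EmailMessage()
safe["Subject"] = "Training"

assert all(addr in leaky["Cc"] for addr in recipients)  # addresses disclosed
assert safe["Cc"] is None                               # nothing disclosed
```

A one-character slip between two adjacent fields is all it takes, which is precisely why training and checked processes matter before staff handle bulk personal data.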
More recently, it has been reported that some contact tracers have been sharing details of Covid-19 patients on Facebook and WhatsApp, in the context of getting colleagues to help them do their jobs. This is a clear data breach. The risk that up to 21,000 inadequately trained contact tracers, working under pressure to meet targets, will inadvertently share data is immense.
The impact of data breaches
It is easy to think that in the scheme of things, minor data protection breaches (email addresses or personal details being inadvertently shared with others) are a small price to pay when compared to the risks of the virus spreading exponentially. But as a lawyer specialising in privacy law, I see with painful regularity the life-changing and devastating impact that a ‘minor’ privacy breach can have. Below are just a few examples:
- The erroneous dot added to a Skype username by a police officer working under pressure (and typing freehand rather than cutting and pasting), leading to an innocent man being arrested and investigated for online paedophile offences, with devastating and lasting consequences for him and his family;
- A hospital administrator giving out, rather than eliciting, the address of a child accompanied by their father, thereby giving him the address of the mother and ex-partner, a domestic violence victim who had spent years getting away from her abuser (who is not permitted to know her address) and who must now seek alternative housing, living in fear in the meantime in the knowledge that she is no longer safe and that he could show up at any minute;
- A well-meaning nurse asking a hospitalised patient, in the company of visitors whom he had chosen not to tell about his HIV-positive status, if he has “taken his HIV meds”;
- A nurse emailing an individual about his mental health diagnosis and medication, inadvertently clicking “reply all” to an unrelated message which copied in a number of others, thereby widely publicising a condition he had chosen not to share;
- A police officer confusing the addresses of a suspect and a sexual assault victim, sending the suspect her documents, complete with her name, address and other personal information.
In each case the breach itself is ‘minor’ and technical, accidental, absent-minded and often the result of working under pressure and not for one minute considering or intending the consequences.
For the victims, however, the breach is life-changing. Even once deleted, data like this cannot be un-known. And the sense of violation, indignity, humiliation and helplessness, quite apart from the massive practical, personal and relational impact on their lives, cannot be overstated.
On rare occasions victims get a proper, meaningful apology; but all too often it is mealy-mouthed and insincere, and the data processor then seeks to downplay the impact on the victim in an attempt to minimise compensation. For the victim, even significant amounts of compensation (and damages payments for privacy breaches are usually pretty modest) cannot really begin to put them back in the position they were in before.
The scope for privacy breaches through Test and Trace is vast and terrifying. We are being asked to share our personal data with an increasing number of people and organisations in an increasing number of contexts. For example, pubs and restaurants are now required to keep a register of all visitors for 21 days to support contact tracing, with real risks that under-resourced businesses, which have no experience of handling data in this way and may lack the infrastructure or training, inadvertently share that data or save it on platforms vulnerable to attack.
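Even a simple visitor register needs basic safeguards designed in from the start, such as automatic deletion once the retention period has passed. A minimal sketch of a 21-day retention rule (the class and field names are hypothetical, not any official specification):

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=21)  # retention period for the visitor register

class VisitorLog:
    """Hypothetical in-memory register: collect only what is needed,
    keep it only for as long as it is needed."""

    def __init__(self):
        self._entries = []  # list of (recorded_at, name, phone) tuples

    def record(self, name, phone, now=None):
        now = now or datetime.now(timezone.utc)
        self._entries.append((now, name, phone))

    def purge_expired(self, now=None):
        """Delete every entry older than the retention period."""
        now = now or datetime.now(timezone.utc)
        self._entries = [e for e in self._entries if now - e[0] <= RETENTION]

    def __len__(self):
        return len(self._entries)

log = VisitorLog()
start = datetime(2020, 7, 1, tzinfo=timezone.utc)
log.record("A Visitor", "07700 900000", now=start)                       # day 0
log.record("B Visitor", "07700 900001", now=start + timedelta(days=15))  # day 15
log.purge_expired(now=start + timedelta(days=22))  # day 22: first entry expires
assert len(log) == 1
```

The point is not the code itself but the discipline it embodies: a pub keeping names in a paper notebook, or a spreadsheet that nobody ever purges, has no equivalent of `purge_expired`, and the data simply accumulates.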
We want to feel we are part of the solution to slowing the spread of the pandemic and may be more inclined than at other times to openly share our data. But the risks are huge, particularly when we cannot be confident that the privacy implications and safeguards have been properly thought through.