Trust will be crucial to the success of contact tracing apps
This article first appeared in the Business Post on 1 May 2020.
We will be required to co-exist with COVID-19 for the foreseeable future. Sustaining the current lockdown presents significant economic and social challenges for society and government, and the search for a safe and sustainable path out of it becomes more urgent with each passing day.
The medical and scientific communities have been highlighting the key steps on that path for some time: mass public testing; rapid identification and isolation of infected individuals; and comprehensive tracing of those with whom infected individuals have been in contact.
Contact tracing as a concept
One of those steps, namely the development, roll-out and use of contact tracing technology, presents some real challenges. The tracing of those who may have been in contact with infected persons is crucial. Without real-time information on who has been exposed to, and who is at risk of spreading, the virus, we are travelling the path blindfolded and our lockdown exit may be short-lived.
Contact tracing, at its most effective, will include the use of technologies that are now almost ubiquitous to supplement an otherwise laborious process, the most controversial and widely discussed of these being the release of a contact tracing app.
Without getting into a detailed discussion on the technologies that could be employed, it is likely that the Irish version of such an app will use Bluetooth to detect devices that are in close contact with each other. It is a deceptively simple mechanism.
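In outline, that mechanism can be sketched in a few lines of code. The Python below is purely illustrative and assumes a decentralised design broadly similar to the Apple/Google proposal: each phone broadcasts short-lived, rotating identifiers derived from a secret key that never leaves the device unless its user is diagnosed and consents. The names (`Device`, `rolling_id`), key sizes and derivation function are invented for illustration and do not reflect any actual specification.

```python
import hashlib
import os

ROTATION_INTERVAL_MINUTES = 15  # illustrative: the broadcast identifier changes regularly


def rolling_id(key: bytes, interval_number: int) -> bytes:
    """Derive a short-lived identifier from the device's secret key.
    Nearby phones see only this value, which changes every interval and
    cannot be linked back to the device without the key itself."""
    return hashlib.sha256(key + interval_number.to_bytes(4, "big")).digest()[:16]


class Device:
    def __init__(self) -> None:
        # Random secret generated on-device; published only on diagnosis, with consent.
        self.key = os.urandom(16)
        self.heard: set[bytes] = set()  # identifiers observed from nearby devices

    def broadcast(self, interval_number: int) -> bytes:
        """The identifier this device advertises over Bluetooth right now."""
        return rolling_id(self.key, interval_number)

    def record(self, observed_id: bytes) -> None:
        """Store an identifier overheard from a nearby device."""
        self.heard.add(observed_id)

    def check_exposure(self, diagnosed_keys: list[bytes], intervals: range) -> bool:
        """When diagnosed users consent to publish their keys, every device
        re-derives the corresponding identifiers locally and checks its own
        contact log for a match. No contact log ever leaves the phone."""
        return any(
            rolling_id(key, n) in self.heard
            for key in diagnosed_keys
            for n in intervals
        )
```

For example, if two devices pass near each other, each records the other's current identifier; should one user later test positive and publish their key, the other's device detects the match entirely locally.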
Use and protection of personal data – a constant trade-off
In recent times, we have become accustomed to our smartphones knowing where we are, what we are doing and, often, with whom we are doing it. A very uneasy détente has been reached between our desire to maintain a degree of privacy over our lives and our willingness to offer up that privacy in return for information, facilities and services making our lives ever easier. We engage in constant trade-offs and data privacy laws sit between those two competing objectives, seeking to provide a bulwark against the exploitation of our data without inhibiting technological progress and commercial enterprise.
In this COVID-19 world, the trade-off is between an invasion of privacy in which data may be shared and analysed outside of the normal course, and the health (physical, mental and economic) of individuals and our nation as a whole.
Our current crisis, and the technology which is likely to be used as part of the response, brings a new dimension to this ongoing uneasy balance. Most people accept technology such as contact tracing apps as a tool that could significantly accelerate the next steps in moving towards a society living with, and after, COVID-19. However, many have also expressed very real concerns that such technology could introduce, legitimise and ultimately normalise a level of digital surveillance that most of us would, in normal circumstances, be unwilling to accept.
Developments to date
In Ireland, Nearform have been tasked with developing our version of a contact tracing app, and executives at the HSE are reportedly engaging with the European Commission, the Attorney General and the ODPC on how this app will be built and how it will operate. However, there is a lack of information on key matters relating to the Irish app, including what information will be collected and how, who will be responsible for data collected and, as basic as it might appear, the precise extent of tracing facilities that the app will offer.
Just as crucially, little information has been released on the development process itself. Privacy advocates in Ireland have called for the publication of the Data Protection Impact Assessment prior to the app being released, as well as the release of the relevant source code.
In Europe, bodies such as the European Commission and the EDPB have released guidance and toolboxes for use by Member States, discussed in further detail below. Coalitions of EU scientists and technologists are currently developing standards and protocols such as PEPP-PT to provide standardised approaches which may be employed by national governments. Apple and Google have, in an unprecedented step, come together to develop both an API for incorporation into software and a broader cross-platform system that would use wireless signals to inform people if they encounter someone who has, or is later diagnosed with, COVID-19.
Already, use of similar apps has generated debate and protest in other countries. In France, Germany and the United Kingdom, privacy advocates as well as the general public have resisted some or all elements of the proposed apps. In France, 45% of respondents to a survey by the Jean Jaurès Foundation said they would not be willing to download a contact tracing app, with commentators warning that it represented “a risk of sliding towards a form of digital tyranny”.
Meanwhile in Asia, countries such as China, Hong Kong, South Korea and Singapore have all introduced versions of this software, varying hugely in terms of the technologies employed and the range of surveillance involved. By way of example, in China, health QR codes are assigned to citizens and data may be shared with police, while in South Korea the app “checks in” twice daily with citizens and uses GPS to collect location data. By contrast, Singapore has taken a more privacy-focused approach, using temporary IDs assigned to app users and stored on their devices. These are only used if contact must be made with potentially exposed third parties, and that information may only be provided to authorities with the permission of the user in question.
The EU has, unsurprisingly, been quick to engage with these questions and with stakeholders. On 17 April, the European Commission issued a guidance note on apps “supporting the fight” against COVID-19 in relation to data protection. This guidance followed its recommendation on the development of a common “toolbox” for the EU on the use of technology in the fight against the virus. The toolbox, released by the eHealth Network (a platform for Member States’ competent authorities on digital health) focuses on two areas, one of which is the development of a pan-European approach for the use of apps, including essential requirements, interoperability, safeguards, governance and accessibility. This focus on governance and safeguards is to be welcomed.
A new angle on the question of surveillance
Employing technology to accelerate contact tracing, even in the context of a prevailing global pandemic, has provoked a debate on the ethics of digital surveillance. This debate has been ongoing in the European Union since before the introduction of the GDPR on 25 May 2018. The GDPR takes a rights-based and principles-based approach to the protection of personal data, mandating lawful and fair processing, purpose limitation, data minimisation and storage limitation as fundamentals of any processing activities on personal data. It also leads with umbrella principles such as transparency, proportionality and data protection by design and by default as key elements of systems, proposals and communications which involve or in any way relate to our personal data.
Questions remain about the collection and analysis of data by big business and governments, about technological advances allowing for greater, and more opaque, surveillance, and about the effectiveness and sufficiency of enforcement measures. Nonetheless, the GDPR's mere existence demonstrates that, as a society, we value privacy, and it offers some comfort to those of us alive to the risks and opportunities associated with sharing our data.
The data protection framework has been created within the EU to build technologies, and an environment in which they may operate, which upholds fundamental principles and protects personal freedoms. It may even be the case that successful creation and management of technologies such as contact tracing apps validates the GDPR and data privacy as a concept by showing how data use and privacy may co-exist to the benefit of all.
Data protection principles: the foundation for success
The protection of public health is already provided for, both in the GDPR and in the Irish national implementing legislation, the Data Protection Acts 1988-2018. For example, Recital 52 of the GDPR specifically refers to “the prevention or control of communicable diseases and other serious threats to health” as a reason for which Union or Member State law may derogate from the prohibition on processing “special category” personal data (encompassing health data), and Recital 112 cites contact tracing as a valid ground for relying on derogations from principles relating to the transfer of personal data outside of the EU. Even if contact tracing technology does not specifically involve the processing of “special category” health data, it has been acknowledged at European and national level that the Regulation envisaged and provided for the processing of data for public health purposes.
If government and key stakeholders apply fundamental data protection principles in building a digital contact tracing system, in communicating with the public regarding that system and in ongoing implementation and management, the integrity of both our data and our data governance framework can be ensured without compromising the effectiveness of that tracing system. Organisations such as data protection supervisory authorities, data privacy campaigners and bodies and, ultimately, citizens will be central in ensuring and demanding that those principles are respected and applied from the outset.
Strict adherence to a number of the GDPR's core principles will be vital to ensuring not only that the technology and its use remain within the legal confines of the Regulation, but that they are seen to do so, something which is key to securing public support and adoption.
Anonymisation of any personal data collected by contact tracing apps has been identified by many as being a fundamental component of the data protection framework. If successfully achieved, this would take the data outside of the scope of the GDPR entirely. However, without engaging in detailed technical analysis, it is notable that the ability to truly and completely anonymise data collected in this manner will always be questionable. Accordingly, other processing principles can, and should, be given the consideration and importance they require.
The requirement of transparency in the processing of personal data, a concept which envelops the GDPR and data privacy as a whole, should comprise the essence of the technology from concept to roll-out and beyond. Calls for the release of the app source code and publication of DPIAs in advance of the Irish app’s release have already been made. Such steps would lay the foundations for a subsequent data governance regime that retains transparency at its core, both in the design of the technology itself and the communications between data subjects and those responsible for the technology and the data processed. Citizens should be kept fully informed at all times as to data held, the status of that data and the processing being employed on it.
In that vein, effective accountability will also be a deciding factor in the technology's success. In order to engage with it, the Irish public will need to know to whom collected personal data is being provided and who is responsible for that data's security, for restricting its use to the app's clearly identified purposes and, ultimately, for its destruction. Those persons or bodies will, and should, be required to fulfil their obligations under law and to demonstrate that they have done so.
Data protection by design and default
Without respecting the doctrines of data protection by design and by default from the outset, this project will fail. Unlike in many Asian and other countries, it is not within the gift of European governments and agencies to introduce wide-ranging and intrusive technologies in a coercive (or even merely obligatory) manner. National governments and the EU will have one chance to implement this technology successfully; failure would set back the entire COVID-19 response across the continent. A privacy-first approach from the outset will ensure this technology does not falter before it begins.
Other fundamental processing principles
At a slightly more practical level, matters such as purpose limitation (restricting the processing of personal data to the purpose(s) for which it was originally collected), data retention (processing the personal data only for so long as it is required for the purpose(s) for which it was collected), and data minimisation (collecting only such personal data as is necessary for those purpose(s)) will also determine whether the technology will succeed or fail.
As a single example, one element of such apps that has generated discussion is where collected data is stored. A decentralised model, in which no single entity would hold the collected data, has received the endorsement of hundreds of academics over systems that would instead store data on centralised servers. PEPP-PT, mentioned above, says it supports both mechanisms, although many of those academics have questioned the privacy credentials of centralised servers. Germany has recently confirmed its support for decentralised information storage in its own technologies, and it appears Ireland will follow suit. This should be welcomed as a victory for a privacy-led approach in one element of the contact tracing technical framework.
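The practical difference between the two models comes down to what leaves the device. The sketch below is a deliberately simplified, hypothetical contrast: identifiers are represented as plain strings, and both functions are invented for illustration rather than drawn from any real system.

```python
def centralised_flow(device_contacts: set[str], server_db: dict, device_id: str) -> dict:
    """Centralised model: every device uploads its full contact log, so a
    single server ends up holding everyone's encounter data and performs
    the matching itself."""
    server_db[device_id] = sorted(device_contacts)
    return server_db  # one entity now holds the social graph


def decentralised_flow(diagnosed_keys: list[str], device_contacts: set[str]) -> bool:
    """Decentralised model: only the identifiers of users who test positive
    (and consent) are published; each device checks its own log locally and
    nothing else is uploaded."""
    return any(key in device_contacts for key in diagnosed_keys)
```

In the centralised case the server accumulates a map of who met whom; in the decentralised case the only shared data is the (consented) list of diagnosed identifiers, which is why academics have favoured the latter on privacy grounds.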
The meaning of “success”
Nonetheless, what is meant when we speak of technology succeeding or failing? In this context, success or failure means securing public trust at the level necessary for the app to be adopted by sufficient numbers of people to give the platform a chance of making a difference. Without sufficient support, the tracing mechanisms facilitated by that technology are of little use. It is, therefore, public buy-in rather than technical superiority or mass data gathering that is needed. Such buy-in will only be secured through comprehensive and obvious implementation of, and adherence to, these data protection fundamentals.
Recital 4 of the GDPR notes that “the processing of personal data should be designed to serve mankind” and that the protection of personal data is not an absolute right. This statement really resonates in this time of crisis, when personal data provides the key to the lockdown exit door. Nonetheless, it remains the case that a balancing of rights is not the same as an expropriation of rights. The statement that data processing should serve mankind is made at the outset of the foremost piece of legislative protection given to personal data globally. The statement exists within that context and so, therefore, should the technology it must facilitate.
The primary advantages of this technology are as clear as they are significant: getting populations out of lockdown; lowering risks for individuals and those around them; and saving lives. However, the only way to ensure the success of contact tracing technologies within Member States, including Ireland, is to ensure that European citizens are asked to place their trust in a technological system which is transparent, principles-driven, secure and effective. As importantly, it must be demonstrably so: in an increasingly suspicious and isolated world, we can no longer expect public acceptance as a matter of course. As has been shown in countries such as New Zealand, Germany and Finland, it is communication, clarity and compassion which lead to more successful outcomes in terms of public trust, widespread adherence to requests for compliance and, ultimately, the upper hand in the battle against the virus.
This is a question of leadership. Europe has been, and continues to be, a leader in promoting and protecting data privacy as a 21st century right. You and I, as European citizens, are now being called upon to exercise this privacy leadership as a central part of the effective use of technology in our fight against COVID-19 across the Union.