
Can the universal right to personal data privacy be upheld through the COVID-19 pandemic? - Part 2

This post is a direct continuation of the commentary from 'Can the universal right to personal data privacy be upheld through the COVID-19 pandemic? - Part 1'. The commentary was originally written for a course at Yale-NUS College on human rights.


While HIPAA and CCPA offer a compelling defense for prioritising healthcare over privacy laws, they also highlight loopholes around data ethics, potential breaches and misuse. Macdonald offers: “We often forget, during times of great upheaval, how many of our quality assurances and basic rights protections are embedded in public institutional review” (Macdonald, “The Digital Response to the Outbreak of COVID-19”). In short, the argument of privacy versus healthcare isn’t as black and white as it seems. Thus we reach the second, more nuanced, aspect of the argument: governments and public health institutions must account for privacy and ethical concerns by restricting data use to healthcare purposes, creating robust mechanisms against data breaches and misuse, and customising apps to country needs. This broadly summarizes the viewpoint adopted by several NGOs and IGOs, such as Amnesty International and the OECD.

In 2020, Amnesty International’s Security Lab tested 11 contact tracing apps across the Middle East, North Africa and Europe (Amnesty, “Bahrain, Kuwait and Norway contact tracing apps among most dangerous for privacy”). It broadly categorized the apps into (i) those enabling users to track and monitor symptoms without any contact tracing (Lebanon, Vietnam); (ii) those using decentralized means of contact tracing, such as collecting Bluetooth data and storing it on the device itself (Austria, Germany, Switzerland etc.); and, most serious, (iii) contact tracing apps which report GPS or Bluetooth location and medical data to a centralised government server (ibid). In addition, Bahrain’s app was linked to a game show which would award prizes if the app could verify that random users were at home, while Qatar’s compulsory EHTERAZ app exposed the sensitive data of millions of citizens in May 2020 (ibid). Amnesty International emphasized that while apps are critical to pandemic control, their usage must be moderated (ibid).
Apps ought to be designed to require only the bare minimum of data, store data securely, and prevent data leakage to third parties for usage outside COVID-19 purposes (e.g. immigration or crime enforcement). The OECD sets out a similar framework for digital tools used to curb COVID-19, including those using biometrics, face recognition, geolocation and medical data (OECD, “Ensuring data privacy as we battle COVID-19”). It recommends that governments adopt privacy-by-design, limit the storage time of data, and maintain transparency about data use (ibid). Similar frameworks have been suggested by other organisations in light of COVID-19. Thus the guidelines provided by NGOs and IGOs offer a more nuanced view of the issue: states don’t just need contact tracing measures for good pandemic management. They need effective digital tools which address appropriate data usage and guard against security risks like breaches and hacks.
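To make these recommendations concrete, here is a minimal sketch of what a decentralized, privacy-by-design contact log could look like on-device. This is purely illustrative and not based on any real app's code; the class and constant names are invented for this example. It captures the three principles above: random tokens instead of identities, local-only storage with no GPS, and automatic deletion after a fixed retention window.

```python
import os
import time

# Retention window mirroring the short storage periods recommended above
# (e.g. TraceTogether's 21-day limit); a design parameter, not a standard.
RETENTION_SECONDS = 21 * 24 * 3600

class ContactLog:
    """Illustrative on-device store for a decentralized tracing app:
    keeps only random tokens and timestamps, never identity or GPS."""

    def __init__(self):
        self.entries = []  # list of (token, seen_at) pairs, kept locally only

    def new_broadcast_token(self):
        # A fresh random token per broadcast interval; nothing links it
        # to the user's identity without a separate, consented step.
        return os.urandom(16).hex()

    def record_contact(self, token, seen_at=None):
        # Data minimisation: store just the observed token and a timestamp.
        self.entries.append((token, seen_at if seen_at is not None else time.time()))

    def purge_expired(self, now=None):
        # Storage limitation: delete anything older than the retention window.
        now = now if now is not None else time.time()
        self.entries = [(t, s) for (t, s) in self.entries if now - s < RETENTION_SECONDS]
```

The point of the sketch is that the privacy properties are structural: there is simply no field for a name, ID number, or location, so a breach of this store leaks far less than a centralised GPS database would.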


To explain how governments and public health institutions can account for data privacy and ethics concerns, we look at three illustrative examples of apps across Singapore, Norway and Qatar. First, Singapore’s TraceTogether app. Created by Singapore’s Government Technology Agency (GovTech) and Ministry of Health (MOH), TraceTogether collects temporary Bluetooth location data to inform close contacts of people who test positive for COVID-19, based on the duration and proximity of their contact with the patient (Chong and Velpula, “Data Protection and Privacy in COVID-19 times”). The app had a high adoption rate of over 80%, and was internationally lauded for features like anonymization and encryption of data, and the temporary nature of data storage (up to 21 days only) (ibid; Reuters, “Singapore COVID-19 contact tracing data accessible to police”). Organisations using the app’s data are subject to Singapore’s baseline data privacy legislation, the PDPA (ibid). TraceTogether was thus perfectly in line with Amnesty International and the OECD’s recommendations of short storage time and privacy-by-design, and an effective resolution of the conflict between the need for effective healthcare to curb COVID-19 and citizens’ data privacy. In January 2021, however, it was revealed that the police could also access this data for non-COVID purposes (Reuters, “Singapore COVID-19 contact tracing data accessible to police”). This reignited the debate between privacy and pandemic care. What ensued was a lengthy parliamentary session, with the outcome that TraceTogether data can only be used for seven serious crimes (ibid). While this was accepted by society, it is critical to acknowledge the importance of transparency about how medical, biometric and location data is used, and for what purposes. If this is not recognised, it could disincentivize citizens from downloading an app in the first place, thereby undermining efforts to control the spread of the pandemic.


Yet another illustrative example is Norway’s Smittestopp app, which was used by roughly 10% of the country’s population before being suspended on June 16, 2020 (Lomas, “Norway Pulls its Coronavirus Contact Tracing App After Privacy Watchdogs Warning”). Similar to Singapore, Norway approached contact tracing via a centralised method which uploaded data to a single system operated by health authorities (ibid). From a legal perspective, this was in line with Europe’s General Data Protection Regulation (GDPR), which allows personal data to be collected in urgent health crises (ibid). Unlike other apps, however, the key criticism of Smittestopp was its constant collection of real-time GPS location data rather than mere Bluetooth proximity data (ibid). This went clearly against the European Data Protection Board’s guidelines, which favoured less invasive proximity data over constant location tracking (ibid). It was also linked to a host of smaller issues, including difficulties in anonymising location data, compelling users to share data for multiple purposes (not just COVID-19), and broader user privacy concerns. Last but not least, the Data Protection Agency also raised doubts over the accuracy of the data itself, given the low infection numbers in Norway at the time (ibid). The same set of concerns was echoed by the aforementioned Amnesty International survey. While some of these issues overlap with other apps, Norway’s app had to adhere to a completely different set of standards under the European GDPR. Thus, we acknowledge that different apps have to be customized to each population’s specific necessities and specific data laws.
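The privacy gap between proximity data and raw GPS trails, which underpinned the criticism of Smittestopp, can be shown with a toy sketch. The function and data below are invented purely for illustration and describe no real attack or app: even a short, nominally anonymous GPS trail lets whoever holds it guess a user's likely home by looking at where the phone sits at night, an inference that ephemeral Bluetooth proximity tokens simply do not permit.

```python
from collections import Counter

def infer_home(trail, night_hours=range(0, 6)):
    """Toy illustration: guess a likely home location from a raw GPS trail
    by taking the most common coarse grid cell seen during night hours.
    trail: iterable of (lat, lon, hour_of_day) tuples."""
    cells = Counter(
        (round(lat, 3), round(lon, 3))  # ~100 m grid cell
        for lat, lon, hour in trail
        if hour in night_hours
    )
    return cells.most_common(1)[0][0] if cells else None

# A fictitious trail: two night-time points at one spot, one daytime point elsewhere.
trail = [(1.3521, 103.8198, 2), (1.3521, 103.8198, 3), (1.2800, 103.8500, 14)]
```

A dozen lines suffice to turn "anonymous" location data into a sensitive personal attribute, which is why the European Data Protection Board's preference for proximity data over location tracking matters in practice.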


A final case is that of Qatar’s EHTERAZ app. From its initial stages, the Qatari application showed some prominent differences from other contact tracing apps, such as (i) being compulsory for all citizens and residents in Qatar, (ii) requiring a phone number, a PIN and access to the mobile’s files in order to operate, and (iii) constant use of Bluetooth and GPS for location tracking (Al Jazeera, “Qatar makes COVID-19 app mandatory, experts question efficiency”). Several sources noted these differences, including MIT, which debated the efficacy of making the app permanent, and Privacy International, which questioned the need for phone numbers and PINs (ibid). Eva Blum-Dumontet, senior researcher at Privacy International, stated: “ID numbers are often almost like biometric data, and you can’t change them easily. So if it’s out there after a leak or a hack, for example, consequences are a lot bigger” (ibid). And in a rather ironic twist, this is precisely what happened. Amnesty International discovered that the Qatari app had exposed the medical information of over a million residents due to a security flaw (Amnesty, “Bahrain, Kuwait and Norway contact tracing apps among most dangerous for privacy”). While the flaw was fixed within the day, it was still a data breach. And so we repeat the question: what implications does this have for the debate between data privacy and healthcare rights? The answer is twofold. Most prominently, the data breach highlights the need for robust technical mechanisms and legal frameworks against data hacks, breaches and leaks. Sensitive medical, personal and geolocation data is always susceptible to threats from hackers; it is then up to the government engineers or agencies creating these apps to incorporate strong security measures to protect citizens. Lawmakers also ought to prepare for the legal ramifications of such situations.
While this hasn’t been explicitly mentioned in the recommendations by Amnesty International or the OECD, it remains a valid proposition given the rising need for such apps. Secondly, there is a need to trim data collection down to the bare minimum necessary for pandemic management. While these three countries are by no means an exhaustive investigation into data privacy during COVID-19, they do provide insight into how each country tackles the conflict between privacy and health. And so, across the illustrative examples, we note the different aspects of data privacy and ethics that government institutions must keep in mind.


Despite this exploration of society’s rights to effective healthcare versus digital privacy, we hit a conundrum in drawing a conclusion. Across all the resources, Macdonald probably captures this dilemma best: “One of the major risks — both of the platform economy and of the growingly private dimension of disaster response — is that we lose the ability to know whether decisions were ‘good’” (Macdonald, “The Digital Response to the Outbreak of COVID-19”). Ultimately, every country’s efforts right now are a compromise between rights, making it difficult to deem any of them “good” or “correct” against any universal standard. In the long term, this investigation has broader implications for human rights literature as a whole. It highlights the need for a universal set of fourth-generation digital rights, which would make the UDHR more inclusive of all kinds of privacy and rights violations. Such universal standards are critical given the dynamic power of technology today. Without them, it is impossible to determine whether the right to personal data privacy can be upheld during a pandemic. Nonetheless, in the interim, this exploration strives to illustrate the best a country can do right now: reinforce contact tracing efforts via restricted data use, safeguards against data breaches and misuse, and apps customised to country needs.

 
 
 



© 2023 by Train of Thoughts. Proudly created with Wix.com
