COVID-19 must not normalise digital surveillance

By David Paris | 15 Oct 20

After a huge campaign to encourage every smartphone user in Australia to install and use the ‘COVIDSafe’ app, the effectiveness of the app has been negligible. But the impact of the app is much more significant, writes David Paris from Digital Rights Watch.

After a huge campaign to encourage every smartphone user in Australia to install and use the ‘COVIDSafe’ app, the effectiveness of the app has been negligible. The ‘sunscreen’ we needed to stay safe has done almost nothing. The number of at-risk people located through the app who were not previously found via traditional contact tracing methods is in the single digits in every state.

But the impact of the app is much more significant. The processes that led to its introduction, the changes the government was compelled to implement and the permanent change in the way future governments may approach electronic tracing will long outlive the virus itself.

Digital Rights Watch partnered with law firm Wotton + Kearney to further explore the human rights implications of the COVIDSafe app. The full brief authored by Associate Johanna Lawlor and Pro Bono Consultant Leanne Ho can be found here. The processes around the introduction, launch and use of the app have exposed a number of issues with the frameworks that safeguard human rights in Australia.

The limited effectiveness of the app is further reduced on older hardware. Older phones and operating systems are more likely to be owned by people on lower incomes and by the elderly, and many more people do not have a smartphone at all. Any such mechanism in future must not be wholly reliant on a technological solution. Doing so risks entirely excluding our society’s most vulnerable people from any protections and benefits.

The Commonwealth Government was on the back foot over privacy and other rights concerns from the moment it first announced the COVIDSafe app. It is not the first time the government has been criticised over privacy. Public trust in government management of personal information suffered through the census debacle, Centrelink’s appalling ‘robodebt’ scheme and a series of bills granting more surveillance powers.

The government took extra steps to reassure the public. However, it wasn’t until the legislation governing the app, the Privacy Amendment (Public Health Contact Information) Act 2020, became law some two months after launch that groups including the Australian Human Rights Commission and the Law Council of Australia stated that many of the privacy concerns had been addressed. Even then they warned of the need to do more.

The government also initially stated that the app’s source code would be opened to review, then walked this back, only to change its mind again. It wasn’t until two weeks after the app launched that the code was made available, with little supporting documentation. There was no audit trail of changes, and no way to suggest changes or to notify the developers about vulnerabilities.

The second half of the system, the COVIDSafe National Information Storage System, remained off limits. This left uncertainty around what happens to the information once it reaches the government. The government was at pains to state that this data was being collected for one purpose only, yet the whole data custody chain is not available for audit. Transparency and accountability were applied to what data is collected, but how that data is used remains opaque.

A more rights-focused decentralised app design was possible from the outset. Apple and Google collaborated on a framework that was both more reliable and offered greater privacy protection through this approach. Other governments have abandoned their plans for an Australian-style centralised design and adopted the Apple/Google approach. The Australian government ruled this out.
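To illustrate the difference, here is a minimal, simplified sketch (in Python) of the decentralised model in principle: phones broadcast short-lived random identifiers and all matching happens on the device, so a central server never learns who met whom. This is only an illustration of the general approach, not the actual Apple/Google Exposure Notification protocol, and every name in it is invented for the example.

```python
import os

# Illustrative sketch of a decentralised contact-tracing model (not the
# actual Apple/Google Exposure Notification protocol): each phone
# broadcasts short-lived random identifiers and keeps matching local.

def new_ephemeral_id() -> bytes:
    """Generate a rotating, random identifier to broadcast over Bluetooth."""
    return os.urandom(16)

class Device:
    def __init__(self):
        self.own_ids = []       # identifiers this device has broadcast
        self.heard_ids = set()  # identifiers heard from nearby devices

    def broadcast(self) -> bytes:
        eid = new_ephemeral_id()
        self.own_ids.append(eid)
        return eid

    def observe(self, eid: bytes) -> None:
        self.heard_ids.add(eid)

    def check_exposure(self, published_ids) -> bool:
        # Matching happens on the device: the server only relays
        # identifiers voluntarily published by people who test positive.
        return any(eid in self.heard_ids for eid in published_ids)

# Usage: Alice and Bob exchange identifiers; Alice later tests positive
# and publishes her identifiers; Bob's phone detects the match locally.
alice, bob = Device(), Device()
bob.observe(alice.broadcast())
print(bob.check_exposure(alice.own_ids))  # True
```

In the centralised model Australia chose, by contrast, the encounter records themselves are uploaded to and matched on a government-controlled server, which is why the handling of that server becomes a rights question in its own right.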

Even with the protections the government introduced in the Biosecurity Determination 2020 and later through the COVIDSafe legislation, some basic good practice was still ignored. A number of questions were not sufficiently addressed in the government’s privacy policy, including disclosures around what is recorded on a user’s device, how long it is retained, what data is sent to government servers, who controls those servers and who can access the data, what laws it is subject to, when the app will shut down, when data collection will cease and when data will be destroyed.

Government officials stated that the app only collects data about other app users who come within 1.5 metres for at least 15 minutes. The design of the app and the legislation governing it both contradict this: the Privacy Impact Assessment revealed that the app collects, and with the user’s consent uploads to government servers, data about every other user who came within Bluetooth range in the preceding 21 days. The COVIDSafe bill does not address this and several other ambiguities.
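The gap between the public claim and the assessment’s findings can be made concrete with a rough sketch. The field names and thresholds below are assumptions for illustration only, not the app’s actual schema; the point is that what is stored and uploaded covers every encounter from the last 21 days, while the “1.5 metres for 15 minutes” rule is a narrower filter that, at best, is applied only after the data reaches the central server.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical illustration of the difference between the stated policy
# (contacts within 1.5 m for 15+ minutes) and what the Privacy Impact
# Assessment describes (every encounter within Bluetooth range over the
# previous 21 days). Field names are invented for this example.

@dataclass
class Encounter:
    other_user_ref: str       # encrypted reference to the other user
    timestamp: datetime
    duration_minutes: float
    signal_strength_dbm: int  # rough proxy for distance over Bluetooth

def stored_on_device(encounters, now):
    """What the app keeps locally: everything heard in the last 21 days."""
    cutoff = now - timedelta(days=21)
    return [e for e in encounters if e.timestamp >= cutoff]

def matches_public_claim(e: Encounter) -> bool:
    """The narrower 'close contact' rule cited by officials."""
    # Signal strength is only a crude stand-in for 1.5 metres.
    return e.duration_minutes >= 15 and e.signal_strength_dbm > -60

now = datetime(2020, 6, 1, 12, 0)
log = [
    Encounter("user-a", now - timedelta(days=2), 30, -50),   # long, close contact
    Encounter("user-b", now - timedelta(days=5), 2, -85),    # brief, distant contact
]
print(len(stored_on_device(log, now)))            # 2: both are kept and uploadable
print(sum(matches_public_claim(e) for e in log))  # 1: only one meets the stated rule
```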

The centralised data management model the government deployed also requires users to register their basic information, and data exchanges are handled by a server in Australia provided by Amazon Web Services (AWS). AWS infrastructure is subject to US laws governing data, as well as the US surveillance regime. This means that data from the app could be accessed by US law enforcement, regardless of protections here. Information-sharing arrangements between US and Australian agencies could see that data eventually end up in the hands of local law enforcement as well.

The Australian government’s response to this concern has been inconsistent. The Deputy Secretary of the Attorney-General’s Department, Sarah Chidgey, responding to questions at Senate hearings, could not give a “complete guarantee” that the data would not be accessed by US law enforcement. Prime Minister Scott Morrison said that any information collected would only be used by state and territory health agencies for COVID-19-related purposes. Attorney-General Christian Porter stated that police would be barred from accessing metadata from the app.

There is language within the legislation governing the app specifically intended to override some of the extraordinary surveillance powers available to government agencies. These include metadata retention and the Assistance and Access Act 2018, which grants agencies the power to compel technology providers to make changes to their platforms to allow for surveillance. The new law governing the use of app data “cancels the effect” of any other law that might enable such access.

However, the Privacy Act 1988, which protects privacy rights at the federal level, does not regulate state government agencies. Further, protections for the data against a court order or warrant are not explicitly laid out in the legislation. Activities that federal agencies are specifically barred from undertaking could still be carried out by state agencies.

A tool that records when, and for how long, people of interest come into contact with one another is undeniably compelling for security agencies. Former Communications Minister Stephen Conroy said that, based on security briefings he received while in government, he is not confident the app would not be compromised in this way. Deputy Chief Medical Officer Nick Coatsworth revealed that the government had to refuse multiple requests from law enforcement agencies for “added capabilities” to be included in the system.

If a breach does occur, the remedies available to affected people are limited and costly. The app legislation emphasises criminal prosecutions over civil proceedings, and there is no avenue of redress if a government is responsible for misuse of personal information.

Privacy concerns raised in the lead-up to the release of the app quietened as the curve flattened and the need for such tracking and tracing technology became less urgent. However, those concerns have served as a case study for evaluating the frameworks that currently exist in Australia for the protection of human rights. The need to strengthen them, to avoid the kinds of risks posed by potential misuse of the app’s technology and data collection, is clear. It is critical that these risks are addressed before the next pandemic or crisis leads to renewed calls for such measures.

Inconsistent attempts to reassure the public and protect privacy illustrate the clear need for a legislated Bill or Charter of Rights that includes protections for privacy. The Wotton + Kearney brief shows how exposed we are to breaches of human rights without a clear standard that governments can be held to, whatever their objective. Efforts overseas demonstrate that technological solutions can contribute to a widespread and effective pandemic response while still affording individuals their privacy.