Australia’s two most populous states are trialling facial recognition software that lets police check that people are at home during COVID-19 quarantine, extending trials that have already drawn criticism to the vast majority of the country’s population.

Little-known tech firm Genvis said on a website for its software that New South Wales (NSW) and Victoria, home to Sydney, Melbourne and more than half of Australia’s 25 million people, were trialling its facial recognition products. Genvis said the trials were being conducted on a voluntary basis.

The Perth, Western Australia-based startup developed the software in 2020 with WA state police to help enforce pandemic movement restrictions, and has said it hopes to sell its services abroad.

South Australia state began trialling a similar, non-Genvis technology last month, sparking warnings from privacy advocates around the world about potential surveillance overreach. The involvement of New South Wales and Victoria, which have not disclosed that they are trialling facial recognition technology, may amplify those concerns.

NSW Premier Gladys Berejiklian said in an email the state was “close to piloting some home quarantine options for returning Australians”, without directly responding to questions about Genvis facial recognition software. Police in NSW referred questions to the state premier.

Victoria Police referred questions to the Victorian Health department, which did not respond to requests for comment.

Under the system being trialled, people respond to random check-in requests by taking a ‘selfie’ at their designated home quarantine address. If the software, which also collects location data, does not verify the image against a “facial signature”, police may follow up with a visit to the location to confirm the person’s whereabouts.
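To make the described process concrete, the sketch below outlines a generic check-in verification flow of this kind. It is illustrative only: the function names, similarity threshold, distance radius and data structures are assumptions for the example and do not reflect Genvis’s actual implementation.

# Illustrative sketch of a quarantine check-in verification flow (hypothetical,
# not Genvis's actual software). A failed check would trigger manual follow-up,
# such as a police visit, as described in the article.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class CheckIn:
    selfie_embedding: list[float]    # face embedding computed from the selfie (assumed)
    latitude: float
    longitude: float

@dataclass
class QuarantineRecord:
    enrolled_embedding: list[float]  # the enrolled "facial signature"
    home_latitude: float
    home_longitude: float

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Compare two face embeddings; values near 1.0 indicate a likely match.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def distance_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    # Haversine distance between two coordinates, in kilometres.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def verify_check_in(check_in: CheckIn, record: QuarantineRecord,
                    face_threshold: float = 0.8, radius_km: float = 0.2) -> bool:
    """Return True only if both the face match and the location check pass."""
    face_ok = cosine_similarity(check_in.selfie_embedding,
                                record.enrolled_embedding) >= face_threshold
    near_home = distance_km(check_in.latitude, check_in.longitude,
                            record.home_latitude, record.home_longitude) <= radius_km
    return face_ok and near_home

The thresholds here are arbitrary placeholders; a real system would tune them, and a False result would simply queue the case for human follow-up rather than serve as proof of a breach.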

Though the technology has been used in WA since last November, it has more recently been pitched as a tool to enable the country to reopen its borders, ending a system in place since the start of the pandemic that requires international arrivals to spend two weeks in hotel quarantine under police guard.

Beyond the pandemic, police forces have expressed interest in using facial recognition software, prompting a backlash from rights groups over its potential to disproportionately target minority groups.

While facial recognition technology has been used in countries such as China, no other democracy has been reported to be considering its use for coronavirus containment.

‘Keep communities safe’

Genvis Chief Executive Kirstin Butcher declined to comment on the trials, beyond the disclosures on the product website.

“You can’t have home quarantine without compliance checks, if you’re looking to keep communities safe,” she said in a phone interview.

“You can’t perform physical compliance checks at the scale needed to support (social and economic) re-opening plans so technology has to be used.”

But rights advocates warned the technology may be inaccurate and may open the door for law enforcement agencies to use people’s data for other purposes, in the absence of specific laws stopping them.

“I’m troubled not just by the use here but by the fact this is an example of the creeping use of this sort of technology in our lives,” said Toby Walsh, a professor of artificial intelligence at the University of NSW.

Walsh questioned the reliability of facial recognition technology in general, which he said could be hacked to give false location reports.

“Even if it works here … then it validates the idea that facial recognition is a good thing,” he said. “Where does it end?”

The government of Western Australia has said it banned police from using data collected by COVID-related software for non-COVID matters. The WA police say they have put 97,000 people through home quarantine, using facial recognition, without incident.

“The law should prevent a system for monitoring quarantine being used for other purposes,” said Edward Santow, a former Australian Human Rights Commissioner who now leads an artificial intelligence ethics project at University of Technology, Sydney.

“Facial recognition technology might seem like a convenient way to monitor people in quarantine but … if something goes wrong with this technology, the risk of harm is high.”

© Thomson Reuters 2021

