Human Technology Institute releases a model law for facial recognition in Australia

Following the alleged misuse of facial recognition technology by local retailers this year, the Human Technology Institute has released a report proposing a model law for facial recognition.

The report defines facial recognition technology as “any computer system or device with embedded functionality that uses data drawn from human faces to verify an individual’s identity, identify an individual and/or analyse characteristics about an individual.” This type of technology is widely used in biometric devices, smartphones, computers, and surveillance systems. Despite this widespread use, there is a gap in Australia’s legislative regime: there are presently no legal guardrails to ensure that the use of facial recognition technology does not encroach on basic human rights. The report argues that reform, in the form of a new model law, is needed to enable the responsible use of facial recognition technology.

The model law for facial recognition technology places the onus on anyone who develops or deploys facial recognition technology to assess the level of human rights risk associated with the technology’s application. Developers and deployers are therefore tasked with assessing risk from the outset, particularly considering:

  • how the facial recognition technology application functions
  • where and how it is deployed (e.g. the spatial context)
  • the performance or accuracy of the application and the effect of any decisions made in reliance on the application’s outputs, and
  • whether affected individuals can provide free and informed consent to such use.

The authors of the report also propose a ‘Facial Recognition Impact Assessment’. This assessment would require developers and deployers to assign a risk rating to their technology, which could then be challenged by regulators or members of the public. The model law also sets procedural and substantive requirements. For example, a procedural requirement is that a Facial Recognition Impact Assessment be registered with the relevant regulator and made publicly accessible, while a substantive requirement extends the model law’s application to existing privacy law obligations.

Finally, the report calls on Australia’s Federal Attorney-General to lead the reform by:

  1. introducing a new Bill into Parliament based on the model law
  2. assigning responsibility to the Office of the Australian Information Commissioner (the most suitable candidate if no new regulator is appointed) to oversee the regulation of facial recognition technology
  3. initiating a state and territory process to harmonise the law across Australia, and
  4. establishing a government task force on facial recognition technology, with a particular focus on legal and ethical standards and international alignment.

Additionally, CHOICE Australia is running a public petition seeking support for dedicated regulation of facial recognition technology.

"Stellar Results Through Technology Contract Negotiations"

Are you putting your business at risk with lawyers who don’t understand Technology Contracts?

free book