When a Tesla vehicle operating in Autopilot mode injured a pedestrian in Melbourne, it raised the question: who is responsible when autonomous vehicles crash?
Self-driving cars are rapidly gaining adoption in Australia, but the law has not yet caught up with the technology. Modern negligence case law has established that a manufacturer is responsible for its product. This means that manufacturers, and even developers, of self-driving cars should at least implement testing, management, and monitoring practices to ensure safety and accountability. This, however, raises the question of how much testing and risk management is enough. It is almost impossible to test every driving scenario, especially as vehicles are subject to different driving standards around the world. Courts will therefore need to establish precedent on standards of care and responsibility for risk.
Once standards of care and responsibility for risk are identified, there must be a clear method of enforcement. Aaron J Snoswell, a post-doctoral research fellow in computational law and artificial intelligence accountability, suggests one approach: empowering a regulatory body to impose penalties for accidents caused by self-driving vehicles. Victims of accidents must also be able to sue the manufacturers of self-driving vehicles. However, it is difficult for a court to understand the intricacies of the artificial intelligence used in self-driving cars, and manufacturers may be reluctant to disclose such information for commercial reasons. Moreover, the decision-making of the artificial intelligence itself may be opaque: Tesla's Autopilot, for instance, is built on 'deep neural networks' whose decisions even its developers cannot fully explain.
Going forward, close collaboration among courts, regulators, manufacturers, and industry experts is likely to be required. Australia may then need to amend existing legislation or introduce new laws to regulate self-driving vehicles and ensure accountability when accidents occur.