Uber autonomous vehicle death raises questions for UK law review
Uber suspended testing of autonomous vehicles in North America after one of its cars struck and killed a pedestrian this week, and the tragic accident raises big questions in the light of the UK’s ongoing legal review of self-driving cars
The suspension of autonomous vehicle tests in the US and Canada by ride-hailing firm Uber, after one of its cars hit and killed a pedestrian in Tempe, Arizona, has again raised important questions over safety and culpability relating to autonomous vehicles, particularly in the context of an ongoing review of the law in the UK.
While a full investigation into the fatal incident is ongoing, the known facts are as follows. According to local news, the victim, who was named as Elaine Herzberg, was wheeling a bicycle across a street away from a pedestrian crossing at about 10pm local time on Sunday 18 March when she was struck by an Uber Volvo SUV operating in autonomous mode and travelling at about 38 miles per hour, well under the speed limit for that road.
The vehicle’s operator, named as Rafaela Vasquez, was understood to be unharmed, and no other passengers were in the vehicle at the time.
In a brief statement on Twitter, Uber said: “Our hearts go out to the victim’s family. We’re fully cooperating with Tempe Police and local authorities as they investigate this incident.”
The organisation had not issued any further comment at the time of writing, although CEO Dara Khosrowshahi also acknowledged the death and committed to working with the authorities to find out what had happened.
Tempe mayor Mark Mitchell said: “The City of Tempe has been supportive of autonomous vehicle testing because of the innovation and promise the technology may offer in many areas, including transportation options for disabled residents and seniors.
“Testing must occur safely. All indications we have had in the past show that traffic laws are being obeyed by the companies testing here. Our city leadership and Tempe Police will pursue any and all answers to what happened in order to ensure safety moving forward.
“I support the step that Uber has taken to temporarily suspend testing in Tempe until this event is fully examined and understood.”
Read more about autonomous vehicles
- Gatwick Airport plans to trial driverless technology to move staff around the airfield, in the hope of creating an Uber-like service to cut costs and emissions.
- Thanks to the miniaturisation of 3D cameras and increasingly powerful AI software, says Sensible Vision’s George Brostoff, interaction between people and their cars will never be the same.
- Dubai’s transport authority has enlisted Here Technologies to build a high-definition map of the city to support its goals around self-driving cars.
A Tempe Police spokesperson said preliminary investigations had suggested it was highly unlikely that Uber was at fault.
However, this assessment was disputed by Dmitry Bagrov, managing director of technology consultancy DataArt UK, who said the reasons for failure were likely to be almost identical to what might be seen with a human driver.
“One of three things failed – the car didn’t recognise the obstacle, or it didn’t recognise it as a threat, or it failed to react,” said Bagrov.
“The systems that detect an object could have failed. This is extremely unlikely, as it is a major target of all the driverless car’s algorithms, and the principal objective, to detect moving obstacles. My guess is that the unit that was supposed to make a decision to stop wasn’t working.
“Either way, all clues currently point to a failure in the central ‘brain’ of the car.”
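To make these three failure points concrete, the sketch below shows a minimal perceive-assess-react loop of the kind any autonomous driving stack implements in some form. It is a hypothetical illustration only: the class, function names and thresholds are assumptions made for the example and bear no relation to Uber's actual software.

```python
# Hypothetical sketch only: names, thresholds and structure are invented
# for illustration and are not based on Uber's (or any vendor's) software.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    distance_m: float         # gap to the object ahead, in metres
    closing_speed_ms: float   # rate at which that gap is shrinking, in m/s

def assess_threat(d: Detection, reaction_time_s: float = 0.5,
                  braking_margin_s: float = 2.0) -> bool:
    """Failure point 2: decide whether a detected object is a threat.
    Treat it as one if, at the current closing speed, the car would reach
    it before it could react and brake comfortably."""
    if d.closing_speed_ms <= 0:
        return False  # the object is not getting any closer
    time_to_collision_s = d.distance_m / d.closing_speed_ms
    return time_to_collision_s < reaction_time_s + braking_margin_s

def control_step(sensor_frame: dict) -> str:
    # Failure point 1: the perception stack must detect the object at all.
    detection: Optional[Detection] = sensor_frame.get("pedestrian")
    if detection is None:
        return "continue"
    # Failure point 2: the planner must classify the detection as a threat.
    if not assess_threat(detection):
        return "continue"
    # Failure point 3: the vehicle must actually execute the reaction.
    return "emergency_brake"

# A pedestrian 20 m ahead with the car closing at 17 m/s (roughly 38 mph)
# should trigger an emergency brake in this toy model.
print(control_step({"pedestrian": Detection(distance_m=20.0, closing_speed_ms=17.0)}))
```

In this simplified model, a sensor dropout corresponds to the first failure Bagrov describes, a mis-set threat threshold to the second, and a planner that never issues the brake command to the third.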
Previous incidents
This is not the first accident involving an Uber autonomous vehicle, and the company has previously suspended testing following other accidents, although these have tended to be the result of outside factors, such as mistakes made by nearby human drivers.
The Tempe incident is also not the first known fatality linked to an autonomous vehicle incident, although it is the first involving a pedestrian.
In 2016, Florida resident Joshua Brown, a tech entrepreneur and self-driving car enthusiast, was killed behind the wheel of his Tesla Model S while the vehicle was in Autopilot mode.
According to reports, Brown was not paying attention to the road and failed to spot a lorry crossing ahead, while the Tesla’s systems also failed to spot the white trailer against a brightly lit sky. The vehicle ran under the trailer, shearing off the car’s roof and killing Brown.
The US National Transportation Safety Board (NTSB) concluded in September 2017 that, although Brown had ignored six audible warnings to keep his hands on the wheel, Tesla bore some responsibility for his death by selling him a system that was too easy to misuse. The firm has since updated the system with checks and limits on hands-free driving.
UK implications
TechMarketView analyst Georgina O’Toole wrote that the Tempe incident had “significant implications” for the development of autonomous vehicle technology around the world.
Questions of who is liable in the event of an accident have already shaped much of the debate and will continue to do so, said O’Toole, and the government will no doubt be keen to resolve this issue so it can continue to build on the UK’s strong position in this field.
“Of course, the other issue is whether such an accident will dent consumer confidence in the technology,” she added.
Legal review
Earlier this month, the Law Commission was asked to carry out a wide-ranging review of the legal framework for driverless cars, but its recommendations are not expected before 2021, even though numerous trials, some on public roads, are already taking place in the UK.
The government said the legal review – part of the Future of Mobility Grand Challenge set out in the recent Industrial Strategy – would be crucial in examining how laws laid down with traditional motoring in mind must evolve to support autonomous vehicles.
“With driving technology advancing at an unprecedented rate, it is important that our laws and regulations keep pace so that the UK can remain one of the world leaders in this field,” said roads minister Jesse Norman.
“The Law Commission’s joint project will examine difficult areas of law in order to develop a regulatory framework that is ready for self-driving vehicles.”
Among the questions under consideration will be:

- who is the driver or responsible person, as appropriate;
- how to apportion civil and criminal responsibility when there is shared control;
- autonomous vehicles in the context of public transport and car sharing;
- the possibility of new criminal offences to deal with new types of conduct or interference;
- the impact on other road users, such as cyclists and pedestrians, and how to protect them.

This final point will become even more crucial moving forward.
“Regulation, safety standards and vehicle insurance models all have a key part to play in enabling change, while giving society confidence that these new products and services can be introduced safely,” said Rob Wallis, CEO at the Transport Research Laboratory (TRL), which is overseeing much of the work on autonomous vehicles being done in the UK.
“The [Greenwich] GATEway project, led by TRL, is providing vital scientific insight to help shape future regulatory standards and to better understand public perceptions associated with these new mobility solutions,” he added.