Deputy editor George Barker looks at automated taxi service provider Waymo, some of the issues it has faced across the pond and what, if anything, is being done to avoid repeating the same mistakes when it comes to London.

Automated vehicles (AVs) may no longer be science fiction, but just like in your typical dystopian fantasy, artificial intelligence and the machines it operates are not perfect.

Last October, London was greeted with the announcement that Waymo, the American autonomous driving technology company owned by Alphabet Inc. (Google's parent company), is expected to introduce its AVs to the capital's roads this year. The Department for Transport (DfT) is operating pilot schemes to test the services and see how they function in the UK.

Transport secretary Heidi Alexander said: ‘I'm delighted that Waymo intends to bring their services to London, under our proposed piloting scheme. Boosting the AV sector will increase accessible transport options alongside bringing jobs, investment and opportunities to the UK. Cutting-edge investment like this will help us deliver our mission to be world leaders in new technology and spearhead national renewal that delivers real change in our communities.'

Investigations

While on the face of it this appears to be a big step forward for vehicle technology, Waymo's operational history has been marred by a number of collisions with both objects and people.

To date, the US Department of Transportation's National Highway Traffic Safety Administration (NHTSA) has opened three investigations into the Alphabet-owned company's vehicles, two of which are still ongoing.

The first came in 2024, when NHTSA's Office of Defects Investigation (ODI) opened a preliminary evaluation into the performance of Waymo's 5th Generation automated driving system (ADS), based on 22 reports of unexpected driving behaviour.

ODI identified 367 incidents in total during the investigation, including 109 crashes; 102 of these met the reporting criteria of Standing General Order 2021-01 (SGO) and were duly reported by Waymo under the SGO.

Among these incidents, the ODI chose to focus on a ‘pattern of crashes' in which Waymo vehicles struck stationary or semi-stationary objects such as gates, barriers and chains, along with other objects of similar physical characteristics.

During the investigation, Waymo issued two recalls intended to improve the vehicles' ability to perceive and respond to objects, including one which updated software for 672 cars after a May 2024 crash in which an unoccupied Waymo vehicle struck a utility pole.

The company also updated its ADS to ‘improve the detection and avoidance of roadway barriers', according to the ODI.

In a summary of the investigation, ODI stated: ‘In view of the recall actions taken by Waymo and ODI's analysis of the available data, ODI is closing this Preliminary Evaluation. NHTSA will continue to monitor crash reports and other sources of data and will take additional action if warranted.'

The two ongoing investigations focus on the safety of children around Waymo's vehicles. The first concerns an instance when a Waymo vehicle drove around a stopped school bus that had red lights flashing, stop arm deployed and crossing control arm deployed – safety measures designed to indicate to traffic when children are boarding or leaving the bus and which require all vehicles to wait.

Another investigation was opened after a Waymo vehicle was involved in a collision with a child near a school in California. The child escaped the encounter relatively unscathed, sustaining only what the NHTSA described as ‘minor injuries'.

The investigation was opened by the safety body to examine whether the Waymo AV ‘exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users'. 

Driving improvement

When approached on how it planned to prevent these issues from recurring in the UK, Waymo stated that it couldn't comment on open investigations.

However, it did add: ‘We are committed to continuous improvement and take several steps to ensure our technology operates appropriately on public roads. That includes ongoing learning and improvements to our software and operations, a rigorous, methodical approach to growth and deep partnerships with regulators, policy-makers, and emergency officials to ensure we achieve our shared safety mission.

‘While rare incidents will occur over the more than four million miles we drive every week, our entire fleet can learn from these events and continue to make roads safer. The data to date indicates we are already making roads safer where we operate, showing 90% fewer serious injury crashes, and 92% fewer injury crashes involving pedestrians compared to human drivers.'

Similarly, a spokesperson for the DfT said: ‘All autonomous vehicles must meet an extremely high safety bar before they are allowed on our roads and get consent from the local licensing authority where they will operate.

‘Currently, 88% of crashes on our roads involve human error, so self-driving vehicles could significantly improve road safety.'

Highways understands that the DfT has set two requirements for a prospective deployer looking to operate a self-driving vehicle service. First, the Vehicle Certification Agency (VCA) will have to assess the self-driving capability of the vehicles to ensure they meet the bar for listing under the Automated and Electric Vehicles Act 2018 and for the issuance of a Vehicle Special Order.

Once this assessment is complete, the operator will then be required to gain the consent of the local licensing authority (i.e. Transport for London [TfL]).

On top of this, further guidance from the DfT, specific to pilot schemes, is expected but has yet to be published; it is promised ‘in due course'.

The 2018 Act focused on insurance liability, with the Automated Vehicles Act 2024 building on this foundation to expand the regulatory and safety framework. The 2024 Act will require self-driving vehicles to achieve a level of safety at least as high as that of careful and competent human drivers, as well as to pass rigorous safety checks. It also sets out a procedure for authorising self-driving vehicles, but this procedure is not yet in force. Under the current law, vehicles that are capable of safely driving themselves must be listed under section 1 of the Automated and Electric Vehicles Act 2018.

The Automated Vehicles Act Implementation Programme was launched in 2024 to secure the safe deployment of AVs on roads in Great Britain. It is responsible for carrying out the full policy, legislative and operational programme to implement the Act in 2027. 

It is on this legislative background that the capital's road authority is relying when it comes to the burden of managing safe AV operation. A TfL spokesperson said: ‘TfL recognises the challenge of legislating in response to changes in automated vehicle technology in a timely manner to ensure benefits are delivered and risks are mitigated. Legislation must set a high benchmark and consider the impact on all road users. For example, the safety ambition for automated vehicles should align with Vision Zero and support the goal of eliminating all deaths and serious injuries from collisions on London's streets by 2041.'

It is this Vision Zero that stands as the ultimate test for AVs on our roads, and the reason why so many people feel we have a duty to pursue their deployment.