Experts have told MPs that the biggest challenge to the adoption of self-driving vehicles is the public perception that they are not safe enough.
At a hearing this month into the barriers to the introduction of self-driving vehicles in a range of contexts, the Transport Select Committee heard from a panel of experts – and a well-known sceptic on the issue.
Dr Siddartha Khastgir, head of verification and validation for connected and autonomous vehicles at Warwick Manufacturing Group, told MPs: ‘The biggest challenge we have right now is the public perception of safety of this technology.
‘We can have the safest technology at a technological level, but if we cannot convince the public that it is safe and get them to trust the system, they will never use it, so we will never reap the benefits of the system and these technologies.’
Dr Khastgir argued that it is necessary ‘to take a true systems-thinking approach to technology development, where people are part of the system’.
He said: ‘We should not develop the technology in isolation and just hope that people will accept it. We need to bring people along as part of the technology development process, to understand what the requirements are and how to get them to trust these systems.’
Dr Khastgir added: ‘One of the biggest issues right now across the different types of use cases is the definition of what is safe enough.
‘Whatever we come up with as something that we say is safe will be unsafe if we do not define the threshold correctly.’
The discussion often focused on whether the public expects self-driving cars to be as safe as human drivers, or safer.
Dr Khastgir argued that public expectations must be translated, in engineering terms, into the safety threshold.
He said: ‘If they want it to be as safe as them, that is the benchmark we will try to work to. If they want it to be safer than them, that is the benchmark we want to work to.’
Lisa Johnson, UK director of public affairs at Starship Technologies, which operates small delivery robots, said the biggest risk for the service is how it integrates, ‘because not everybody around us understands how we operate’.
She told MPs: ‘We end up with people stopping at crossings to wait for our robots to cross the road because they like the little guys.
‘They say, “You go, robot; you go,” but the robot is not standing there waiting for someone to say, “You go, robot.” The robot has seen a car so it’s not moving anywhere.’
She added: ‘I suppose that is not a safety risk; but it is the biggest challenge we come across—how humans accept us into their environment. Ultimately it takes a couple of weeks, but eventually, people know that they are supposed to drive because the robot will just wait.’
At a later session, author Christian Wolmar, a longstanding sceptic over driverless cars, raised what he called the Holborn problem.
He told MPs: ‘Try to get a driverless car to go through Holborn at 6 o'clock in the evening. Somehow, you have to build into the vehicle the ability to drive through very difficult situations with lots of pedestrians, cars, cyclists and whatever, and do it safely.
‘I think that might be insuperable because, as in Isaac Asimov’s rules of robotics, the car must not be programmed to injure somebody. Therefore, you have to programme it to stop when people walk in front of it and the like.’
Simon Morgan, chair of the Institute of Highway Engineers Traffic Signs Panel, told MPs: ‘We say that anything that makes roads safer is a great benefit because the toll of casualties and fatalities on UK roads is still unacceptably high.’