Highways partnered with AI-driven field software provider FYLD for the sector’s first AI symposium. This fascinating debate was both optimistic and pragmatic, providing key insights into this new era. Dominic Browne reports
Attendees:
- Tony Delaney, Colas, Associate Director Digital & Data Innovation
- Jonathan Saunders, Kier, Head of Business Systems
- Chris Bailey, Milestone Infrastructure, Project Lead
- Paula Claytonsmith, LCRIG, Chief Executive
- Lyle Andrew, IHE, Chief Executive
- Shanmeng Wei, Connect Plus, Asset Investment Manager
- John Warne, WJ Group, Business Development and Marketing Director
- Amanda Richards, Surrey County Council, Assistant Director of Highways
- Matt Peck, AtkinsRéalis, Innovation Director
- Marcio Rodrigues, FYLD, VP EMEA
- Joshua Wood, FYLD, Client Director
- Tom Gould, Skanska, Operational Efficiency Director
- Dominic Browne, Highways Magazine, Editor
- Mike Head, Department for Transport, Head of AI Policy and Strategy
- Hazel MacMahon, FYLD, Value Consultant.
The bigger picture
No one would deny that AI has massive potential benefits for all sectors, including highways; the issue is whether we are in a position to take full advantage of it.
Our AI symposium brought together government, professional bodies and thought leaders from the public and private sectors to outline the big wins and the barriers to success.
The prime minister has unveiled an AI action plan based on three pillars: foundations for growth and investment; boosting adoption across public and private sectors; and keeping us ahead of the pack. The use of AI on the road network can help with all of this.
The paint ‘is not yet dry’ on this technology, as one attendee put it, but if we are to seize the opportunity, we have to stand back and see the bigger picture.
The big message
Attendees agreed that we need a strong, overriding narrative to present to central government, elected members and indeed the public. This could include a limited number of key points or one core message, but it should capture the opportunity cost surrounding the technology: the cost of inaction as well as the huge benefits of implementation.
It was agreed that a good place to start is clarity on the problems we need to solve. This is also recommended as a good way to approach the IT market in general. The most obvious issues attendees raised were maintenance and safety. Naturally, the public will want to know what this technology can do to prevent potholes forming, as well as to make our roads safer. In fact, it was noted that they would not thank us for spending millions on AI technology if it had no benefit to these core issues. One of our ultimate goals is to finally achieve a fundamental, sector-wide shift towards proactive delivery. If we can make the case that AI will deliver proactive asset and risk management, with easier, safer travel as a result, we will have a message the sector and the public can get behind, and one that politicians will respond to.
On a functional level, AI is powered by data. Not for nothing has data been dubbed the new oil. Road authorities want an ‘integrated data system’ that allows them to build in and cross-reference different datasets. This could start with inspection and condition monitoring data combined with asset management lifecycle and deterioration modelling. It could then build out to include road safety data; road space, utilities and street works information; traffic counts and weather data. As authorities start to integrate these datasets into real-world functionality, they will start to unlock the value of the data itself, for example by making interventions ahead of time in a way that aligns with other works on the network and minimises disruption.
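To make that cross-referencing concrete, the sketch below shows what a first step towards an integrated data system might look like in Python. The file names, columns and thresholds are hypothetical and chosen purely for illustration; real authority datasets will be larger, messier and differently structured.

# A minimal sketch of cross-referencing condition, traffic and weather data,
# assuming hypothetical CSV extracts keyed by road section and month.
import pandas as pd

inspections = pd.read_csv("inspections.csv")  # columns: section_id, month, defect_score
traffic = pd.read_csv("traffic.csv")          # columns: section_id, month, daily_vehicles
weather = pd.read_csv("weather.csv")          # columns: month, freeze_thaw_days, rainfall_mm

# Join the datasets on their shared keys
combined = (
    inspections
    .merge(traffic, on=["section_id", "month"], how="left")
    .merge(weather, on="month", how="left")
)

# A crude proactive-maintenance flag: deteriorating sections carrying heavy
# traffic through a harsh winter get prioritised before they fail
combined["priority"] = (
    (combined["defect_score"] > 60)
    & (combined["daily_vehicles"] > 10_000)
    & (combined["freeze_thaw_days"] > 5)
)

print(combined.loc[combined["priority"], ["section_id", "month", "defect_score"]])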
Barriers
The symposium identified four key issues that present barriers to the use of AI: finance, skills, commercial models, and data.
Finance is no surprise. The next ALARM survey comes out this month and there are probably few in the sector who expect the £16.3bn carriageway backlog to have reduced much. Currently, the use of AI has a high cost of entry for highways because it is a ‘risk of life’ sector. Any system or solution deployed on the network must be robust; often this involves products and systems being trialled, tested, proved and certified over years, sometimes decades.
This is not a sector where workers can simply use a free trial of ChatGPT and expect much actionable assistance.
Then comes skills. The process of upskilling staff across government, councils and the private sector in AI has only just started. Plus, this is a cross-cutting agenda with the potential to disrupt and evolve numerous policy areas.
But the public sector is still, as always, locked into its siloed departments and divisions. While basic training can be cascaded down, each particular team will have different needs and interests and be trying to work out exactly what this technology means for them.
It will take time to raise the sector’s general knowledge to the point where it can appreciate the use of AI, let alone master it.
Attendees also noted that even in the most enabled authorities, public sector knowledge lags far behind the market, leaving authorities vulnerable to risk.
5 top considerations for decision-makers:
• Start with the problem you want to solve.
• Build a strong narrative around why AI can help and the cost of inaction.
• A functional data structure is key.
• Can contractual innovation match technological innovation?
• Can financial gains from productivity be recycled for continual development?
Taking a risk
The sector’s general approach seems to be to shift all the risk for innovation onto the private sector. This presents issues, especially when term maintenance contractors are struggling both to add value and to improve margins. On the one hand, the main Tier One contractors are asked to demonstrate innovation, long-term lifecycle value and even social value. On the other, they are often told to do everything at the lowest cost.
Innovation requires a tolerance of risk, and indeed of failure over time, that public bodies seldom have. In addition, short to medium-term funding cycles on term service contracts struggle to reward the long process of innovation within the same timeframe. Big ideas like AI run into the ‘jam tomorrow’ problem, especially when councils are faced with shoestring budgets, attendees said. Another interesting commercial issue is that AI may not present contractors with an opportunity for ‘billable hours’. Good old analogue physical work represents a clear, chargeable service. Despite longstanding pain-gain models, there is a general commercial disincentive against disruptive cost savings.
Plus, while AI is already showing it can drastically boost productivity in inspections, for instance, authorities are less confident about where the money saved will end up – likely reinvested in other local services like social care.
In short, some ‘commercial innovation’ may be needed to match the technological innovation. However, the sector should be in no doubt about the benefits. A delegate who had used FYLD’s services reported saving approximately two and a half hours a shift, with its digital workflows eliminating 600,000 sheets of paper annually. Its AI risk assessments also improved operative safety behaviour, and the platform streamlined end-to-end processes and brought real-time data into their operational control rooms.
Finally, data remains a huge barrier because of the complexity of gathering, structuring and storing it. Councils don’t always even have the data they really need. On top of this, there is another commercial barrier around sharing the data. Once a contractor’s term of service is up, it takes all of its equipment and data with it, making it hard to maintain robust local datasets, let alone central data lakes.
There was some debate about how much data could or would be shared. It was suggested that it would certainly be good to share core health and safety information, though how this would happen remains unclear.
The sheer volume of data coming from the road network makes it impossible to store it all properly, and while a central data repository sounds positive in theory, organisations are often trying to create clean ‘operational datasets’ outside the noise of the surrounding messy, federated data architecture.
There are data regulations in place, such as the Network and Information Systems Regulations, and standards around innovation and collaborative working. But there is very little in the way of standards and regulations for AI, as it is such a new technology. National governments are still scrambling to keep up with its implications.
Attendees expressed little appetite for an AI standard at this early development stage. However, it was suggested that some intellectual or functional ‘guardrails’ could be put in place to foster the right culture and behaviours. This should be supported by the development of a good ‘AI supply chain’ that companies and authorities can work with and go to for sense checking. It could include academia in a similar model to National Highways’ Roads Research Alliance (RRA) – a unique collaboration between the national operator, the University of Cambridge and industry partners.
Another idea was for an AI board, operating at an academic and expert technical level.
Highways can also learn from other sectors and pre-established systems in IT, security and defence. For instance, principles such as ‘secure by default’ in cyber security could be readily adopted.
Another idea was having a sector ‘sandbox’ that companies could plug into to test-run their systems to ensure that they can work and integrate safely and effectively. This is already used in the UK’s defence sector as a prerequisite to procurement.
It was noted that these challenges would require major investment from central government into improving the underpinning infrastructure and data architecture.
Conclusion and initial ask
Highways is a good place to trial AI because it deals with a physical asset, and the sector has already proven that machine learning models can be trained to its needs, such as those used to spot potholes. It does take time to train them, sometimes years, but the process should accelerate and will produce major efficiencies when successful.
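As an illustration of what that training involves, the Python sketch below fine-tunes a generic pretrained image model to classify inspection photos. The folder names, class labels and settings are hypothetical; production pothole-detection systems are trained and validated over far longer periods and on far larger datasets.

# A minimal sketch of training a pothole classifier from labelled photos,
# assuming a hypothetical folder layout data/train/pothole and data/train/no_pothole.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Resize inspection photos to the size the pretrained backbone expects
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

train_data = datasets.ImageFolder("data/train", transform=transform)
loader = DataLoader(train_data, batch_size=32, shuffle=True)

# Start from a pretrained backbone, freeze it, and retrain only the final layer
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)

optimiser = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimiser.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimiser.step()
    print(f"epoch {epoch + 1}: last batch loss {loss.item():.3f}")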
However, there are barriers to working in a sector with a safety-critical statutory responsibility, reduced funding, problematic data architecture and a risk-averse culture with little ability to share the commercial risk of innovation.
Also, the network is fragmented and not a closed system of infrastructure like buildings. So it is a complex and shifting environment to begin with, almost a wicked problem in itself.
However, it is also a network of huge value to the economy and interest to the public, with potentially enormous productivity gains to be made.
The key ask that came out of the symposium is the need for government to establish an AI ‘capability fund’. This would be offered to roads authorities on a non-competitive basis to help develop skills, test concepts and start a cultural shift towards embedding AI awareness and better behaviours.
In short, this fund would give the sector the breathing space to step back and see the bigger picture, and start thinking about the challenges and the many benefits that lie on the road ahead.