Self-driving vehicles: how safe is safe enough and what happens when they crash?
The Government sought views on its proposal that vehicles should be authorised to 'self-drive' once they reach "an equivalent level of safety to a competent and careful human driver". That is pretty worrying - and the Government's clarification that "This is safer than the average human driver" is only marginally reassuring.
In fairness, it is worth clarifying that, legally, a vehicle can only be classed as 'self-driving' if it doesn't require a human driver to monitor it. It is possible to authorise a vehicle to 'drive itself' only under certain defined conditions (e.g. on a fixed route, or on motorways only, or as long as it's not snowing). But if such a vehicle reaches the limits of its 'operational design domain', it needs to be able either to hand back control to a human driver, or else come to rest safely.
Until that point, legal liability for any safety failures will rest solely with the vehicle - or, in practice, with the insurance that will have to be taken out by an Authorised Self-Driving Entity (ASDE, a company that will have to be set up by the manufacturer of the vehicle and/or its operating system). The Government has rightly accepted that it would be really dangerous to authorise vehicles as being capable of 'self-driving' if they still require a human to be able to take over in an emergency. Hence their entirely correct decision that the ASDE should be legally responsible for any safety failures that arise while the vehicle is 'driving itself', until the human driver actually resumes control.
Yet there is still a lot wrong with the Government's proposed 'safety ambition', as Cycling UK's consultation response explained.
Our first concern - though by no means the most important one - is a legalistic issue with the words "a competent and careful driver". They already form part of the legal definition of the 'careless' and 'dangerous' driving offences, and are in themselves hugely problematic.
'Careless driving' (or, to be legally accurate, 'driving without due care and attention') is supposed to be that which falls "below what would be expected of a competent and careful driver", whereas 'dangerous driving' supposedly falls "far below" that standard.
Yet Cycling UK's report "Failure to see what is there to be seen" has documented the huge inconsistency in how the terms 'careless' and 'dangerous' driving are applied, by both prosecutors and the courts. Prosecutors, judges and juries can clearly have very different ideas of "what would be expected of a careful and competent driver", or whether the driving in specific cases fell "below" or "far below" that standard. The terminology is evidently not fit for purpose.
Cycling UK has spent many years urging the Government to clarify or amend this terminology, as part of the review of road traffic offences and penalties that it first promised in 2014, then again in late 2021, both times in response to Cycling UK's campaigning. Yet, despite the Government telling Parliament that they expected to start the review in the first half of 2022, in November we're still waiting.
Meanwhile, it would be wholly wrong to use such vague terminology to decide when self-driving vehicles are safe enough for use on our roads.
Safer for vehicle occupants but more dangerous for everyone else?
Our second concern is a more serious one. The manufacturers and advocates of self-driving vehicles are unsurprisingly keen for their vehicles to become the norm as soon as possible. We need to guard against this carefully.
In Cycling UK's response to the Law Commission's 2019 review of the legal framework for self-driving vehicles, we recognised that self-driving vehicles could be very good news or very bad news - both for road safety and for sustainable transport - depending on how they are regulated. We readily acknowledged that there could be very significant safety, equality and environmental benefits, once vehicles can self-drive safely for the whole of any journey (whatever the weather). At that point, people would have far less need to own a car, if they could summon a self-driving vehicle to their door when they needed it. We could also use road space much more efficiently, freeing up space for cycling and walking facilities.
But, if we are to achieve these health and sustainability benefits, we must require self-driving vehicles to deliver a major step-change in road safety - not just a marginal improvement. In particular, these safety standards need to support (and not undermine) efforts to boost walking and cycling.
Unfortunately, the industry and its supporters want self-driving vehicles to be authorised as soon as they are even marginally safer than human drivers. They argue that, if we withhold authorisation from self-driving vehicles until they are a lot safer than humans, many lives would be lost in the meantime.
The problem with this argument is one of inequality. Self-driving vehicles might soon reach a point when they are better than human drivers at avoiding collisions with other motor vehicles. However, it will take a lot longer before they can match human drivers' ability not just to detect the presence of pedestrians and cyclists, but also to predict their movements. After all, pedestrians and cyclists communicate using eye contact and hand signals. Self-driving vehicles still aren't able to detect these cues.
So, if vehicles are authorised to self-drive as soon as their overall safety record overtakes that of human drivers, we could end up with improved road safety for vehicle occupants being offset by an increased risk for everyone else - with pedestrians and cyclists, children, older and disabled people paying the price.
In response to Cycling UK's concerns, the Law Commission very sensibly recommended that the Government should consider the equality implications of any decisions to authorise self-driving vehicles (see their summary report paras 2.17 and 3.14). We strongly urge the Government to heed this recommendation.
A route to full self-driving
Whatever the benefits of reaching full automation, there are huge pitfalls during the messy 'partial automation' stage - i.e. when vehicles can only 'self-drive' under limited conditions (and therefore still need a human driver to be present), and when they are still sharing the roads with human-driven vehicles. During this phase, we get all of the downsides of automation, with very few of the upsides.
One benefit though could come from allowing self-driving vehicles, particularly lorries, onto our motorways at this relatively early stage - especially if we can also build depots at the edges of our towns and cities, so that these lorries can then transfer their cargos onto light urban delivery vehicles (including cargo-bikes) for the 'last mile' of their journey.
At this point though, self-driving vehicles would have to be programmed so that they can only drive themselves in situations where there is a very low risk of injuring pedestrians, cyclists or other non-motorised road users.
This phase might then need to last for several years - possibly decades - before self-driving vehicles would surpass the ability of human drivers to interact safely with pedestrians and cyclists. Indeed, it might never happen at all.
But if and when we do reach that point, we should switch as quickly and completely as possible to full self-driving, in order to maximise its environmental, equality, safety and other benefits.
Still, an advantage of this two-step approach is that, by the time we are ready to take the second step, self-driving vehicles would already be widespread. So a large part of our vehicle fleet would need nothing more than a software upgrade to convert it to full self-driving capability - allowing us to make the switch-over relatively quickly and painlessly.
What if a self-driving vehicle injures a pedestrian or cyclist? The case for 'presumed liability' rules
The final point is what happens when, despite its (hopefully excellent) safety systems, a self-driving vehicle collides with a pedestrian, cyclist or other non-motorised road user. How does the law then decide who is liable for the resulting injury damages?
The Automated and Electric Vehicles Act 2018 already makes it clear that, when an "accident" [sic] "is caused by an automated vehicle when driving itself on a road or other public place in Great Britain" and anyone suffers injury or other damage as a result, then the vehicle's (i.e. the ASDE's) insurance scheme would be liable for the resulting damages. So far so good.
But who would have the legal onus of having to prove (or disprove) that the "accident" was "caused" by the vehicle, and that it was "driving itself" (rather than being controlled by a human driver) at the time?
Going back to when self-driving vehicles were pure sci-fi, Cycling UK and other road safety groups were already arguing that the UK should adopt the legal principle of 'presumed liability' for civil compensation where a road collision results in injury to a pedestrian or cyclist. In other words, the driver (or, in practice, their insurer) would be presumed to be liable for the resulting damages, unless they could prove that the injuries were at least partially the victim's own fault.
There are many good reasons for adopting this approach. Firstly, the pedestrian or cyclist is more likely to be injured in any collision with a motor vehicle. That might then leave them unable to recall how the collision occurred, and thus to prove to a court's satisfaction that it was the driver (and not the victim themself) who had caused the collision. Secondly, the driver is almost certain to be backed by an insurance company to help them fight their corner legally, whereas the injured cyclist or pedestrian might well not be.
But the advent of self-driving vehicles makes this 'inequality of arms' much more problematic. Injured pedestrians and cyclists will first have to work out whether to bring their legal claim against the driver or the ASDE. The ASDE will have all the information recorded by the vehicle's sensors and cameras, while the injured person would have none of this evidence. The ASDE will also have invested billions in their technology, and thus have a very strong incentive to avoid disclosing any evidence that could highlight a safety flaw in their vehicles or operating systems - think Volkswagen's pollutant emissions cover-up, but for road safety.
There has long been a very strong argument for introducing a presumed liability law in the UK. The arrival of self-driving vehicles surely makes it crucial to adopt one.
Awaiting a Transport Bill
The Government's Queen's Speech last May promised a Transport Bill - though there have been two changes of Government since then! The main aim of the Bill was to introduce the Government's planned reforms to Britain's railways. But Ministers have also suggested that the Bill could include legislation on e-scooters and other 'micromobility vehicles', a road safety investigation branch, pedicabs, pavement parking and self-driving vehicles.
More recently, they have announced that the rail reforms are now being delayed until a future session of Parliament. However, transport ministers still hope to go ahead in the coming months with a Bill covering at least some of these other issues.
In short, we don't know when new legislation on self-driving vehicles might be tabled. But whenever it comes, we need you, our members and supporters, to be ready to urge MPs to ensure that it includes the 'presumed liability' principle that Cycling UK has long been calling for.