An empty road

Researchers claim to develop concept to ensure ‘legal safety’ of driverless cars

Several UK academics expressed concern about the limitations of the work.

Researchers claim to have created a new technique for ensuring the “legal safety” of driverless cars.

Academics at Germany’s Technical University of Munich say their system can guarantee an autonomous vehicle will not cause accidents as long as other road users abide by the law.

The algorithm attempts to verify that a driverless car will maintain "fail-safe trajectories at all times", meaning it moves at a speed and in a direction that allow it to avoid collisions with other vehicles being driven legally.
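
As a rough, simplified sketch of what a fail-safe check can look like (not the authors' actual verification algorithm, and with all speeds, decelerations and safety margins assumed purely for illustration), the following Python snippet tests whether a planned one-dimensional trajectory always leaves the vehicle room to brake to a stop behind the worst-case stopping point of a legally braking vehicle ahead:

```python
# Illustrative sketch only: a straight-road following scenario in which the
# autonomous (ego) vehicle checks, at every step of its planned trajectory,
# that it could still brake to a stop without reaching the closest point at
# which a legally behaving lead vehicle could come to rest.
# All numbers (decelerations, speeds, gaps) are assumed for illustration.

def stopping_distance(speed: float, braking: float) -> float:
    """Distance covered while braking from `speed` (m/s) to rest at `braking` m/s^2."""
    return speed ** 2 / (2.0 * braking)

def trajectory_is_fail_safe(ego_positions, ego_speeds,
                            lead_position, lead_speed,
                            ego_braking=8.0, lead_braking=10.0, margin=2.0):
    """Return True if every planned state still admits an emergency stop
    behind the worst-case stopping point of the lead vehicle."""
    # Worst case assumed here: the lead vehicle brakes as hard as permitted.
    lead_stop = lead_position + stopping_distance(lead_speed, lead_braking)
    for pos, speed in zip(ego_positions, ego_speeds):
        ego_stop = pos + stopping_distance(speed, ego_braking)
        if ego_stop + margin > lead_stop:
            return False  # no fail-safe fallback exists from this state
    return True

# Example: a planned acceleration towards a slower lead vehicle 60 m ahead.
planned_positions = [0.0, 15.0, 31.0, 48.0]
planned_speeds = [15.0, 16.0, 17.0, 18.0]
print(trajectory_is_fail_safe(planned_positions, planned_speeds,
                              lead_position=60.0, lead_speed=10.0))  # False
```

In this toy example the final planned state leaves no room to stop behind the lead vehicle's worst-case stopping point, so the trajectory would be rejected as lacking a fail-safe fallback; the published work applies this kind of reasoning formally across arbitrary urban traffic situations.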

Researchers led by Christian Pek presented an analysis of their concept in the journal Nature Machine Intelligence, stating that it can “drastically reduce the number of traffic accidents”.

They went on: “We present a formal verification technique for guaranteeing legal safety in arbitrary urban traffic situations.

“Legal safety means that autonomous vehicles never cause accidents although other traffic participants are allowed to perform any behaviour in accordance with traffic rules.”

But several UK academics expressed concern about the limitations of the work, which was part-funded by car maker BMW.

Professor Noel Sharkey, emeritus professor of artificial intelligence and robotics at the University of Sheffield, said: “This research pushes autonomous vehicle road safety in the right direction by verifying safe trajectories for avoiding obstacles such as other cars, cyclists and pedestrians.

“However, the authors oversell the real-world usefulness. It is all conducted in computer simulation that doesn’t address the dynamics of the real world.

“It also makes the mistaken assumption that all other road users are obeying the rules of the road.

“While this is a useful project, considerably more work in the real world is required before it can be considered for certification.”

Dr Ron Chrisley, director of the Centre for Cognitive Science at the University of Sussex, said: “The system’s assumption that other road users will always behave legally could potentially lead to collisions that a different system, based more on how road users actually behave, would prevent.

“Saying that a collision wasn’t strictly caused by the autonomous vehicle will be of cold comfort to the families of accident victims in cases where the autonomous vehicle could have avoided the collision if it had taken predictable but non-legal driving/walking behaviour of others into account.”

A number of crashes involving autonomous and semi-autonomous vehicles have been reported in recent years, including the first widely publicised fatal incident, in Florida in May 2016, when a Tesla whose driver was using the car's semi-autonomous Autopilot feature hit the trailer of a large lorry.

The Government has backed several trials of the technology in the UK in recent years, but fully driverless cars are banned from regular use on public roads.