Autonomous cars ... considering the human factors

Louise Smyth

Boris Sedacca explores why, although autonomous cars might well be able to operate alone, they cannot truly be driverless

The death of a motorist who was driving a Tesla car in semi-autonomous mode in June 2016 was shortly followed by another, non-fatal accident, and these incidents will undoubtedly add to the public's unease with so-called driverless cars. Here, we report on recent developments.

According to a Florida Highway Patrol statement, the Tesla accident involved a man who was killed when his car drove under the trailer of an 18-wheel vehicle on a highway. The lessons to be learnt from this accident will be eagerly scrutinised as details surface, particularly by the ‘Atlas’ initiative, a UK consortium formed in February 2016 that secured funding from Innovate UK to examine the data requirements needed to support autonomous navigation.

Richard Cuerden, senior academy fellow at the Transport Research Laboratory (TRL), is an experienced road safety expert with a background in engineering and injury biomechanics, and the lead researcher for some of the UK’s most in-depth road crash investigation studies.

Cuerden argues that the Tesla Autopilot system can be misunderstood to mean that the driver does not need to drive, whereas in reality it is an additional driver-assistance mechanism that kicks in when there are road hazards. The system was released as beta software.

He says: “In the majority of cases, the Tesla Autopilot system has been proven to work as intended. Autonomous vehicles use a combination of sensors such as stereo cameras, radar and so on, and the technology is developing quickly as a result of the processing power now available for recognising hazards and targets.

“Algorithms are getting better at spotting danger by discerning what a camera has captured and fusing the data with that obtained by radar. Urban environments present many more challenges than rural areas and motorways do, because of hazards such as cyclists.”
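The camera-radar fusion Cuerden describes can be sketched in miniature: pair each camera detection with the nearest radar return by bearing, and let a radar confirmation raise the detection's confidence while contributing its range measurement. The sketch below is purely illustrative; the detection tuples, the azimuth gate and the confidence boost are assumptions, not anything from a production system.

```python
def fuse_detections(camera_dets, radar_dets, max_azimuth_gap=2.0):
    """Pair each camera detection with the nearest radar return in azimuth.

    camera_dets: list of (azimuth_deg, label, confidence) from the camera
    radar_dets:  list of (azimuth_deg, range_m) from the radar

    A camera detection confirmed by a radar return keeps the radar's
    range and gets its confidence boosted toward 1.0; unpaired camera
    detections keep their original score and have no range estimate.
    """
    fused = []
    for az_c, label, conf in camera_dets:
        best = None
        for az_r, rng in radar_dets:
            gap = abs(az_c - az_r)
            if gap <= max_azimuth_gap and (best is None or gap < best[0]):
                best = (gap, rng)
        if best is not None:
            # Radar confirmation: halve the remaining uncertainty.
            fused.append((label, best[1], min(1.0, conf + (1 - conf) * 0.5)))
        else:
            fused.append((label, None, conf))
    return fused
```

For example, a cyclist seen by the camera at 10 degrees with confidence 0.6, confirmed by a radar return at 10.5 degrees and 25 m, would come back as a higher-confidence detection with a range attached.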

Cuerden adds: “When motor manufacturers want to sell vehicles into Europe, they have to meet the General and Pedestrian Safety Regulations for type approval, which cover vehicle safety, emissions, and so on.

“What the European Commission does in many instances is reference UNECE regulations, which have working groups at United Nations level. For example, the European general safety regulations reference UN Regulation 94, Occupant Protection in Frontal Collisions, for frontal impact testing, but approval is granted through Europe. Not all UN rules are mandated for type approval in Europe, and Europe mandates things for which the UN does not yet have rules.”

Cuerden writes and presents scientific papers at international safety conferences. He also chairs the Vehicle Design Working Party for the Parliamentary Advisory Council for Transport Safety (PACTS), and is a technical expert on the European Enhanced Vehicle Safety Committee for Accident Studies (WG21). He provides technical support to the European Commission, delivering independent written reports and presentations at international motor vehicle working groups, and has successfully managed large work programmes involving multidisciplinary teams. As for Brexit, he believes that the work will nevertheless continue as before.

“Autonomous vehicles are coming to our roads and there will be work going through Europe and the UN to ensure that they are regulated to the correct standards. We run the Gateway programme at the Royal Borough of Greenwich, which involves three separate trials of different types of zero emission automated vehicles. We are going to have autonomous pods going around the O2 Arena area as a relatively low speed shuttle service for pedestrians and tourists. We do 3D mapping of the area, which helps us understand how autonomous vehicles detect their environment.

“We work with Jaguar Land Rover and Bosch, among others, on a project in which members of the public drive a Discovery around the area. The cars will feel just like normal vehicles to drive. Sensors pick up what is going on in the environment, and the system correlates what the vehicle would do if it were driving autonomously with what the driver is actually doing. Then we gather all the data via 3G communications overnight.”
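This shadow-mode comparison, recording what the autonomous stack would have done against what the driver actually did, can be sketched as a simple divergence filter over logged samples. Everything below (the sample tuples, the steering-only comparison, the threshold value) is an illustrative assumption, not a description of the Gateway project's actual logging.

```python
def divergence_log(samples, threshold=0.5):
    """Flag moments where the shadow-mode autonomous plan diverges from
    what the human driver actually did, ready for overnight upload.

    samples: list of (timestamp, planned_steer_deg, actual_steer_deg)
    Returns only the samples whose steering divergence exceeds the
    threshold; agreement between plan and driver is discarded.
    """
    return [(t, p, a) for t, p, a in samples if abs(p - a) > threshold]
```

A moment where the planner wanted 5 degrees of steering but the driver held the wheel straight would be kept for analysis, while near-agreement would be filtered out.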

TRL’s DigiCar driving simulator will be used in parallel to investigate driver behaviour in automated vehicles using a photorealistic 3D model of the Greenwich peninsula. Risk, liability and insurance issues will be addressed, pedestrian interaction with automated vehicles will be modelled, and adaptation to traffic lights will also be explored.

Ordnance Survey

The Atlas Consortium leader Ordnance Survey (OS) works with Satellite Applications Catapult, TRL, Sony Europe, Gobotix, OxTS, and the Royal Borough of Greenwich.

Jeremy Morley, chief geospatial scientist at OS, heads a consortium for Innovate UK that is examining the data needed to support driverless vehicle navigation in the UK. He says: “Connected autonomous vehicles (CAVs) will combine the power of advanced sensors to detect road conditions, cutting-edge 5G communications technology to access a stream of data about the world around them, and geographical databases of routes, destinations and points of interest.

“Autonomous operation can be retrofitted to existing vehicles or incorporated in new vehicles designed for autonomous operation from the ground up. There is a consortium called Venturer that is funded by Innovate UK and uses an adapted Land Rover for trials in the Bristol area, so they are effectively adding autonomy to an existing vehicle.”

Venturer includes Atkins, Bristol City Council, South Gloucestershire Council, AXA, Williams Advanced Engineering, Fusion Processing, Centre for Transport and Society, University of the West of England (UWE Bristol), University of Bristol and Bristol Robotics Laboratory, a collaboration between the University of Bristol and UWE Bristol.

Morley continues: “In Milton Keynes, there has already been a series of trials of lower speed autonomous pods, some of which have been upgraded for testing at the Royal Borough of Greenwich in London. If you look at Google’s approach, the company started by adapting Lexus vehicles but now has its own specifically designed autonomous vehicles.”

Turning to the Tesla accident, Morley understands that the vehicle’s front-facing radar was designed not to point too far upward, in order to avoid falsely detecting overhead bridges. In this case, the vehicle was climbing a hill with a truck at the crest, and the radar beam passed underneath the truck’s trailer.
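Morley's reading of the geometry, a low-aimed beam passing beneath a high trailer near a crest, can be checked with elementary trigonometry: project the beam's height out to the trailer's position and compare it with the trailer's side profile. The function below is a hypothetical sketch; the heights, angles and crest rise are illustrative assumptions, not figures from any investigation.

```python
import math

def beam_hits_trailer(sensor_height, tilt_deg, distance,
                      trailer_bottom, trailer_top, crest_rise=0.0):
    """Return True if a narrow radar beam intersects a trailer's side.

    All heights are in metres relative to the road under the sensor;
    crest_rise is how much higher the road surface sits at the trailer's
    position (e.g. at the top of a hill). A level or downward-tilted beam
    from a low-mounted sensor can pass entirely under a high-riding
    trailer, which is one plausible reading of the incident geometry.
    """
    beam_height = sensor_height + distance * math.tan(math.radians(tilt_deg))
    return (trailer_bottom + crest_rise) <= beam_height <= (trailer_top + crest_rise)
```

With a sensor 0.6 m off the road, a level beam and a trailer underside 1.2 m up, the beam height at 40 m is still 0.6 m, so it clears beneath the trailer; tilting the beam up by a couple of degrees would bring it into the trailer's profile.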

“Meanwhile, the driver was using Autopilot and not paying attention to the road,” adds Morley. “When people have an accident, it is almost impossible for us to learn the individual details from one another and so avoid future accidents, but autonomous vehicles can learn from their mistakes and spread that learning to other autonomous vehicles.

“Before there is an official accident investigation report published, it is difficult to know what changes will be needed in the design of the radar or the vehicle. As the vehicle is not fully autonomous, the driver is still expected to be aware of what is going on at all times.

“Such systems are intended more for cruising on freeways than for smaller roads such as the one where the accident happened. While you could argue that it was a freak accident, it could happen again, though not if the vehicle were used in the operating environment for which it was designed.”

Connected vehicle cluster

A connected autonomous vehicle (CAV) features a cluster of four key sensing technologies. Apart from radar, there are standard optical camera systems, which may include infrared sensing or act in tandem with thermal imaging, and scanning-laser light detection and ranging (LIDAR).

LIDAR may come in two different scanning geometries: sweeping LIDAR, which rotates around the vehicle, and planar LIDAR, which scans a fixed plane at the front of the vehicle.
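Both geometries yield the same kind of raw data: a range plus one or two angles per return, which software converts into points around the vehicle. A minimal sketch of that polar-to-Cartesian step follows; the coordinate convention used (x forward, y left, z up) is an assumption, not a standard fixed by the article.

```python
import math

def lidar_to_cartesian(scans):
    """Convert LIDAR returns from (azimuth_deg, elevation_deg, range_m)
    into (x, y, z) points in the vehicle frame: x forward, y left, z up.

    Works for both sweeping scanners (azimuth covers a full 360 degrees)
    and forward-facing planar scanners (elevation fixed, azimuth limited
    to a frontal arc).
    """
    points = []
    for az, el, rng in scans:
        az_r, el_r = math.radians(az), math.radians(el)
        x = rng * math.cos(el_r) * math.cos(az_r)  # forward component
        y = rng * math.cos(el_r) * math.sin(az_r)  # leftward component
        z = rng * math.sin(el_r)                   # height component
        points.append((x, y, z))
    return points
```

A return dead ahead at 10 m maps to (10, 0, 0); one at 90 degrees azimuth and 5 m maps to a point 5 m to the vehicle's left.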

The fourth sensing technology is satellite positioning: GPS or, more generally, global navigation satellite systems (GNSS), which include the European Galileo system among others. GNSS on its own cannot locate a vehicle with sufficient accuracy, so it relies on additional sensing systems to achieve centimetre-level positioning.
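One common way to combine GNSS with on-board sensing is to blend the satellite fix with a dead-reckoned estimate from wheel odometry and inertial sensors. The toy complementary filter below illustrates the idea only; real vehicles use Kalman filtering and map matching, and the weighting here is an arbitrary assumption.

```python
def fuse_position(gnss_fix, dead_reckoned, gnss_weight=0.2):
    """Blend a noisy, metre-level GNSS fix with a smooth dead-reckoned
    position estimate from wheel odometry and inertial sensors.

    Both positions are (easting_m, northing_m) tuples in the same local
    frame. A small gnss_weight means the smooth dead-reckoned track
    dominates moment to moment, while GNSS slowly corrects its drift.
    """
    return tuple(g * gnss_weight + d * (1 - gnss_weight)
                 for g, d in zip(gnss_fix, dead_reckoned))
```

If the GNSS fix jumps 10 m east of the dead-reckoned position, the blended estimate moves only 2 m, damping the jump while still pulling the drifting odometry back toward truth over repeated updates.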

GNSS has the advantage of being available everywhere, even within built-up city centres. Different systems use different combinations of sensors. For example, some research groups are relying on advances in computer vision and cameras, matching images against a bank of previously registered images taken in all weather conditions, from snow to sun.
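That image-matching approach can be sketched with the simplest possible matcher: compare a query image's histogram against a bank of histograms registered per location, captured in varied weather, and return the best match. Histogram intersection stands in here for the far richer learned descriptors real systems use; the data layout and function name are assumptions for illustration.

```python
def histogram_match(query_hist, bank):
    """Match a query image histogram against a bank of previously
    registered histograms (one per named location) and return the name
    of the best-matching location.

    Histograms are equal-length lists of bin counts. Similarity is
    histogram intersection (the sum of per-bin minimums), a cheap and
    reasonably illumination-tolerant comparison.
    """
    def intersection(a, b):
        return sum(min(x, y) for x, y in zip(a, b))
    return max(bank, key=lambda loc: intersection(query_hist, bank[loc]))
```

In practice each location would hold several registered histograms, one per weather condition, with the query compared against all of them.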

Jeremy Morley explains: “Some GNSS-based systems start with digital mapping but employ vehicle sensors for precise location detection and collision avoidance. Other systems take computer vision and artificial intelligence as their starting point. All systems have to be capable of operating in the event of communications failure, for example where there are tall buildings.

“However, the full advantages of GNSS can be best realised in connected mode where vehicles can receive updates about road works and so on. The vehicles may also intercommunicate on local links about specific road conditions, for example where a vehicle traction system detects black ice.”