Gesture ‘recognition’ could improve automotive safety

Paul Boughton

Not too long ago, when the instrument panel on a popular car consisted of just five or six instruments and five or six auxiliary (secondary) controls to operate the radio and heating system, the idea of making a hand gesture in a designated space to operate one of these controls would rightly have been seen as an unnecessary extravagance at best.

Today, driver workload is already heavy in modern cars, with an ever-increasing number of vehicles on the road. This is compounded by a constant stream of new auxiliary devices such as navigation systems, active safety systems, nomadic devices (personal digital assistants and mobile telephones, for example), advanced telematics systems and infotainment systems. The potential for distraction can clearly be seen in the comparison between the Jaguar Mk V Saloon of 1948-51 (Fig. 1) and the 2007 model year Jaguar XJ (Fig. 2).

Consider a driver performing a specific in-car task. In some cases a single glance at the control is sufficient, but in other cases more than one glance is required. Glance times typically range from 0.6 s to 1.6 s, with a mean glance time of about 1.2 s. A car moving at 48 km/h (30 mph) will travel over 15 m (50 ft) in that time, so it is no surprise that driver distraction is now one of the major causes of road accidents.
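The distance figure follows directly from speed multiplied by glance time; the short sketch below is purely illustrative (the variable names are ours, not from the article) and simply reproduces the arithmetic behind the 15 m claim.

```python
# Illustrative check of the distance travelled during a single glance.
# Values are those quoted in the text; names are ours.

speed_kmh = 48.0        # vehicle speed in km/h (30 mph)
mean_glance_s = 1.2     # mean glance time in seconds

speed_ms = speed_kmh * 1000.0 / 3600.0   # ~13.3 m/s
distance_m = speed_ms * mean_glance_s    # distance covered while eyes are off the road

print(f"{distance_m:.1f} m travelled during a {mean_glance_s} s glance")
# -> 16.0 m, consistent with the 'over 15 m (50 ft)' figure above
```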

In 1999, the Commission of the European Communities recognised the potential dangers and issued a recommendation on safe and efficient in-vehicle information and communication systems.

In an attempt to reduce distractions, many car manufacturers have introduced menu-driven systems controlled by touchscreens or central controllers. Voice recognition control systems have also been introduced.

Gesture recognition systems have the potential to offer substantial safety benefits, since commands can be made without taking the driver's eyes off the road.

Several manufacturers are now carrying out research to develop new gesture-operated interfaces using camera-based systems that utilise image recognition software. However, this research has identified several inherent difficulties associated with in-car camera-based gesture recognition systems, such as adapting to uncontrolled variations in lighting, maintaining accuracy against dynamic backgrounds, and achieving real-time operation. Nonetheless, the author believes that there is a reasonable probability that gesture recognition technologies will be in widespread use in numerous automotive HMI applications by 2020.

Engineers at Jaguar Technical Research have considered the potential advantages of gesture recognition systems but are well aware of the limitations of camera-based systems. So instead, the engineers are pursuing an alternative sensor-based approach that has less technical complexity yet can achieve the same safety benefits. Sensor-based gesture recognition technology largely builds on pioneering work undertaken at the Massachusetts Institute of Technology in the USA. Low-frequency electric field sensors are safe, do not require line of sight, offer fast response times and high resolution, consume little power and are low cost. A human hand entering the path between the transmit and receive electrodes causes a change in the displacement current measured at the receive electrode; this change can be used to pass control inputs to an HMI.
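As a purely illustrative sketch of how such sensor readings might be turned into commands (this is not Jaguar's or MIT's implementation; the two-channel layout, threshold and function names below are our assumptions), a controller could classify a left or right swipe by comparing when the disturbance peaks on two receive electrodes:

```python
# Hypothetical sketch: classifying a left/right swipe from two electric-field
# receive channels. Channel layout, threshold and names are assumptions for
# illustration only, not the system described in the article.

from typing import List, Optional

def detect_swipe(left_channel: List[float],
                 right_channel: List[float],
                 threshold: float = 0.2) -> Optional[str]:
    """Return 'swipe_right', 'swipe_left' or None.

    Each list holds the normalised disturbance in displacement current
    sampled over one gesture window; a hand near an electrode raises
    the disturbance on that channel.
    """
    left_peak = max(range(len(left_channel)), key=lambda i: left_channel[i])
    right_peak = max(range(len(right_channel)), key=lambda i: right_channel[i])

    # Ignore windows where neither electrode saw a meaningful disturbance.
    if left_channel[left_peak] < threshold and right_channel[right_peak] < threshold:
        return None

    # The swipe direction is inferred from the order in which the electrodes peak.
    if left_peak < right_peak:
        return "swipe_right"   # hand passed the left electrode first
    if right_peak < left_peak:
        return "swipe_left"
    return None

# Example: the left channel peaks first, so the hand moved left to right;
# the HMI could map this to, say, 'next radio preset'.
print(detect_swipe([0.1, 0.6, 0.3, 0.1], [0.1, 0.2, 0.7, 0.3]))  # swipe_right
```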

Carnegie Mellon University is developing the iWave gesture recognition system in collaboration with, and funded by, General Motors. The primary objective was to create an innovative human-car gesture interface to support information or entertainment goals without compromising safety. The initial interface was designed and tested in a driving simulator with 18 subjects using one-handed gestures in front of the centre console.

The Institute for Human Machine Communication at the Technical University of Munich has carried out a research study in collaboration with BMW to evaluate differences in driver distraction while controlling different input interfaces. In this study, haptic (touch) and gesture input modes were compared with regard to distraction from a control task similar to steering a car.

Meanwhile, DaimlerChrysler, together with Visteon, is also funding the Georgia Institute of Technology to develop a system, called the Gesture Panel, that allows a driver to control secondary devices using simple gestures.

The Daewoo Motor Company is collaborating with the University of Dundee to develop a non-contact pointing interface for control of non-safety-critical systems inside a vehicle, with the aims of improving safety, decreasing manufacturing cost and improving the ease of driver migration between different cars.

Renault Research Department has also collaborated with the Université de Bretagne-Sud in Vannes, France, to evaluate the performance of subjects using a small gesture touchpad interface versus conventional rotary controls to execute given tasks.

Toyota’s Compact Sports and Speciality (CS and S) concept car made its debut at the Frankfurt Motor Show 2003 featuring the company’s ‘Space Touch It’ concept. This is an integrated multimedia interface system operated by a series of holographic projections that the user ‘touches’. Spheres of information appear to float in space but, when touched, they allow the user to operate the vehicle’s secondary controls.

Carl Pickering is with Jaguar and Land Rover Technical Research, Coventry, UK.

"