
Aeroplane cockpits would be safer if more human-friendly

Posted on 7 January 2004

Aircraft could achieve an even higher level of safety if cockpit designers took more of the psychological characteristics of pilots into account, according to researchers who studied one of Britain’s worst air disasters.

Although the air accident rate has fallen steadily over recent decades, many modern aircraft, like the Boeing 737 involved in the Kegworth air crash, which killed 47 people in 1989, have computerised control systems so complex that they can over-tax the mental capabilities of even fully trained pilots, say the researchers.

The team, from the University of York and the University of Newcastle upon Tyne, report their findings in the January edition of the International Journal of Human-Computer Studies*.

They say that, during emergencies, pilots are overloaded with technical information which they are unable to process quickly enough. This can lead to wrong decisions being made at crucial moments, decisions that cannot then be reversed.

The team, including Dr Gordon Baxter of York’s Psychology Department, analysed the disaster in which a British Midland aeroplane bound for Belfast crashed onto the M1 near the village of Kegworth in Leicestershire on 8 January 1989.

The researchers, who carefully studied the Air Accidents Investigation Branch report, say the British Midland flight crew made crucial decisions based on an ‘over-simplified picture of reality’. During the incident the flight crew responded to a mechanical fault in the left engine by mistakenly shutting down the right engine. They believed they had made the correct decision because this action coincided with a cessation of vibrations and fumes from the left engine. They then sought to make an emergency landing at East Midlands Airport.

During the approach, however, the crew lost power in the left engine and a fire started. The crew attempted to restart the right engine but they were too late and the plane crash-landed half a mile from the runway.

“People generally control complex dynamic systems, like an aircraft, by monitoring data signals and responding accordingly,” said Dr Baxter. “In such systems the amount of data generated is often too large to be comprehensively monitored and responded to in a timely manner. So what people do is generate a simplified mental representation of the real world, based on their experience, and use this as the basis for deciding what to monitor. When things start to go wrong, however, the limits of this mental model get exposed. In the Kegworth case, the pilots exhibited confirmation bias: they came up with a theory that would explain what was happening to the plane, then gave greater emphasis to the data that confirmed this theory and overlooked data that might have led them to reject it.”
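As a loose illustration of the confirmation-bias pattern Dr Baxter describes (this is not code from the study; the signal names, weights and numbers are invented), a toy model might over-weight confirming evidence and largely discount disconfirming evidence when updating belief in a hypothesis:

```python
# Illustrative sketch only: a toy model of confirmation-biased monitoring.
# The evidence labels and weighting parameters below are invented.

def biased_belief(prior: float, observations: list[tuple[str, bool]],
                  confirm_weight: float = 0.9, disconfirm_weight: float = 0.2) -> float:
    """Update belief in a hypothesis, over-weighting confirming evidence.

    Each observation is (label, supports_hypothesis). A neutral observer
    would weight both kinds of evidence equally; here disconfirming
    evidence is largely discounted, mimicking confirmation bias.
    """
    belief = prior
    for _label, supports in observations:
        if supports:
            belief += confirm_weight * (1.0 - belief)   # pulled strongly towards certainty
        else:
            belief -= disconfirm_weight * belief        # only weakly pulled back
    return belief

# Hypothesis: "the fault is in the right engine"
evidence = [
    ("vibration eased after throttling back", True),    # coincidental confirmation
    ("fumes ceased", True),
    ("left-engine vibration gauge still high", False),  # contrary cue, overlooked
]
print(round(biased_belief(0.5, evidence), 2))  # belief remains high despite the contrary cue
```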

Historically, aircraft had cables and mechanical linkages connecting the cockpit with parts such as the flaps, slats, elevators and rudder, meaning the pilot was physically linked to the machine.

From the late 1980s, however, many aircraft became ‘fly-by-wire’, meaning that pilots modify the configuration of the aircraft via electrical and hydraulic systems rather than direct mechanical linkages.

“The authority for making some of the decisions has been passed to the computer system, whilst the burden of responsibility for those decisions still falls on the pilot,” said Dr Baxter. “Although some of the tasks performed by the computers have helped to reduce the pilot's workload, they have paradoxically also increased it, because pilots now have to manage the computer systems in addition to the other tasks associated with flying the aeroplane.”

A 1996 Federal Aviation Administration report noted that pilots have to train extensively partly because they must adapt themselves to cockpits, and argued that the reverse should be the case: cockpits should be designed around pilots.

Dr Baxter added: “One of the characteristics of experts, such as experienced pilots, is that they can spot and overcome errors before they lead to accidents. This management of errors is more difficult when the computers make decisions or perform actions that are not expected by the pilots. The onboard computer systems should be designed to facilitate the management and recovery of errors by anticipating potential problems where possible.”
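A rough sketch of the design principle Dr Baxter describes (purely illustrative; the class, mode names and alerting behaviour below are assumptions, not any real avionics interface) might have the automation surface unexpected mode changes to the pilot rather than acting silently, so errors can be spotted and recovered early:

```python
# Illustrative sketch only, not an avionics implementation. It shows the idea
# of exposing automation actions that differ from what the pilot expects.
# All names (Autothrottle, expected_mode, alerts) are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Autothrottle:
    expected_mode: str = "MANUAL"      # what the pilot believes is engaged
    actual_mode: str = "MANUAL"        # what the automation has actually engaged
    alerts: list = field(default_factory=list)

    def automation_changes_mode(self, new_mode: str) -> None:
        """The computer changes mode on its own (e.g. a protection function)."""
        self.actual_mode = new_mode
        if new_mode != self.expected_mode:
            # Surface the mismatch instead of acting silently, so the pilot
            # can notice and manage the unexpected behaviour early.
            self.alerts.append(f"MODE CHANGE: {self.expected_mode} -> {new_mode}")

    def pilot_acknowledges(self) -> None:
        self.expected_mode = self.actual_mode
        self.alerts.clear()

at = Autothrottle()
at.automation_changes_mode("THRUST LIMIT")
print(at.alerts)   # ['MODE CHANGE: MANUAL -> THRUST LIMIT']
at.pilot_acknowledges()
print(at.alerts)   # []
```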

The Newcastle and York research team is part of a six-year, £8m project. It is sponsored by the Engineering and Physical Sciences Research Council, is led by Newcastle University and is looking at the dependability of large computer-based systems.

* Besnard, D., Greathead, D. & Baxter, G. (2004). When mental models go wrong: co-occurrences in dynamic, critical systems. International Journal of Human-Computer Studies, 60, 117-128. Available via email (Word file) from Newcastle University press office or by fax (seven pages without refs.)

Notes to editors:

  • Fifty researchers are taking part in the DIRC project, which is led by Newcastle University. They come from five universities (Newcastle, York, Edinburgh, Lancaster and City). Other projects have included designing and testing a computerised parcel tracking system, examining ways of profiling the skills of IT specialists, and providing advice for doctors caring for premature babies. See http://www.dirc.org.uk/
  • Dr Gordon Baxter is a research fellow working in the Human Computer Interaction Group in the Department of Psychology. His general area is human factors, with particular interests in expertise, decision making and human error, mainly with reference to complex dynamic systems (aeroplanes, power plants, chemical plants, hospital departments and so on).
  • The University of York’s Psychology department has a 6* rating and scored top marks of 24/24 for teaching quality.

Contact details

David Garner
Senior Press Officer

Tel: +44 (0)1904 322153