
When the User Interface Really Matters

I recently had a chance to talk shop with an interesting fellow, a retired Boeing test pilot. It turns out that test pilots don't just test fly planes – in modern aviation they are involved in testing all aspects of the plane, including the user interface.

Yes, modern aircraft have user interfaces. With all of the complex navigation, health and welfare, and fly-by-wire systems, there is plenty of computing power in the cockpit. Things can happen fast, and there is a strong need for an effective user interface. The test pilot in question – Fred – had worked on the V-22 Osprey vertical takeoff and landing aircraft, and had some comments and insights that resonated with me in my work on medical systems interfaces.

Fred related how the user interface systems – displays, alerts, and warnings – had been developed by systems and avionics experts, while input from the test pilot community (the users) had been brushed aside. The systems and issues were complex, the thinking went, and the pilots just didn't understand what these systems needed to handle.

True, the V-22 Osprey is a complex beast, with twin rotor/prop/turbine assemblies mounted in pivoting nacelles at the wingtips, allowing both vertical takeoff and rapid horizontal flight. The systems required to run these components are generations away from the stick-and-rudder cable systems used since the days of the Wright Brothers. And when something goes wrong in flight, the situation can degrade in a hurry. So the engineers built in a sophisticated set of limits, tolerances, alerts, and alarms to let the pilot know what was going on. These systems turned out to hinder, rather than help, the pilot in an emergency.

There have been several widely publicized crashes in the program, which threatened the future of this multi-billion-dollar weapons system. Fred was involved in the analysis of one of these incidents, and related how the team was initially puzzled that the aircraft crashed in a situation where the proper steps could have avoided a catastrophe. Upon deeper analysis they realized that there was a fundamental user interface problem.


In this complex aircraft, all of the systems are interrelated. When a component fails, cascading alarms can disguise the problem and distract the pilot to the point where the proper course of action is unclear. When the whole cockpit goes red and starts beeping, do you feather the number one engine or switch to the backup electrical system? There is no time to run diagnostics – you had better do the right thing right now. In a situation like that, even the right stuff is not enough – with so many interconnected, cascading failures, you need help to know what to do.
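To make that concrete in software terms: one way out of a cascade is to encode which faults can cause which others, and surface only the alarms that nothing else explains. Here is a minimal sketch in Python – the fault graph and subsystem names are my own invention, in no way the Osprey's actual avionics:

    # A toy fault dependency graph: each alarm maps to the upstream
    # faults that can cause it. All names here are invented.
    FAULT_DEPENDENCIES = {
        "GEN_1_OFFLINE": [],
        "HYD_PRESSURE_LOW": ["GEN_1_OFFLINE"],
        "FLIGHT_CONTROL_DEGRADED": ["HYD_PRESSURE_LOW"],
        "NAV_DISPLAY_FAULT": ["GEN_1_OFFLINE"],
    }

    def root_causes(active_alarms):
        """Return only the alarms not explained by another active alarm."""
        active = set(active_alarms)
        return {alarm for alarm in active
                if not any(cause in active
                           for cause in FAULT_DEPENDENCIES.get(alarm, []))}

    # Four alarms fire at once, but only one calls for action:
    print(root_causes(["FLIGHT_CONTROL_DEGRADED", "HYD_PRESSURE_LOW",
                       "NAV_DISPLAY_FAULT", "GEN_1_OFFLINE"]))
    # -> {'GEN_1_OFFLINE'}

The pilot sees one actionable problem instead of four competing ones.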

Plus, the alerts in the Osprey systems were tuned too sensitively, such that in some cases eight out of ten alarms were false. After a while, the pilot ignores the alarms, or tapes them over, until one day a real one gets him killed.
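One common mitigation – and this is a generic technique I'm sketching, not anything from the Osprey program – is to require a reading to stay out of limits for several consecutive samples before raising the alarm, so a single noisy sample doesn't light up the cockpit:

    def debounced_alarm(samples, limit, required_consecutive=3):
        """Yield an alarm state per sample; trip only after the value
        has exceeded `limit` for `required_consecutive` samples in a row."""
        streak = 0
        for value in samples:
            streak = streak + 1 if value > limit else 0
            yield streak >= required_consecutive

    # A lone spike (103) no longer trips the alarm; a sustained
    # excursion (104, 105, 106) does. Values are invented.
    readings = [98, 103, 97, 99, 104, 105, 106, 99]
    print(list(debounced_alarm(readings, limit=100)))
    # -> [False, False, False, False, False, False, True, False]

The trade-off, of course, is detection latency – acceptable for a slowly drifting pressure, dangerous for something that needs instant attention, which is exactly why prioritization matters.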

As designed by the engineers, each with their own sub-system to manage, the user interface correctly communicated the status of each sub-system. But that wasn't enough – often there were too few clues to tell the pilot the root cause, and thus what action to take. Based on this realization, the test pilots lobbied for and got a redesign of the pilot user interface – a process that set the program back many months.

A conclusion from the official report on one of the V-22 Osprey accidents:

Conclusion: The North Carolina mishap identified limitations in the V-22 Program's software development and testing. The complexity of the V-22 flight control system demands a thorough risk analysis capability, including a highly integrated software/hardware/pilot-in-the-loop test capability. Integrated pilot-in-the-loop means that testing of the flight control systems with the pilot as a participant is an unidentified need.

The redesign team focused on several principles (a sketch of an alert model built on them follows the list):

- Prioritize failure modes – everything isn't of equal weight and seriousness.
- Avoid information overload – too many alerts are worse than no alerts.
- Think like a pilot – situational awareness (what is happening) is the key.
- Use graphics to communicate rather than distract.
- Layer levels of detail – that is more effective than putting everything on the surface.
- Build more smarts into the alerts, to show root or actionable alerts rather than cascading failures.
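Here is a minimal sketch of what an alert model shaped by those principles might look like – the severity levels, fields, and display cap are my own illustration, not the actual V-22 redesign:

    from dataclasses import dataclass, field
    from enum import IntEnum

    class Severity(IntEnum):
        ADVISORY = 1
        CAUTION = 2
        WARNING = 3   # highest: immediate action required

    @dataclass(order=True)
    class Alert:
        severity: Severity
        message: str = field(compare=False)              # what is happening
        action: str = field(compare=False)               # what to do about it
        detail: str = field(compare=False, default="")   # layered, shown on demand
        is_root_cause: bool = field(compare=False, default=False)

    def surface(alerts, cap=3):
        """Show only root-cause alerts, worst first, capped to avoid overload."""
        roots = [a for a in alerts if a.is_root_cause]
        return sorted(roots, reverse=True)[:cap]

    alerts = [
        Alert(Severity.CAUTION, "Nav display fault", "See generator alert"),
        Alert(Severity.WARNING, "Generator 1 offline", "Switch to backup bus",
              detail="Bus voltage zero; breaker open", is_root_cause=True),
    ]
    for a in surface(alerts):
        print(f"{a.severity.name}: {a.message} -> {a.action}")
    # -> WARNING: Generator 1 offline -> Switch to backup bus

Every alert carries both a situation and an action, detail stays a layer down until asked for, and the cascade never reaches the surface.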

What are the lessons for us groundling systems designers? I take away a few. I think of error messages that are technically correct but don't tell the user how to fix the problem. I think of cluttered screens where every inch of screen real estate is used and abused. I think of alerts that fire so often they become routine. I think of systems designed in pieces, without a thought for how the overall user experience fits together.
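For that first lesson, the fix can be as simple as pairing what went wrong with what the user can do about it. A small sketch – the exception type and wording are mine, not from any particular system:

    class ActionableError(Exception):
        """An error that tells the user what happened and how to fix it."""
        def __init__(self, what, why, fix):
            super().__init__(f"{what} ({why}). To fix: {fix}")

    # Technically correct, but leaves the user stranded:
    #     OSError("errno 13")
    # Actionable:
    try:
        raise ActionableError(
            what="Could not save the report",
            why="no write permission on the output folder",
            fix="choose a different folder or ask an administrator for access")
    except ActionableError as err:
        print(err)
    # -> Could not save the report (no write permission on the output
    #    folder). To fix: choose a different folder or ask an
    #    administrator for access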

Fred left the program during the redesign, so he couldn't relate the happy ending. But as the Osprey enters service with the armed forces in significant numbers, I wonder how many lives will be saved simply because of a more effective pilot user interface.

Posted on Wednesday, July 5, 2006 at 08:12PM by Larry Cone
