Human Capabilities in Design
Nigel Eames

People and computers have quite different, often diametrically opposite, capabilities. With a good interface, human and computer should augment each other to produce a system that is greater than the sum of its parts. Computer and human share responsibilities, each performing the parts of the task that best fit its capabilities. The computer enhances our cognitive and perceptual strengths and counteracts our weaknesses. People are responsible for the things the machine cannot or should not do. Designing simultaneously for both human and computer capabilities requires trade-offs and compromise. Unfortunately, concessions in system design are often made at the expense of the person, because people are excellent at adapting to suboptimal situations, whereas computers are inflexible and their performance does not degrade gracefully.

"Creativity, adaptability, and flexibility are human strengths." (Norman 1990). Indeed, it is remarkable how people can accommodate to and even enjoy the very worst interfaces, to add at the expense of training time, inefficiencies, and error. Just consider how many people still use and how many programmers swear by the UNIX interface. Although even our best interfaces require people to adapt to some extent to the computer, designers have to recognize critical areas where the computer has to support human capabilities and augment our weaknesses. When people do have to adapt to the computer, the computer should assist that process.

Some aspects of a system interface must be designed to accommodate particular human characteristics, for neglecting to do so would seriously compromise the effectiveness of the human-computer system. Failing to account for the certainty of human error in a human-computer system is a recipe for disaster. Interfaces should be constructed to minimize both the number and the effects of errors. Human knowledge is incomplete, difficult to acquire, and hard to remember. Since all systems require learning, we should support this process with high-quality training programs, documentation, active exploration, and on-line interactive help. A surprising number of people have some kind of physical or mental disability. An interface can block such individuals from using a computer, or it can minimize and even overcome the effects of these disabilities.

The consequences of human error range from minor annoyance and frustration to loss of productivity, loss of profits, and loss of life. In our search for blame, we fault ourselves for our own errors, we laugh when we see others make petty mistakes, and we indict people for serious errors. But who is really at fault? Our propensity to err is a fact of life, and good system design should predict, accommodate, and minimize the effects of errors. Maybe "human error" is a misnomer, an artifact arising from system design that does not account for natural human behavior. Researchers suggest that we should not search for blame when so-called human error occurs. Instead we should regard failure as a breakdown in design that should be identified and remedied (Denning 1990).

Norman and Denning argue that computer professionals should have mechanisms for reporting and publishing accounts of breakdowns, and for creating standards that minimize human errors. Interface professionals should honor standards, alter them when recurrent breakdowns are observed, and admit and learn from their mistakes. This is beginning to happen at the Internet grass-roots level, where people post and discuss usability problems (e.g., in the newsgroup comp.human-factors).

To design error-resistant systems, we must first understand what causes errors. Errors are divided into two broad categories: mistakes and slips. A mistake occurs when we have an incorrect mental model of what will happen when we do something. We consciously believe our action will meet our intention, but the result is not what we expected. With a slip, we intend to do the correct action, but somehow our performance gets misdirected and a different action is substituted. Slips usually occur in skilled behavior when we are not paying attention; we often notice the error too late. Lewis and Norman further categorize these errors and proceed to suggest methods for minimizing the possibility of errors or misunderstandings, methods for detecting errors, and methods for correcting errors (Lewis and Norman 1986).
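
To make the distinction concrete, here is a minimal Python sketch (the delete_file routine and the .trash holding folder are hypothetical, not drawn from the readings) of three tactics in the spirit of Lewis and Norman's suggestions: checking the precondition an incorrect mental model would get wrong, confirming a destructive action to catch a slip before it happens, and keeping the action reversible so a late-noticed error can still be corrected.

    import os
    import shutil

    TRASH_DIR = ".trash"  # hypothetical holding area so deletions remain reversible

    def delete_file(path, confirm=input):
        """Sketch: prevent, detect, and allow correction of a destructive action."""
        # Detect a mistake: the user's mental model may say the file exists when it does not.
        if not os.path.exists(path):
            return f"'{path}' does not exist; check the name and try again."
        # Guard against a slip: require explicit confirmation before acting.
        if confirm(f"Really move '{path}' to {TRASH_DIR}? (y/n) ").strip().lower() != "y":
            return "Nothing was deleted."
        # Keep the action correctable: move to a trash folder instead of erasing outright.
        os.makedirs(TRASH_DIR, exist_ok=True)
        shutil.move(path, os.path.join(TRASH_DIR, os.path.basename(path)))
        return f"'{path}' was moved to {TRASH_DIR} and can still be restored."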

There are other good sources for reading about error in human-computer systems that develop and apply the ideas in the Lewis and Norman reading to usability heuristics. This section of human factors deals with errors, namely error prevention: make it impossible or very unlikely for the user to make mistakes; help users recognize, diagnose, and recover from errors; and provide good error messages that let the user know what the problem is and how to correct it. While preventing human error is the ideal, it is impossible to eliminate errors entirely, so the focus should be on controlling their effects. One method is to produce understandable error messages; therefore, error message wording is important.
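
As a small, hypothetical illustration of that wording advice, the Python sketch below validates a page-range entry (a made-up setting) and, when it fails, states what is wrong and how to correct it rather than emitting a bare error code.

    def parse_page_range(text, last_page):
        """Return (start, end) or raise ValueError with a message that explains the fix."""
        try:
            start, end = (int(part) for part in text.split("-"))
        except ValueError:
            # Not "Error 37: bad input" -- say what is wrong and how to correct it.
            raise ValueError(f"'{text}' is not a page range. "
                             f"Enter two numbers separated by a dash, for example 2-14.")
        if not (1 <= start <= end <= last_page):
            raise ValueError(f"Pages {start}-{end} fall outside this document, "
                             f"which has pages 1-{last_page}. Choose a range within 1-{last_page}.")
        return start, end

    print(parse_page_range("2-14", last_page=20))   # -> (2, 14)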

Users require constant reminders of computer syntax. Computer users must remember a wide variety of interface- and device-dependent details. Command-based systems were notorious for this; people had to remember particular commands and their parameters. GUIs have lessened, but not eliminated, the need to know computer syntax. Windows still have to be adjusted, menus traversed, icons understood, and shortcuts remembered. Quick reference guides or cheat sheets are common aids, reminding people of the system's offerings as well as the correct incantations to invoke actions (Norman 1990).
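
One way to lighten this recall burden, sketched below in Python as an assumption rather than anything taken from the readings, is for the system itself to volunteer the nearest correct incantation when the user's syntax is slightly off; the standard difflib module supplies the fuzzy match.

    import difflib

    KNOWN_COMMANDS = ["copy", "move", "delete", "rename", "print"]  # invented command set

    def run(command):
        """Suggest the closest known command instead of forcing exact recall."""
        if command in KNOWN_COMMANDS:
            return f"running '{command}'"
        suggestions = difflib.get_close_matches(command, KNOWN_COMMANDS, n=1)
        if suggestions:
            return f"Unknown command '{command}'. Did you mean '{suggestions[0]}'?"
        return f"Unknown command '{command}'. Type 'help' to list the available commands."

    print(run("mvoe"))   # -> Unknown command 'mvoe'. Did you mean 'move'?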

Within this framework, there are four reasons people require training and help. Computer users need training and help in order to: understand their evolving role in an organization and the changing context in which they work, understand task-domain concepts, understand computer-domain concepts, and be aware of the computer syntax required to complete the interaction. A user-centered approach to help should be emphasized. Researchers argue that the information delivered to the user, and the way in which it is presented, should depend upon the kind of question the user is asking. Others articulate the process that people follow to solve problems with documentation. Before reading any help text, people must formulate the problem as a question and find the text section that may hold the answer. When reading the text, they have to understand it and, as a result, decide what action they should take. Finally, they execute the action and evaluate the outcome to see if the answer was correct (Carroll and Mack 1984).
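
The following Python sketch, with invented topics and help text, roughly illustrates the first idea: the help delivered depends on whether the user is asking what something is, how to do it, or why something happened, not just on the topic.

    # Invented help entries keyed by (topic, kind of question the user is asking).
    HELP = {
        ("margins", "what is"):  "Margins are the blank borders around the printed page.",
        ("margins", "how do I"): "Open the page-setup dialog and type new values in the margin fields.",
        ("margins", "why did"):  "Text reflows when margins change because line breaks are recomputed.",
    }

    def answer(topic, question_kind):
        """Match the help text to the user's question, not just to the topic."""
        text = HELP.get((topic, question_kind))
        return text or f"No help found for '{topic}' ({question_kind}); try rephrasing the question."

    print(answer("margins", "how do I"))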

Training, documentation, and help must also match the way people actually learn computer applications in the real world. Carroll and Mack (1984) studied human learning of complex materials in a realistic setting. Although the system learned by their subjects is an old text-based word processor, their observations on how people learn are still applicable to today's interfaces. In essence, people are active learners, and they teach themselves by trying things rather than only reading about them, developing and testing hypotheses about why the system operates the way it does, and applying their knowledge of their tasks and prior experiences to interpret their experiences. These researchers believe that people learn by doing, by thinking, and by knowing. Their observations explain the success of active exploration through good tutorials and through simplified, forgiving interfaces (for learning by doing); interface and training systems that promote good conceptual models (for learning by thinking); and appropriate reference documentation that helps people answer specific questions (for learning by knowing).



References Cited

Carroll, J., and Mack, R. 1984. Learning to Use a Word Processor: By Doing, by Thinking, and by Knowing. In Human Factors in Computer Systems, 13-51.

Denning, P. 1990. Human Error and the Search for Blame. Communications of the ACM 33, 6-7.

Lewis, C., and Norman, D. 1986. Designing for Error. In User Centered Systems Design, Lawrence Erlbaum Associates, 411-432.

Norman, D. 1990. Human Error and the Design of Computer Systems. Communications of the ACM 33, 4-5.



