
UNDERSTANDING THE HUMAN FACTOR


Bengt Schager, M.Sc.
Profil Schager & Co AB
Marine Profile Sweden AB
Marine Profile UK Ltd.
1998


Some 80 percent of maritime accidents are thought to be the result of "the human factor". This was the finding of a study of reported accidents conducted by a British marine insurance company a few years ago.

Within the maritime industry, references to such investigations often carry the implication that we have obviously made considerable progress in developing reliable technology, while much remains to be done in the training of officers and crew, i.e. the operators. The high percentage of human error on board ships has made the entire industry concerned about the quality of the people who run the ships.

Although these human errors, the so-called "human factor", are a major reason why accidents happen, it is still worth taking a critical look at the findings of such studies.

Differing Definitions
If about 80 percent of all accidents are caused by the human factor, what causes the remaining 20 percent? The most common interpretation is that most of the remainder is due to "technical errors", with only a negligible portion consisting of "unforeseeable factors". This division into human error, technical error and unforeseeable factors is not particularly useful, since the categories are merely complementary shares of the same total. Improvement or even perfection in technical systems would then mean that the proportion of human error increased as the proportion of technical errors decreased. In other words, 100 percent human error would simply mean that the technology was perfect.

Moreover, accident frequencies expressed as percentages indicate only relative change. If the total number of accidents in the global maritime industry is decreasing, a stable 80 percent share means that the absolute number of human-factor accidents is also falling - an actual improvement - while if the total number is increasing, the same stable share means a worsening.
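
To make the arithmetic concrete, here is a minimal sketch in Python, with figures invented purely for illustration, showing how a stable percentage share can conceal an absolute improvement:

    # Hypothetical totals of reported accidents world-wide in two successive
    # years; the figures are invented for this illustration only.
    totals = {"year 1": 1000, "year 2": 800}
    HUMAN_FACTOR_SHARE = 0.80  # the commonly cited 80 percent

    for year, total in totals.items():
        human_factor = total * HUMAN_FACTOR_SHARE
        print(f"{year}: {human_factor:.0f} human-factor accidents out of {total}")

    # The share is 80 percent in both years, yet the absolute number of
    # human-factor accidents falls from 800 to 640 - an actual improvement.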

The lack of a scientific definition of "the human factor" also makes it difficult to interpret the findings of such investigations. A review of the literature shows that the authors of the most widely read standard works on the subject have neither defined nor delimited the concepts they describe. It is therefore difficult to be certain that different investigators are analysing the same thing. Defining one's concepts is generally regarded as a prerequisite in scientific contexts. Within the maritime community, however, where people are prepared to legislate and make investments in order to deal with accidents, the lack of definitions makes it difficult to plan effective measures.

Since the human factor is treated separately, there is a risk of viewing technology as a physical fact, as if it were given by nature instead of being the product of the human mind. In line with such a view, technical design errors, errors in technical judgement or even shortcomings in technical maintenance are not always considered to be attributable to "the human factor". Once we separate technology from people, we also tend to diminish the responsibility of technicians in an accident, and place the blame on the operator.

We also tend to analyse accidents on the basis of the polarity between technology and operators. In doing so, we lose sight of the major reason for accidents, namely our limited ability to control and predict nature.

Man and Nature
In the context of most accidents, whether in the nuclear power industry, the transport industry or when buildings collapse, the laws of nature play the decisive role. It seems never to occur to anyone, however, to attribute such accidents to natural causes.

Nature is what it is; we are forced to accept it and to adapt ourselves to it. It is here we find the real polarity: between man and nature, i.e. between people and the laws of physics. These are the laws we seek to tame and to utilize with the help of technology. When things go wrong, we cannot blame physics. Instead we must analyse our own shortcomings, those of the operator and the technician, and look upon technology as a human product.

To the extent that we fail in nature and accidents occur, we ought to study the causes integrally, analysing the technology as thoroughly as we analyse the operator's handling of it. We must examine both the technician's and the operator's roles in the chain of events, as well as the compatibility between operators, technology and nature. The question is thus no longer whether a human factor is involved in an accident, but where in the chain of events the human factor is to be found.

When ice loosens from the wings and is sucked into jet engines, causing a crash, when a bow visor is torn off and a ship sinks, or when a nuclear power plant goes haywire and contaminates its surroundings, it is never the fault of nature or physics. All events leading up to an accident obey the laws of nature and develop logically. The problems lie in the weaknesses of our systems: often operator errors, sometimes technical shortcomings, nearly always inadequate foresight and knowledge. We should therefore expand the concept of "the human factor" to include the entire socio-technological system.

To complicate the matter further, we should also take into consideration the often overlooked fact that man himself is a product of nature. Human abilities - and weaknesses - should therefore be studied carefully. They should be identified, analysed, described and made known so that we can also learn to take them into account. We all know that when people become tired, their ability to concentrate suffers. We know that forgetfulness is a fact of life. We know very well about difficulties in interpersonal communications. We know that strong emotions affect our precision and that it is difficult to carry out numerous tasks simultaneously. This list could of course be made much longer, but the point is that we should strive to adapt technology both to man’s and to nature’s terms.

The Human Psyche
Why, then, is the term "the human factor" assigned solely to the operator while the people behind the technology are excused? To find the answer, it may be necessary to go back about a hundred years.

The human factor - how people function - belongs to the domain of psychology. This science in its present form has existed for only slightly more than a hundred years, and its body of knowledge has unfortunately developed relatively slowly, particularly compared with technology, which is also young but has accelerated much faster.

The unconscious is by far the greatest discovery in the history of psychology: the insight that the greater part of our psyche is beyond the reach of our consciousness, and that the main part of what could be called psychic energy exerts a strong yet unnoticed influence over our thoughts and actions. We owe this insight to Sigmund Freud.

Although Freud did not know the actual term "the human factor", it may nevertheless be attributed to him. He noticed that everyday errors did not always occur by chance but bore a deeper psychological message. They revealed part of a person’s psychic dynamics and could provide important information about unconscious influences on thoughts and actions.

In 1904 he published a book cataloguing, naming and exemplifying a number of human errors. His main purpose was to study the influence of the unconscious. At that time, errors were regarded as embarrassing, possibly amusing or even piquant. Technology had not yet developed so far that human errors needed to be taken seriously; they seldom caused damage greater than a misunderstanding, a broken vase or some embarrassment.

Freud’s mapping of human errors had a profound effect on the way people look at themselves and others. People began to be regarded as imperfect, with the potential for making mistakes, mixing things up, saying the wrong things, misinterpreting, forgetting things, misplacing things, misreading things, making written errors, and in the middle of an activity forgetting its purpose.

Such errors gradually acquired a more serious role, to the point of becoming life-threatening, as technology placed increasingly powerful forces in the hands of people.

Adapting People to Technology
At the same time as the shortcomings of the human psyche were being mapped out, technology was developing rapidly. Those who understood it regarded it as charming, wonderful and irresistible, while others were impressed, full of admiration and even worship of technical progress. Accidents happened, but they were regarded as inevitable.

A turning point seems to have come during the Second World War. Despite functional technical systems, airplanes crashed, bombers with modern sights missed their targets and technically superior weapons systems were defeated by inferior ones. As a result, psychologists on both sides of the front were called in to analyse the connections between man and machine. The result was in-depth studies of human errors and factors which affect the relationship between people and technology. The term "the human factor", however, was not coined until nearly a decade later.

The prevailing strategy to eliminate errors was to adapt people to technology. The means were education, training and experience, but studies from the Second World War showed that even well-trained, experienced operators could make mistakes. This gave rise to questions about which tasks were suitable for people and which were more suitable for technical solutions. The interface between operator and technology was given the highest priority.

Technical systems have gradually become more complicated, with the capacity to solve increasingly complex tasks. At the same time, reliance on human ability has diminished. The human factor is now the most common explanation for accidents and the operator is often regarded as the weakest link in the system.

To eliminate the role of human weaknesses, systems are nowadays even being designed to correct human error. Back-up systems are being developed to recognize anomalies and call them to the attention of the operator. Systems can even take over when required.

This development brings new human weaknesses to the surface. Lapses in operator attention due to boredom, monotony, day-dreaming, lack of stimulation and the like are today common causes of accidents. Technology is being assigned more of the responsibility for executing tasks that used to be assigned to people. The operator is thus becoming less of an actor and more of a monitor.

Another modern development is the use of computers to facilitate the operator’s work. Between the operator and the reality he manipulates is a computerized information system. This system is given the dual task of relaying the operator’s intentions to the technical system he is operating and relaying information about the state and reactions of the system back to the operator via sensors, monitors and displays. There are, of course, many advantages to such systems. They provide comfort and eliminate many physical risks for the operator, but they also have drawbacks.

Computerized support means that the operator no longer has direct physical contact with what he is operating. The information presented via displays and monitors is predetermined by the designers of the systems and is not freely chosen by the operator himself. Such information is often of poorer quality and has fewer dimensions than the direct contact with reality. One example within the maritime industry is the fact that the ship’s commander has no direct view of the ship’s sides and movements when docking, but gets his information via TV cameras and monitors.

Moreover, the systems are often so complicated that the operator does not fully understand them. This may make him uncertain, since as a rule he can neither determine whether the system is functioning satisfactorily nor judge when he should take hands-on action if it fails.

There seems to be a trend towards decreasing faith in human ability and increasing faith in technology. This can result in an artificial distinction between human factors and technical causes of accidents. The operator and the man-made technology are split up and analysed separately when accidents are investigated. In the future, there will hopefully be better integration between operator and technology, enabling them to interact better.

Adapting Technology to People
The operator should be regarded as a human information-processing system that uses available information to act in any given situation. The quality of the operator’s actions is determined by the quality and relevance of the information to which he has access. Action based on erroneous or bad information will, of course, be erroneous or bad. Conversely, action based on good and adequate information has a far greater chance of being appropriate.

Information can, of course, be defined in different ways. On the one hand, information is that which reaches us via our senses. On the other hand, information is "that which reduces uncertainty". As a processor of information, the human brain is far superior to anything else. No other system comes even close to human capacity. The way in which information is processed, however, is rather complicated and actually occurs in many steps.

It is not enough to receive information with our sensory organs; it must also be interpreted and evaluated. Education and experience place information in a context, providing the mental framework that determines what information we must seek, how we process it and how we interpret it. As an event unfolds, we also receive continuous feedback via our sensory organs and sensorimotor systems, enabling us to correct and adjust our actions as we go along.

It is therefore important to provide an operator with the best possible information and to give him the opportunity to choose it himself. We should also train operators carefully and help them to make use of experience. Furthermore, since people differ, we should select for critical tasks those who have good perceptiveness, generally high capacity, maturity and judgement.

Systems and Organizations
In order to achieve better socio-technological solutions, we should in the future adapt technical systems to the operator. It is difficult for a designer of technical systems to determine what information is relevant and how it should be presented to best utilize the operator’s training and experience. The technician, of course, designs on the basis of his own experience and knowledge, which are not normally the same as those of the operator.

The operator needs to have greater influence on technology and to play a role in determining what information he needs to do his job and how that information should be presented. He should also be given the opportunity to influence what tasks will be assigned to technical systems and what tasks he will perform himself. Increased integration between the operator and technology enables a holistic view of the system - a socio-technological system with man in the centre, aided by technology.

In order to map out the human factor, it is also necessary to study the social system to which the operator belongs. In spite of good information, long education and great experience, operators do make misjudgements, take wrong actions and cause accidents. The primary strategy against this has been to regulate the operator’s actions with instructions and rules. To put it simply, regulations are used to predetermine how all operators should act in a given situation.

As a result, one common approach to analysing the human factor has been to categorize errors of different types: skill-based errors, rule-based errors and knowledge-based errors.

This analytical method, which focuses solely on regulations and the operator, may be suitable for highly regulated activities. However, this method fails to take into account other circumstances and conceivable causes of accidents, thus limiting its usefulness.

Systems of rules differ considerably between different lines of business. The nuclear power industry and air traffic tend to be highly regulated, leaving comparatively little room for the operator to decide. The maritime industry is different, owing to tradition, diversity, travel times and the nature of the business. Regulations are fewer, which means greater latitude for the individual operator’s decisions. In other words, the demands on an operator are greater in the maritime industry than, for example, in aviation.

A further - and traditional - strategy against the negative effects of the human factor is to organize work so as to prevent accidents. The purpose of an organization is normally to involve several people interactively in the same operation, thereby minimizing the risk of an individual operator acting erroneously. A good organization means an efficient division of labour, where several operators handle the available information, evaluate it jointly, and observe and challenge each other's actions. This is based on the notion that several operators perceive more than one alone and have greater combined experience and knowledge. In this respect, aviation has made greater progress than the maritime industry, which is still largely traditional, hierarchical and authoritarian. There is, however, a noticeable increase in interest in teamwork, the pilot/co-pilot system and Bridge Resource Management, modelled on aviation.

Unfortunately, organization brings other human weaknesses to light. Communication problems may arise, as well as problems involving the distribution of responsibility and labour. Group-dynamic effects can also appear, with both positive and negative consequences for efficiency and the ability to act.

The latest development in maritime safety and the battle against the negative consequences of the human factor is to broaden the organization concept. The shortcomings of operators or various parts of the onboard organization were formerly regarded as the main sources of error. Today this view has been extended to include the entire onboard organization, the entire shore-based organization, the integration between both organizations, and the attitudes about safety found within the whole shipping organization, i.e. the entire socio-technological system.

IMO states this new view with great clarity in the preamble to the ISM Code, paragraph 6: "The cornerstone of good safety management is commitment from the top. In matters of safety and pollution prevention, it is the commitment, competence, attitudes and motivation of individuals at all levels that determine the end result."


Accepting Limitations
If we are to make further progress in eliminating the consequences of human shortcomings, we should agree on a definition of "the human factor". Until we can do so, we should at least catalogue human errors regardless of whether they originate with the operator, the organization, the legislator, the shipowner or the technician.

We must also learn more about our inherent psychological limitations, since this knowledge will enable us to take them into account when designing integrated systems. Increased knowledge about our limitations can also teach us to recognize situations in which they can have adverse effects, enabling us to deal with such situations with greater awareness.

Finally, we have the so-called unforeseeable events. These events are the fundamental, genuine expression of the human factor. We cannot prepare ourselves for unforeseeable events because, by definition, they lie beyond the reach of our thinking. This does not mean that we are at the mercy of chance; it merely means that our mental capacity is limited. Once they have happened, unforeseeable events often appear logical, inevitable and sometimes even rather foreseeable.

Unforeseeable events teach us more about the limitations of human thought and imagination than about the capriciousness of reality. In the short term, there is perhaps not much we can do about human intellectual limitations, but we can take greater precautions through careful risk analysis, and we can make our plans knowing that our psychological make-up has inherent limitations.

Good knowledge of our own limitations has, during the course of human history, probably had a decisive value for the survival of the individual as well as of mankind.

Bengt Schager

 



