
“Human Factors: understanding mistakes to prevent technological accidents. An application of the HFACS model to the Tuninter 1153 disaster (August 6th, 2005)”

Roberto Carella 1, Gianluca De Donno 2

1 University of Basilicata, [email protected]
2 University of Rome “La Sapienza”, [email protected]

Abstract. The term Human Factors refers to a multidisciplinary approach concerned with the interaction between people and their working environment, and with the ways in which people act within it. Starting from how people perceive the outside world and behave accordingly, the Human Factors approach deals with the design of the tools, products and systems with which humans interact. Its principles apply to the design of small objects, such as a simple pen or a computer mouse, as well as to very complex systems such as an aircraft or the Space Shuttle. The study of Human Factors is thus an approach born in an organizational working context; its key elements are the analysis of human activity and of inter-human relations, with the main aim of minimizing danger and the perverse combinations of risk and error that often, unknowingly, generate accidents. Human Factors is today a multidisciplinary approach that collects and applies knowledge from different sciences (medicine, engineering, psychology, sociology, the theory of organizations) with the aim of improving safety conditions at the different levels of the organizations involved, so as to prevent unexpected hazardous events. In aviation, the International Civil Aviation Organization (ICAO), through Circular 227, defines Human Factors as the study of people while performing their duties, of their inclusion in the workplace in a physical sense, and of their interaction with their working tools and the procedures they must follow. The objective of this approach is the pursuit of safety and efficiency; it is therefore an approach to safety that considers the whole organization fully involved in, and responsible for, an accident.

Keywords: Human Factors, Organization, Technology, Knowledge Management.


1 Human Factors in Aviation

In the history of aviation there is a long series of successes and failures, each of which has helped to improve flight and keep the dream alive. Aviation accidents have been, and continue to be, numerous, and a large percentage of them cause many deaths and considerable economic and social damage to the organizations involved. Before the Second World War, the majority of aircraft accidents were caused by structural failure of the aircraft, often due to bad weather conditions, and pilots frequently had to manage complex situations alone. Human Factors was then understood as a matter of individual inability to manage complicated situations: poor knowledge of flight and fatigue were among the elements that contributed to the occurrence of air disasters. It was after the Second World War that the structure of the aircraft was made more reliable thanks to technological interventions, and flight assistance improved with the intensification of communications between control towers and crews and with the analysis of the Flight Data Recorder, the famous black box.

In aviation, when studying the causes of technological and organizational failures, a substantial distinction is drawn between incidents and accidents. Incidents are less serious events; they occur frequently and must be considered, reported and investigated in order to avoid their recurrence. Accidents are less frequent events with disastrous consequences: victims, and individual, social and economic damage. At the first IATA Conference in Istanbul (1975) the awareness was consolidated that people, pilots and crew members, bear a strong responsibility in the dynamics of a plane crash; since then, training programs have been designed specifically to improve the skills and competencies related to group dynamics and to the situations that can arise during a mission. In this context arose Crew Resource Management, which aims to develop the non-technical skills of the crew. Accidents, in fact, are not only the result of technical or human error, but of several concomitant and concatenated causes that involve the whole organization. This approach aims to acknowledge unavoidable human error and to study, at the technical and practical level and at the front line, the shortcomings due to inefficiencies in procedures and to faulty communication, both vertical and horizontal, across organizations.

On 28 April 1988 something happened that changed the way all pre-flight activities are managed, culminating in the implementation, at the organizational level, of a series of practices and procedures geared towards total safety. On that date a Boeing 737-297 of Aloha Airlines took off from Hilo airport in Hawaii, but only a few minutes after take-off it was forced to make an emergency landing at Kahului Airport. A phenomenon never seen before had occurred: the top part of the fuselage had separated, causing an explosive decompression that forced the pilot to abort the mission, killed one person and injured many others. The National Transportation Safety Board (NTSB) concluded in its report that the cause was corrosion resulting from careless maintenance of the aircraft. It was then understood that bad maintenance programs, resulting from poor management of these activities, could have a devastating impact on aviation safety; for this reason, at the second IATA conference in Montreal in 1993, it was decided to launch new training programs, which took the name of Maintenance Resource Management and aimed at safety management at the maintenance level. It was recognized that human error is unavoidable, and deficiencies were studied at the technical and practical level, in procedures and in faulty communication, both vertical and horizontal, across organizations.

All these systems are based on the consideration that it is not only the individual who fails, but the whole system, in line with the considerations of James Reason, a distinguished psychologist and professor at the University of Manchester. Through the analysis of several disasters, in his book Human Error (1990) Reason introduced the theory of latent factors, according to which accidents are the result of a series of active factors (active failures) and latent factors (latent failures) in the system.

The former are generally committed by front-line operators; the latter are real organizational deficiencies, sometimes deliberate, decided by company management. In aviation, therefore, the Human Factors approach primarily seeks to increase levels of safety by trying to mitigate and avoid errors, not only through the study of behaviour and situations, but also by introducing and correcting certain internal processes and procedures in the organizations involved in aviation activities. Above all, people are the main element around which the framework of this approach is built: agents who interact with one another and who have to deal with a multitude of factors. The key issue is therefore to continuously implement and disseminate a “culture of safety” at all levels of an organization, in order to keep attention constantly high, which is the basis of complex systems such as airports, aircraft and the companies involved in aviation.

James Reason defines error as a deviation from a linear path that should lead to the achievement of a goal. Following the theories formulated by Rasmussen, he provides a classification of errors according to intention, action, context and outcome. With regard to intention, we can identify kinds of errors closely related to the desired objective. When the goal is not reached because of deviations from the predetermined initial path, there are errors due to distraction, or failures in execution, that Reason calls “slips” and “lapses”. Slips are easily recognizable, because we know our intentions and quickly realize the error; lapses can also be promptly identified, but they may be discovered only later. If the action plan is well executed but does not reach the goal, we are faced with a “mistake”, that is, a failure due to bad planning. Mistakes, however, are difficult to recognize, because sometimes we do not know the right path to follow to achieve a goal, and everything depends on the result. The end result can be positive or negative and is often linked to chance; we realize only at the end of a path whether we made a mistake or not.

The errors classified in the category of actions refer to the nature of the actions themselves. There are omissions, when a required action is not performed; intrusions, when actions are performed in a context but refer to other activities; repetitions, when various actions are repeated unnecessarily; cases in which actions are correct in themselves but refer to other contexts; misorderings, when right actions are performed in the wrong sequence; mistimings, when correct actions are not performed at the appropriate moment; and fusions, when several actions are merged together, so that the objective is not pursued.

Reason also identifies another category of wrong actions: voluntary ones, related to procedural situations. These are acts committed deliberately, either because a procedure seems unsuitable or because it is routinely violated in the knowledge that doing so will still lead to the goal. They are actions that the psychologist calls “violations” and that, despite being wrong, do not always produce a result different from the intended one; that is, they are not always wrong actions with negative outcomes. All these types of unsafe acts are what the theory of latent factors identifies as active and latent failures.

But if on the one hand this theory provides a clear way to interpret accidents in organized systems, on the other it says nothing about how to identify their real causes, that is, how to identify the active and latent factors. To overcome this limit, the U.S. Army Safety Center, the U.S. Air Force Safety Center, the NTSB and the Federal Aviation Administration, on the basis of the analysis of three hundred aircraft accidents in the U.S. Navy, built a model of investigation able to trace the various errors that take shape, at the organizational level, in a plane crash. The model is called the Human Factors Analysis and Classification System (HFACS; Douglas A. Wiegmann, Scott A. Shappell). It is a framework of easy application and understanding, divided into four sub-frameworks that analyze four levels of deficiency, starting from the one closest to the accident, namely the “unsafe acts”, and then moving up through the “preconditions for unsafe acts”, “unsafe supervision” and finally the “organizational influences”. Unsafe acts are what Reason calls active failures and are divided into “errors” and “violations”: in the HFACS model, errors are identified, respectively, as skill-based, judgment and perception errors, and violations as routine and exceptional. Once the active factors that caused the accident have been discovered, the model goes on to analyze the causes behind them, trying to understand, at a higher level, what happened or what influenced their creation. The preconditions look at environmental factors, including the technical and technological shortcomings of the system; at inadequate practices of the operators, with reference to the conduct of the crew and the preparation of the staff involved; and at the psychological and physiological conditions of the operators and their physical or mental limitations. Unsafe supervision tries to bring out misconduct or inappropriate decisions by those who are supposed to ensure compliance with certain procedures or activities: inadequate planning, failures to correct known errors, and violations committed by the supervisors themselves. At the last level of the investigation, but the first in the scale of the model, there are the “organizational influences”, related to processes, to the management of human, economic and material resources, and to the organizational climate: habits, practices, and incorrect ways of seeing and doing that contribute to the actual genesis of a probable accident. The HFACS model is an investigative tool that greatly helps in understanding why and how an accident happened; it provides several elements that seek to highlight active and latent failures and shows, in line with Reason's theory, that the presence of both kinds of factors is crucial. However, this model only investigates what has already happened, and so provides guidelines, suggestions and adjustments at the level of the processes or activities that are part of a complex organizational structure.
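To make the structure of the framework concrete, the following minimal Python sketch encodes the four levels and their sub-categories as rendered in this section. It is only an illustration of the classification just described, not part of any official HFACS tooling, and the exact labels vary slightly across HFACS publications.

from dataclasses import dataclass
from enum import Enum


class Level(Enum):
    """The four HFACS levels, from the one closest to the accident upwards."""
    UNSAFE_ACTS = 1
    PRECONDITIONS_FOR_UNSAFE_ACTS = 2
    UNSAFE_SUPERVISION = 3
    ORGANIZATIONAL_INFLUENCES = 4


# Sub-categories per level, as rendered in the text above.
CATEGORIES = {
    Level.UNSAFE_ACTS: {
        "skill-based error", "judgment error", "perception error",
        "routine violation", "exceptional violation",
    },
    Level.PRECONDITIONS_FOR_UNSAFE_ACTS: {
        "environmental/technological factors", "inadequate operator practices",
        "psycho-physiological conditions", "physical or mental limitations",
    },
    Level.UNSAFE_SUPERVISION: {
        "inadequate supervision", "planned inappropriate operations",
        "failure to correct known problems", "supervisory violations",
    },
    Level.ORGANIZATIONAL_INFLUENCES: {
        "resource management", "organizational climate", "organizational process",
    },
}


@dataclass
class Finding:
    """One factor identified by an investigation, placed in the framework."""
    level: Level
    category: str
    description: str

    def __post_init__(self) -> None:
        # A finding must use a category belonging to its declared level.
        if self.category not in CATEGORIES[self.level]:
            raise ValueError(f"{self.category!r} is not a {self.level.name} category")

Section 4 returns to this sketch to record the factors identified in the Tuninter case.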


2 Methodology

Given the relative paucity of research on Human Factors, we adopted for this study an exploratory case-study approach (Yin, 1994) with a qualitative research design that allows a detailed exploration of the topic (Eisenhardt, 1989). Data collection involved newspapers, organizational documents collected on the Internet, information bulletins, public-domain lists, and reports and releases issued by institutions.

3 Case Study: the accident of Tuninter 1153 (August 6th, 2005)

Aviation accidents have been, and continue to be, numerous, and a large percentage of them cause many deaths and much economic damage, despite increasingly sophisticated technologies and continuously improving flight practices. The Human Factors Analysis and Classification System, built on the analysis of three hundred aircraft accidents of the United States Navy (with the participation of civil and military organizations such as the U.S. Army Safety Center, the U.S. Air Force Safety Center, the NTSB and the FAA) and based on Reason's concepts of active and latent error factors, identifies four levels of shortcoming: unsafe acts, preconditions for unsafe acts, unsafe supervision, and organizational influences. The aim of this work is the application of the HFACS model to the case of a plane that crashed on August 6th, 2005 near the coast of Capo Gallo, Palermo (Italy): a tragic event, occurring at 13:37 UTC, which involved an ATR72 aircraft operating flight TUI 1153 with thirty-nine people on board and which caused the death of sixteen of them. The analysis shows that this was a typical accident involving people in a complex organizational context, who had to face numerous elements at the same time, in a situation that turned into tragedy. The official investigation was conducted by ANSV, the National Agency for the Safety of Flight, which is the investigating authority for the safety of civil aviation of the Italian State. ANSV not only investigated the technical factors that caused the event, but also identified a number of organizational errors. On the basis of the official report produced by the Italian agency, after presenting a description of what happened, with particular reference to the actions that led to the disaster, we investigate the event using the HFACS model, trying to identify and highlight the latent factors that weighed on it.

The day before the accident, the aircraft had been used to operate five flights on routes between Djerba, Tunis and Catania, and the same commander involved in the event had noticed and reported the failure of the Fuel Quantity Indicator (FQI). On the evening of that day the instrument was replaced with a working one which, however, had a different Part Number: although identical in design, it was, as subsequent investigations established, an indicator with a different measuring capacity, suitable for another type of aircraft, the ATR42, which carries less fuel. The technician in charge of the replacement consulted the Illustrated Part Catalogue (IPC) and identified three alternative Part Numbers that could be mounted in place of the original. He then searched the company database and found a part with a different numerical designation which, according to the computer system in use, was suitable for installation on the ATR72. The information was wrong, but the technician, trusting what he read on the terminal, left the instrument to a colleague who, on the next shift, performed the installation following the operational tasks listed in the Job Instruction Card, instructions which did not include a verification of the instrument's measurements but only a test of the display lights. Nor did he check the real applicability of the instrument by consulting the IPC.
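The rule the technicians skipped is, in essence, a simple membership check: a Part Number may be installed only if the IPC lists it as applicable to that aircraft type, and anything not in the catalogue must be escalated to the technical department rather than cleared against a secondary database. A minimal sketch of such a gate, with purely hypothetical part numbers, could be:

# Hypothetical IPC extract: aircraft type -> Part Numbers approved for the FQI slot.
IPC_APPLICABILITY = {
    "ATR72": {"PN-ORIGINAL", "PN-ALT-1", "PN-ALT-2", "PN-ALT-3"},
    "ATR42": {"PN-ATR42-FQI"},
}

def may_install(aircraft_type: str, part_number: str) -> bool:
    """A part may be installed only if the IPC lists it for this aircraft type.

    Any part not in the catalogue must be reported to the technical
    department, never cleared against a secondary inventory database.
    """
    return part_number in IPC_APPLICABILITY.get(aircraft_type, set())

assert may_install("ATR72", "PN-ALT-1")
assert not may_install("ATR72", "PN-ATR42-FQI")  # the substitution that occurred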

The next day the plane was due to fly from Tunis to Bari and from Bari to Djerba, and all the necessary refuelling operations were carried out. The crew of the first flight informed the Flight Dispatcher (FD) of the amount of fuel required for the first leg, and exactly 1400 kg of fuel was loaded. The refuelling transaction then followed, involving a tanker truck and two technicians. The supply was accomplished by setting the amount of block fuel needed on an electronic instrument: the line mechanic input 3800 kg on the panel, but in reality only 465 kg was loaded, compared with the 700 kg needed, because the FQI showed the desired amount on the display. The system was automatic: when the FQI detected the quantity that had been set, it closed the valves and stopped the supply of fuel.
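This automatic cut-off logic explains why the shortfall went unnoticed: the valves close when the gauge, not the tank, reaches the target. The sketch below reproduces that loop under an assumed, purely illustrative over-reading factor (the report's figures are the 3800 kg panel setting and the roughly 465 kg actually uplifted; the factor and initial quantity here are not taken from the report):

def refuel_to_target(initial_kg: float, target_kg: float,
                     gauge_factor: float, step_kg: float = 5.0) -> float:
    """Pump fuel until the *indicated* quantity reaches the block-fuel target.

    The indication is modelled as actual * gauge_factor; gauge_factor > 1
    stands for an FQI that over-reads (an ATR42 indicator on an ATR72).
    The valves close on the indication, not on the real tank contents.
    """
    actual_kg = initial_kg
    while actual_kg * gauge_factor < target_kg:
        actual_kg += step_kg              # fuel actually flowing in
    return actual_kg - initial_kg         # true uplift

# Hypothetical numbers: with a strong over-read the pump stops after a
# few hundred kg even though the panel was set to 3800 kg.
uplift = refuel_to_target(initial_kg=100.0, target_kg=3800.0, gauge_factor=6.0)
print(f"panel set to 3800 kg, true uplift only ~{uplift:.0f} kg")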

Once completed, the refuelling document was delivered by the mechanics to the commander for inclusion in the documentation on board, but no one noticed the discrepancy. In addition, the maintenance manual procedure did not provide for an inspection of the actual amount of fuel by means of the drip stick. Before carrying out the mission, the Commander verified the flight documentation, assisted by the FD, who reported a previous refuelling the night before, an operation that would bring the fuel level to 3100 kg; but the receipt for this activity could not be found. The FD, however, told the Commander that the receipt had probably been left somewhere by the previous team assigned to the supply and that he intended to produce it for the return flight, although he was not sure of this claim. The pilot nevertheless decided to fly, trusting the information received. In fact, no one noticed that the quantity of fuel in the tanks was considerably less than it should have been.

The plane took off normally and landed in Bari at 24:05. None of the crew members paid attention to the fuel consumption indicated on the Fuel Used (FU) indicator. In Bari another supply was made, according to the quantity estimated for the next leg. Here too, no one noticed that the amount uplifted was less than actually required, because the FQI reported a different value that nonetheless matched the figure calculated for the second flight. The operational flight plan, in which times and fuel consumption are transcribed, was not compiled either, and fuel consumption was not checked at the different points of the flight: this was a widespread attitude in the airline. It is believed that the Performance Record (PR), in which uplifts and fuel consumption are transcribed, was not compiled either. Subsequently, boarding operations were carried out: thirty-five passengers, together with the pilot, the co-pilot and two other members of the crew. At 12:32 the aircraft received clearance from the control tower and took off. During the first phase of the flight, at a certain altitude, the captain got in contact with Rome control, changing radio frequency, and exchanged some information related to the mission.

At 13:21:36 the pilot of the ATR asked Rome for permission to descend to a lower altitude for technical problems, without specifying their nature, but received a negative response because of heavy traffic. The right engine had shut down. Just 100 seconds later the left engine also shut down, and at this point permission was requested to make an emergency landing at the Punta Raisi airport of Palermo. The two engines had stopped because the aeroplane had run out of fuel. The commander transmitted a first MAYDAY, but did not give the order to carry out the procedure provided for a Both Engines Flame Out, which for this type of aircraft requires feathering the propellers in order to reduce friction with the air and allow the aircraft to glide over longer distances. From Rome, meanwhile, clearance was given to land in Palermo, but radio communications appeared to be unclear, and only at 13:24:19 was the loss of both engines reported and a second MAYDAY transmitted. Rome, rather than managing radar vectoring to the airport of Palermo, invited the Commander to make contact with the control tower at Punta Raisi, because at that distance it was difficult to give assistance via radar, and called Palermo, asking it to get in touch with the aircraft. During communications between Palermo and the ATR there were misunderstandings, due to the English language, which delayed awareness of what was happening. The Commander asked three times for information regarding the distance from the nearest airport where he could land, information eventually given to him by another plane in flight, because the controller in Palermo still did not understand the ATR's requests. At 13:33:53 Palermo began to understand what was happening and informed the Commander and the crew, who were at an altitude of about 4000 feet and at a distance of 20 NM from Punta Raisi. The pilot ordered the ditching procedure to be performed and spotted two boats towards which to direct the aircraft, in order to speed up the rescue once at sea. He also indicated his current altitude of 2200 feet and his intention to ditch near the boats. At 13:37 the ATR72 crashed into the sea; the plane broke into three pieces, and one member of the crew and fifteen passengers were killed.

4 Application of the HFACS model to the case study.

The accident in question, as with the majority of aircraft accidents, was determined by a series of events that led to the final ditching. The primary cause of the disaster was the erroneous substitution of the FQI due to a human error, but the event is also characterized by active and latent factors that prepared the ground for the accident. We can identify these factors using the Human Factors Analysis and Classification System (Douglas A. Wiegmann; Scott A. Shappell), which, as explained above, allows us to identify the various flaws of a system at its different organizational levels.


Fig. 1. Human Factors Analysis and Classification System Model (Douglas A. Wiegmann; Scott A. Shappell).


4.1 Step 1: Unsafe Acts.

Fig. 2. The Unsafe Acts Analysis.

At the level of unsafe acts we can identify several violations classified as routine. The technician responsible for the replacement of the fuel gauge in fact performed a wrong procedure: he did not verify the actual applicability of the part. He made poor use of the documents in his possession; he checked the Illustrated Parts Catalogue, but ignored a fundamental rule that forbids any aviation operator to install a Part Number that is not in the catalogue. In such cases he should have reported the problem to the relevant technical department. He committed a situational, rule-based violation: he omitted to apply a rule, acting in good faith and influenced by the belief that the part was fit for service, because the database he consulted reported an error. The refuelling operator, too, twice omitted to check the amount of fuel entered as the block fuel demand. He compiled the document but did not notice, and paid no attention to, the difference in fuel; rather than a violation he committed an error, a slip due to the frequent and routine nature of the activity. Both the Commander and the Flight Dispatcher committed violations, since they did not establish the real reasons for the absence of the document relating to the supply up to 3100 kg. If the FD had investigated in depth, perhaps someone would have noticed the inconsistencies. The FD provided ambiguous, knowledge-based information, because he found himself in an unfamiliar and rare situation: he suggested that the document was somewhere else, when in reality he was not aware of its location. During the Tunis - Bari leg nobody paid attention to the Fuel Used indicator, a skill-based error due to a lack of attention by the crew involved in flight operations. At the level of unsafe acts, errors caused by incorrect decisions were also committed. The Commander, when the first engine shut down, did not declare an emergency, and occupied himself with asking the co-pilot to understand the reason for the shutdown; only after the subsequent flame-out of the second engine did he inform Rome of the problems. He also did not give the order to perform the prescribed emergency procedure: given the particular situation in which he and the crew found themselves, he had not lost situational awareness, but he had to manage an event of considerable complexity, with too little information from the on-board instruments and from the control tower of Palermo. The whole sequence of errors and violations in the dynamics of the accident constitutes the active factors, preceded by a number of latent factors, which we can bring out through the other stages of the analysis, starting from the level of "preconditions for unsafe acts".

4.2 Step 2: Precondition for Unsafe Acts.

Fig. 3. The Precondition for Unsafe Acts Analysis.

We have already pointed out that the main cause of this accident was the installation of the wrong FQI, different from the original. This error is also to be classified in the category of inadequate technology of the system. The manufacturer had designed this instrument, intended for the ATR42, as almost identical in design to the one installed on the ATR72, never considering that this similarity could cause confusion. In aeronautics every component, device and instrument must be designed with features that prevent it from being used in the wrong place: it is said that these parts must be "fool-proof", obviously without taking anything away from people's intelligence. This category also comprises the computerized spare-parts management system in use at the airline, which presented unverified applicability data for the parts: the consultation of the database was instrumental in the decision to install the fuel gauge that was found. The ANSV investigations also identified the absence of a "Low Fuel" warning device among the cabin instrumentation, a shortcoming attributable to the manufacturer of this aircraft: had such a tool been present, perhaps the Commander or a member of the crew would have noticed the actual amount of fuel in the tanks. The National Agency for the Safety of Flight also established, during the investigation, that the maintenance staff had not received formal training on the proper use of the inventory-management system, a factor that falls into the category of personnel preparation. There were also particular problems of communication between the crew and the controllers of Rome and Palermo, factors that influenced the coordination of the activities foreseen in case of emergency. After the first engine shut down, Rome did not receive detailed information about the real problems and did not grant the authorization to descend. Radio interference then prevented the controller in Rome from immediately understanding that both engines were out; vectoring towards the Punta Raisi airport was not accomplished, but the aircraft was nonetheless put in contact with Palermo. Palermo took too long to understand the gravity of what had happened, because of misunderstandings due to the English language. All this delayed the receipt of the relevant information from the Commander and crew, who decided to ditch because the aircraft was too far from Palermo. The procedure for feathering the propellers, moreover, was not performed because it was never ordered, owing to the understandable concern of the crew, who were paying attention above all to understanding the distance from the landing site and were engaged in attempts to restart the engines. All this information was reconstructed by listening to the conversations recorded on the Cockpit Voice Recorder and by analyzing the data downloaded from the Flight Data Recorder recovered after the event. Other abnormal behaviours are to be found at the third level of the HFACS model: "unsafe supervision".

4.3 Step 3: Unsafe Supervisions.

Fig. 4. The Unsafe Supervisions Analysis.

The improper action that we can place under this heading is a procedural violation committed by a supervisor, in this case the Commander of the crew. In checking the documentation on board, he noticed the missing supply receipt, but decided to take off without investigating. Had he done so, perhaps the anomaly would have emerged and the accident could have been avoided. Finally, we can also identify other latent factors in the system at the level of "organizational influences".


4.4 Step 4: Organizational Influences.

Fig. 5. The Organizational Influences Analysis.

As we have already had occasion to specify, flight crews were accustomed to not filling in the flight plans or the Performance Record; these operations were required, but the requirement was found to have been violated throughout Tuninter, the airline that owned the aircraft. There was also no system for monitoring flight data (Flight Data Monitoring), which is not mandatory but only recommended by ICAO "Annex 6". These habitual behaviours stem from an organizational climate into which also fit the poor training of the staff in the use of the database and, as ANSV was able to establish, a standard of maintenance and organization inadequate for the proper management of the aircraft. The airline, in fact, used a Maintenance Operations Manual that had not been approved by the Directorate General for Civil Aviation (DGAC), as well as a database whose data had not been entered thoroughly. In the category of organizational processes lie the use of an inadequate Job Instruction Card, the merely visual and manual control of the computer system for the management of spare parts, and the absence of a safety management and quality assurance system. The surveys conducted by ANSV showed that the Job Instruction Card did not provide for a cross-check between the reading of the replaced fuel-quantity instrument and the amount shown on the documentation on board, nor for verification by means of the graduated rods placed under the wings. A Safety Management System was also lacking, required at the time neither by the national legislation of the operator nor by international standards, as was a Quality Assurance System, still in the implementation phase at the time of the event: there was only an Inspection System, envisaged in ICAO "Annex 6", which included quality checks by auditors and inspectors. The analysis carried out shows that the accident in question is a typical accident involving persons in a complex organizational context, faced with numerous elements that turned the situation into a tragedy. As Reason teaches, the active failures committed by front-line operators were generated by something that has to be sought upstream in the system. It is proved, in fact, that a whole series of wrong behaviours, procedures, unsuitable tools and deficiencies on the part of executive management formed the basis on which the latent factors, inherent and dormant in the organization, presented themselves, paving the way to a "trajectory of opportunity" leading to the accident. With the use of the HFACS model it was possible to identify and classify these factors and to understand what actually happened. This model, in fact, allows us to identify the areas where corrective actions are needed in order to obtain a system capable of ensuring safety and the prevention of technological accidents.
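By way of recap, and reusing the small taxonomy sketched at the end of Section 1, the classification carried out in this section could be recorded as follows (descriptions abridged from the analysis above; this is an illustration, not an excerpt from the ANSV report):

findings = [
    Finding(Level.UNSAFE_ACTS, "routine violation",
            "FQI replaced without verifying its applicability in the IPC"),
    Finding(Level.PRECONDITIONS_FOR_UNSAFE_ACTS, "environmental/technological factors",
            "ATR42 gauge nearly identical in design to the ATR72 one; parts database in error"),
    Finding(Level.UNSAFE_SUPERVISION, "supervisory violations",
            "Commander took off despite the missing refuelling receipt"),
    Finding(Level.ORGANIZATIONAL_INFLUENCES, "organizational process",
            "Job Instruction Card without fuel cross-checks; no Safety Management System"),
]

for f in findings:
    print(f"[{f.level.name}] {f.category}: {f.description}")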

Conclusions.

In this work we have tried to follow a path whose objective is to present and outline the structure of Human Factors, bringing the reader to the basic concepts of the discipline, but also providing the elements necessary for understanding those particular errors, with the aim of preventing technological accidents. We do not claim to be exhaustive, because the study of Human Factors is enriched over time with tools and knowledge taken from other disciplines, experiences, methods of investigation and technologies that can pinpoint the probability of error. Today the study of Human Factors includes medical and physiological research and surveys, and the improvement of working conditions, with particular attention to environments, instruments, procedures and training. Human Factors tries to use and disseminate techniques of risk management and risk assessment, and to intervene in the dynamics of communication and group work, with a specific interest in feedback, from which organizations can benefit in a bottom-up perspective. It is also ergonomics and cognitive ergonomics, mainly engaged in designing and perfecting the man-machine interface; it is technology, it is engineering, it is bio-mechanics. But as we have seen, it is primarily the human being, the main element around which the skeleton of this discipline is built: the particle that interacts, and must relate, with a multitude of factors characterizing both the person and the habitat in which the person is located, namely organizations. We have also used the HFACS model to analyze the case of a flight, and we arrived at the following conclusions. This model of analysis can be used to understand the causes of an accident and to improve flight safety, but it is a descriptive model and not a prescriptive one: it is a reactive safety model, because an accident can be analyzed only after it has happened, not before. It is not a proactive safety model, but it is nonetheless a valuable investigative tool. Many strides have been made since its introduction in aviation and, for this reason, it would be appropriate to use it also in other contexts that involve organization and the mix of technological and human components, such as marine disasters and rail accidents (e.g. the Costa Concordia, or the Formula 1 Bianchi accident), and in all those situations where the Reason model is already applied.


Bibliography

AERONAUTICA MILITARE, Il Fattore Umano, a cura del Col. Pil. Enrico Gavettini, Istituto Superiore per la Sicurezza del Volo, Roma, 2010.
BONAZZI GIUSEPPE, Come studiare le organizzazioni, Il Mulino, Bologna, 2006.
BOUDON RAYMOND, FILLIEULE RENAUD, I metodi in sociologia, Il Mulino, Bologna, 2005.
CAA, Aviation Maintenance Human Factors (JAA, JAR 145), CAP 716, Documedia Solutions Ltd, 2002.
CATINO MAURIZIO, Da Chernobyl a Linate. Incidenti tecnologici o errori organizzativi?, Bruno Mondadori, Milano, 2006.
CATINO MAURIZIO, Capire le organizzazioni, Il Mulino, Bologna, 2012.
CHIALASTRI ANTONIO, Human Factor. Sicurezza e Errore Umano, IBN Editore, Roma, 2011.
COOLEY CHARLES HORTON, Il gruppo primario. I processi comunicativi, a cura di Raffaele Rauty, Kurumuny, Lecce, 2012.
FAA, Human Factors Guide for Aviation Maintenance and Inspection, 2002.
FAA SYSTEM SAFETY HANDBOOK, Human Factors Principles & Practices, August 2, 2000.
FLIN RHONA, O'CONNOR PAUL, CRICHTON MARGARET, Il Front-Line della Sicurezza. Guida alle Non-Technical Skill, Hirelia Edizioni, Milano, 2011.
GOFFMAN ERVING, La vita quotidiana come rappresentazione, Il Mulino, Bologna, 1997.
HACHEN JR. DAVID S., La sociologia in azione. Come leggere i fenomeni sociali, Carocci Editore, Roma, 2008.
ICAO, Doc 9859, Safety Management Manual, Second Edition, 2008.
KOTTAK CONRAD PHILLIP, Antropologia culturale, McGraw-Hill, Milano, 2008.
MADGE JOHN, Lo sviluppo dei metodi di ricerca empirica in sociologia, Il Mulino, Bologna, 2011.
MEAD GEORGE HERBERT, Mente, sé e società, Giunti Editore, Firenze, 2010.
NASA-AMES RESEARCH CENTER, Evaluating behaviorally oriented aviation maintenance resource management (MRM) training and programs: methods, results, and conclusions, Moffett Field, 2003.
PERRONE LUIGI, Da straniero a clandestino. Lo straniero nel pensiero sociologico occidentale, Liguori Editore, Napoli, 2005.
PIRANDELLO LUIGI, Uno, Nessuno e Centomila, Einaudi Editore, Torino, 1994.
REASON JAMES, The Human Contribution. Errori, Incidenti e Recuperi Eroici, Hirelia Edizioni, Milano, 2010.
RUBINELLI SARA, CAMERINI LUCA, SCHULZ PETER J., Comunicazione e salute, Apogeo, Milano, 2010.
SCHEIN EDGAR, Culture d'impresa. Come affrontare con successo le transizioni e i cambiamenti organizzativi, Raffaello Cortina Editore, 2000.
SCICUTELLA MARIO, La gestione d'impresa, Cacucci Editore, Bari, 2011.
SCIOLLA LOREDANA, Sociologia dei processi culturali, Il Mulino, Bologna, 2010.
SMELSER NEIL J., Manuale di Sociologia, Il Mulino, Bologna, 2007.
SMITH ADAM, La ricchezza delle Nazioni, Editori Riuniti, Roma, 2006.
VAUGHAN DIANE, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, The University of Chicago Press, Chicago, 1996.
WEICK KARL E., ROBERTS KARLENE H., Collective mind in organizations: Heedful interrelating on flight decks, in Administrative Science Quarterly, 1993.

Papers

AERONAUTICA MILITARE, dispensa del 45° corso qualificazione sicurezza volo, Sicurezza del volo e dinamiche di gruppo (dal gruppo al team), Roma, 2007.
AERONAUTICA MILITARE, Sicurezza del Volo, Periodico nr. 263, settembre/ottobre 2007.
AERONAUTICA MILITARE, Sicurezza del Volo, Periodico nr. 267, maggio/giugno 2008.
AERONAUTICA MILITARE, Sicurezza del Volo, Periodico nr. 284, marzo/aprile 2011.
ANSV, Agenzia Nazionale per la Sicurezza del Volo, Relazione Finale d'Inchiesta, Incidente occorso all'aeromobile ATR 72, Marche TS-LBB, ammaraggio al largo di Capo Gallo (Palermo), 6 agosto 2005.
ENAV S.p.A., Italian Company for Air Navigation Services, ANS Training, Legislazione Aeronautica, 2011.
FONDAZIONE 8 OTTOBRE 2001, Gli incidenti aerei si possono evitare? Compiti e responsabilità dei governi, Milano, 2005.

Websites

www.ansv.it
www.faa.gov
www.iata.org
www.icao.int
www.enac.gov.it
www.enav.it
www.wikipedia.org/wiki/Volo_Aloha_Airlines_243