Food Poisoning (OGHFA SE)
From SKYbrary Wiki
Content source: Flight Safety Foundation
Human Factors Aspects: Food Poisoning, Safety Culture, Judgment, Expertise
Operator's Guide to Human Factors in Aviation
The Incident as a Situational Example
You are the captain of a crew that was off duty for 48 hours before your scheduled flight back to Europe. You had dinner together, and a few hours later everyone felt sick except you. A local doctor prescribed drugs and advised your crew to see an approved flight doctor, who diagnosed severe gastroenteritis. This doctor kept the crew on flight status, following an airline recommendation.
Your sick crew takes over the four-engine airplane midway to Europe, with more than 250 passengers aboard. Due to strong headwinds, fuel reserves have to be recomputed often and the first officer frequently leaves his seat to go to the toilet or to rest in the first class section.
What is your attitude toward your first officer?
As an experienced captain having logged more than 15,000 flight hours, you find yourself practically the only pilot flying for the next five hours of flight.
Over southeast Europe, air traffic control (ATC) advises that weather is deteriorating over the destination and that a Cat III landing is expected due to heavy fog. The first officer is not qualified for such an approach, and you have only practiced it in the simulator. You contact the airline and ask for an exemption covering the first officer, which is granted contrary to government regulations. Nobody asks if all is well on board.
Approaching the airport, you are instructed to stay in the holding pattern northeast and are flying manually. The first officer returns to the flight deck and calls your attention to the low fuel level, which may require a diversion to a nearby airport, where the weather is better and will allow a Cat I instrument landing system (ILS) approach.
What would your decision be under the circumstances?
Just as you are about to decide to divert, ATC interrupts your conversation and authorizes the flight to land at the original destination airport, where the weather is improving and will allow a Cat II ILS approach.
You leave the holding pattern and start down toward 3,000 ft, extending the landing gear and flaps. ATC instructs you quite late to turn to a heading of 240 degrees when only 16 nm (30 km) from the localizer, 4 nm (7 km) less than in a normal ILS approach.
You have to rush, and your workload increases significantly because the first officer is more of a spectator due to his illness. The airplane continues the descent at 190 kt, flaps 10 degrees, and still without consistent information between the flight director and the localizer on Autopilot “B,” which is flashing red.
Furthermore, ATC informs you that one of the approach light groups is out due to an electrical failure and that he will call back. The flight engineer goes through the on-board documentation to find out if the situation is still compatible with the Cat II approach.
You switch back to Autopilot “A,” upon which the localizer is captured on the right side of the ILS. There is still no clearance from the approach controller, who is waiting for another aircraft to exit the runway. Altitude is 2,450 ft and decreasing, stress is increasing and there is no further communication from the controller.
Would you continue the final approach in view of the low fuel level?
You switch back to Autopilot “B.” The airplane continues to follow an “S” pattern, blindly trying to intercept the correct ILS bearing and glideslope.
Passing 1,000 ft and preoccupied by the low fuel level, you decide to continue the final approach. Still in the fog at 600 ft and without precise data from the ILS, an automated landing is no longer feasible, so you disengage all the automatic systems and take over manual control.
At 250 ft, the runway is still not visible, and you prepare for a go-around. At 150 ft, the airplane clears the fog, and you suddenly realize you are not aligned with the runway centerline.
What would your reaction be?
Both you and the first officer push the thrust levers to full go-around thrust.
Due to inertia, the 300-ton airplane — with a descent rate of 15 fps — is slow to react and flies over the motorway parallel to the runway at a height of only 75 ft. It then overflies buildings at less than 50 ft, creating panic among pedestrians below before regaining altitude and flying back up through the fog. The controller clears you for a second approach to the same runway, this time under Cat I rules.
The first officer offers to relieve you, but you refuse the offer and reconnect Autopilot “B.” After a perfect go-around and approach, you land on the runway.
By the time the airplane reaches the gate, dozens of complaints have been received by the airline. The chief pilot calls you to his office.
How do you feel about being summoned by your chief pilot?
You are very irritated, refuse to speak to the flight safety officer and return home. You warn your wife you might receive a call from the company related to the go-around and go to sleep. That evening, you are informed that you and your crew have been taken off flight status until further notice.
Data, Discussion and Human Factors
Several weeks later, you were demoted to first officer pending an investigation. You decided to resign, followed by the flight engineer some time later.
Two years later, a trial opened under pressure of the local news media. You were accused of having endangered the lives of your passengers and crew by your negligence, and fined. During the trial, it was reported that the airplane had flown shortly after the incident without any inspection of the faulty autopilot. Several days later, when it returned to the hub airport for maintenance, the technicians discovered that four pages were missing from the airplane’s avionics maintenance log.
The captain’s appeal, supported by the pilots’ union, was rejected. Three years after the incident, the former captain drove his car to his home village, attached a hose to the car’s exhaust pipe and took his life.
The scenario above shows the impact food poisoning can have on the conduct of a flight and the consequences that can follow.
It is important to check the quality of food and its preparation, especially in countries known for water quality problems.
After feeling sick, the crew went to see a local doctor, who prescribed drugs. He further advised the crewmembers to see an approved company flight physician, which they did.
This second doctor diagnosed severe gastroenteritis but, surprisingly, did not remove the crew from flight status.
Once in flight, only the captain was fully capable of flying, and the affected crewmembers were dehydrating rapidly.
During the flight, the possibility of diverting to an alternate en route airport was considered but not pursued.
Safety culture and blame
This incident and its sad conclusion helped trigger a change of attitude toward safety among most of those involved.
It was realized at the time that the cooperation of all was needed to improve the overall level of safety. Blaming those who reported errors was a mistake in itself. The proper attitude was to encourage people to report why an incident occurred so that the experience would be shared and help others to avoid the same mistake.
International Civil Aviation Organization (ICAO) Annex 13 states: “A systematic search for the ‘why’ is not intended to pinpoint a single cause, nor is it intended to assign blame or liability, or even to excuse human error. Searching for the ‘why’ helps identify the underlying deficiencies that might cause other incidents or other accidents to happen.”
It is human not to tell others something that could possibly result in self-incrimination. An incident/accident investigation is only effective in terms of prevention if the open cooperation of the people involved can be achieved with all the pertinent information provided, including possible mistakes or omissions. This can be achieved if the investigation is free from blame, disciplinary action or liability, all aspects of a genuine safety culture.
Prevention Strategies and Lines of Defense
To mitigate the risks associated with bad food and inadequate preparation, it is the responsibility of the crew to choose the proper restaurant when off duty. Most international hotel chains where crews stay between flights offer good quality food.
To further reduce the risk of food poisoning, airlines often recommend that crewmembers choose different menu items when eating together, just as in flight, where the captain and first officer are served different meals.
Safety culture and blame
It is often said that the main protection against accidents includes the following:
- High level of technical skills;
- Strict adherence to procedures;
- Reduction of distractions;
- Proper and timely decision making;
- Conscious and renewed situational awareness; and,
- Crew coordination and mutual backup.
However, critical errors are still made. One possible way to mitigate their consequences is to detect and recover from them before it is too late, and disseminate any findings throughout the community to avoid a repetition. This must be done in a positive way, not in order to assign blame or to find a culprit.
People do not commit errors on purpose. The following principles of error management may aid prevention:
- Recognize that we, as human beings, make mistakes.
- Apply standard operating procedures (SOPs).
- Manage available resources, increase redundancy and decrease workload.
- Make other team members aware of risks and intentions.
- Evaluate the consequences of an action before undertaking it.
A conscious safety culture enables organizations to identify and manage risk through a formalized approach that identifies issues, corrects them and ensures they stay fixed.
Formalizing risk management is imperative as we move from an after-the-fact accident investigation approach to a diagnostic and more predictive one. We also need to share what we learn.
During a Cat II ILS approach to a congested European airport after a very long flight, the captain had difficulty getting the autopilot to lock on to the ILS. He was virtually flying alone because the first officer and the flight engineer were incapacitated by food poisoning. The aircraft deviated to the right of the runway centerline. A go-around was performed, and the aircraft came close to buildings and people on the ground. The second approach and landing were uneventful.
Safety culture was also called into question by the fact the captain did not report the incident, did not want to discuss it and was made a scapegoat by the airline. Becoming involved in a public trial, the captain could not stand the pressure and later committed suicide.
The following can prevent other crews and airline managers from falling into the same traps:
- Adopt a positive organizational safety culture: prevent, detect, correct, communicate.
- Recognize that human beings do not make mistakes on purpose.
- Recognize, too, that failure to report an error is a violation that can be punished.
- Ensure that reporting of errors does not result in blame, disciplinary action or liability.
- Careful choice of restaurant and different crew meals may help prevent food poisoning.
Associated OGHFA Material
- Company Safety Culture
- Decision Making
- Decision Making Training
- Pilot judgment and expertise