U.S. Department
of Transportation

Federal Aviation Administration

St. Louis
Flight Standards District Office

10801 Pear Tree Lane
Suite 200
St. Ann, Missouri 63074


May 1999 




Thought for the month....The way out of trouble is never as simple as the way in.

INCREMENTAL CHANGES...."Couldn't the pilot see that disaster coming?" That's a question that many of us ask when we read an account of an aircraft accident. In hindsight, it's completely clear that the situation was deteriorating and that any prudent pilot should have seen it coming and stopped the chain of events. Are pilots who have these types of mishaps inept? Do they blindly blunder into these disasters with no conscious thought?

At times it would seem that way, but the reality is that most of us do not intentionally put ourselves in situations we believe will end in disaster. How we come to be there is usually a result of incremental change. If we're flying on a clear day and we spot a huge thunderstorm with lightning and rain coming out of it, it's a pretty easy decision to change course and navigate around it. There is a clear delineation between clear skies and dark, dangerous clouds. If, on the other hand, we're flying in marginal conditions, with barely legal VFR visibility, things can get progressively worse in small bites. We divert to avoid some rain showers, and now we're no longer on our planned course. We're flying lower to maintain visual contact with the ground, so now we're off course and at low altitude. Neither of those conditions alone might cause an accident, but linked together, we hit a tall tower.

When incremental changes occur, each small change or addition isn't noticed until the whole chain is formed. At that point it may be too late to correct the situation. Problems add up exponentially; that is, one problem plus another problem doesn't equal a value of 2, it may equal 4 or 8. Landing with one engine inoperative in a multiengine airplane and landing in a crosswind that is very close to the demonstrated capability of the aircraft are two separate conditions. Each by itself should be manageable by a proficient, well-trained pilot. If both are allowed to link together, however, the sum of their effects is greater than either condition separately.

Technology is often responsible for handing us incremental changes. Almost everyone who flies aircraft with new technology installed has found themselves asking the question: "What's it doing now?" This occurs after we have entered data or programmed a piece of equipment and it gives us unexpected results. The pilot - and copilot if installed - end up staring at the black box or the instrument panel chanting that familiar incantation. The first step is deciding whether it was us or the magic that screwed up, after which we are faced with an uneasy feeling of distrust. Confidence in our ability, the equipment, or both is eroded, and this is often a link in the chain of events in a mishap.

Because of the insidious nature of incremental change, we may suddenly realize that we are well into task saturation. Little by little, things build up until we are no longer able to effectively manage the workload. At times like this, more than skill is required. Most of our flying skills were developed in routine situations. Likewise, there may be few rules that we can fall back on, so our decisions are going to require conscious thought and some sort of structured decision-making process. Unfortunately, trying to free up the necessary brain power to dedicate to that purpose may be very difficult. Attempting to shed workload after we've become overloaded doesn't work well.

Error tolerance isn't a new concept, but the term isn't normally associated with planning a flight. Error tolerance essentially means protecting a system from a single-event catastrophe. Manufacturers of aircraft realize that subtle incremental changes may be difficult to identify, and they design their machines so that a single point failure will not cause the destruction of the aircraft. This is often done through redundant systems or sacrificial components - items that are designed to fail first, protecting more critical systems. The connection to flight planning is that we need to build in error tolerance through redundancy and through sacrificial components designed to warn us that the workload is building, allowing us to take action before a catastrophic failure occurs.

An example of a sacrificial component of a flight might be a "hard deck": an indicated altitude which, if sacrificed either above or below, raises a mental flag that we are entering a potentially high-workload situation. Another that I recall from my early training involved checkpoints on a cross-country flight. If I hadn't located a checkpoint by three minutes past my ETA, I was to turn around and fly back to the last place where I positively knew my location. Once there, I could sort out why the error had occurred and continue if conditions allowed, or head back to my departure point. The purpose was to spread out the workload. Don't waste time and fuel trying to find a checkpoint and become hopelessly lost; sacrifice it and concentrate only on going back to the last known point.

Most flights do not go as planned. Flights that are not planned at all can go worse still. Big changes alert us immediately and focus our attention on attending to the problem. Incremental changes pile up slowly and may not be realized until it's too late, and we learn that the way out of trouble is never as simple as the way in.

Upcoming Events:

May 15

St. Clair, MO. City Hall, #1 Paul Parks Drive, 12 - 2 PM
Giving and Getting a Good Flight Review & Flight Services
Open House at the airport 15th & 16th
Food and activities both days

June 5

St. Louis Downtown Parks Airport
Users Meeting & Operations at Towered Airports
Parks College, Hangar #8, 1 to 4 PM

July 3

Poplar Bluff Airport
Poplar Bluff, MO
Flight Discipline



Safety Program Manager

1-800-322-8876 x 4835


"May Day"