Disuse of Automation
Disuse and the death of a railroad legend. Disuse is a recently coined term, but the underutilization of automation has a long history. Automation disuse played a pivotal but often overlooked role in the death of the world’s most famous railroad man. Here is the story of Casey Jones and the wreck of Engine 382 (Gurner, 2000).
During the early morning hours of April 30, 1900, Engine 382 pulled into Memphis’ Poplar Street Station. The train was over an hour late. To compound the problem, the engineer scheduled to take over in Memphis was sick. Although they had just finished an earlier run, Casey Jones and his fireman, Sim Webb, volunteered to take the six-car passenger train on to Canton, MS. Jones and Webb made an interesting partnership. Casey was in his thirties, a very tall and popular Irishman with a reputation as a “fast roller.” Jones was an experienced engineer, but his record was not spotless: since becoming an engineer in 1890, he had been suspended nine times for infractions involving the operation of his train. At 26, Sim was a Black graduate of Tulane and a former schoolteacher. He had taken the teaching job after his predecessor was killed by a mob; the insensitive instructor had infuriated the local citizens by asking for a raise. After the dangers of pedagogy, serving as Casey Jones’ fireman must have seemed like a tranquil occupation.

With a powerful engine and a light load, Casey was sure that he and Sim could make up the lost time. Jones and Webb put the steam to Engine 382 and moved through the southern countryside at record pace. As they sped along, the setting for disaster was unfolding before them. Three trains in the Vaughn yard needed to be transferred to a sidetrack to allow Engine 382 to pass. As one train was being moved, an air hose broke, stranding four cars on the main line. The workers were scurrying to replace the air hose when Engine 382 rounded the curve heading into Vaughn. Casey yelled for Sim to jump, and the fireman leapt to safety. Casey reversed the engine, opened the sanders, and put the brake in emergency. At 0352, Engine 382 smashed into the cars on the main line, left the track, and plowed into an embankment. No other cars left the rails, and Casey was the only fatality.


After the wreck, Wallace Saunders, a Black engine wiper in Canton, composed a ballad about his friend. Workers along the Illinois Central Railroad first sang the song; later it became a fixture of vaudeville acts. Eventually, Casey’s song and fame spread around the world. The Casey Jones home in Jackson, TN, is now a railroad museum and the center of a lucrative tourist industry. The Jones story has maintained lasting interest, in part because of the controversies surrounding Casey’s death.
The cause of the wreck brings out the most emotional arguments. Some people, including Casey’s wife, believed that the flagman did not signal Jones to stop. Also, Engine 382 was slightly behind schedule when it arrived at Vaughn; why, then, did the yard workers wait until the last possible opportunity to move the cars to the sidetrack? To this day, many people believe that the Vaughn railroad workers constructed a false story to protect their jobs. If so, they succeeded. The Illinois Central’s investigation placed the blame solely on Casey. More objective eyes might deduce that the wreck was due to a concurrence of events. Had the air hose not broken, or had Engine 382 arrived a few minutes later, the tracks would have been cleared. Even so, Casey must bear some responsibility. Had he not been running at a record pace, he might have been able to bring Engine 382 to a halt, or at least lower the speed at impact.
Why Casey did not jump will remain forever unresolved. The brake functioned automatically once the engineer initiated it; after it was applied, neither the engineer nor the fireman was needed to slow the train. The automated braking system was a safety feature of which Casey did not take advantage. He could have set the brake and jumped from the cab, possibly saving his life. Instead, Casey needlessly rode Engine 382 to its destruction. When they pulled him from the cab, he was clutching the brake as if he were trying to control it manually. The legend of Casey Jones grew from a combination of music, misfortune, heroism, and disuse of automation.
Crisis at Reactor #2. At 0400 on March 28, 1979, an electromagnetically operated valve was routinely activated to release pressure on Reactor #2 at Three Mile Island, PA (Morrissey, 1979). The valve stuck in the open position, but the gauge on the operators’ control panel incorrectly indicated that it had closed. The operators quite reasonably assumed that the core was immersed in its coolant and that the temperature in the reactor was under control. Meanwhile, radioactive steam and water were forced through the open valve, letting a significant amount of coolant escape. As the temperature rose in the reactor, the automated emergency core cooling system (ECCS) engaged. Presented with conflicting information from their instruments, the operators were faced with a critical AUD. They could either allow the ECCS to carry out its safety functions or override the automatic systems. According to a report by the Nuclear Regulatory Commission, the operators were overly concerned with keeping the coolant from overfilling the reactor. They cut back the ECCS, and the temperature in the core soared to 2000 degrees F. Disuse of automation at Reactor #2 turned a minor mishap into a potential disaster.
Fortunately, the accident at Three Mile Island did not have calamitous health or environmental consequences. Nevertheless, fears of future malfunctions led the public to demand that greater restrictions and safeguards be placed on the nuclear industry. As a result, the expansion of nuclear energy and the construction of nuclear power plants in the US were severely curtailed.
Equations as decision aids. Not all decision aids are electronic devices equipped with sophisticated sensors. Some of the most successful decision aids are statistical formulas that simulate human decision making. In his very controversial book, Clinical Versus Statistical Prediction, Meehl (1954) contended that many judgments are best made statistically, not intuitively. Since Meehl issued his challenge, the question of intuitive versus statistical decisions has extended to many topics, including studies of psychiatric diagnoses (Goldberg, 1965), survival times of terminally ill patients (Einhorn, 1972), treatment outcomes (Barron, 1953) and violent behavior (Werner, Rose, & Yesavage, 1983). A large proportion of these studies (e.g., Dawes, Faust, & Meehl, 1989; Kleinmuntz, 1990) have supported Meehl’s position, finding that statistical predictions are more accurate than subjective judgments.

An investigation by Beck, Palmer, Lindau, and Carpenter (1997) illustrates the development of actuarial equations and how these formulas can be used as decision aids. Their purpose was to provide North Carolina prison officials with an equation that would assist in determining whether violent criminals should be moved from maximum security to a less restrictive form of incarceration. This is an important decision, because promotion usually leads to early release. The database consisted of all promotion decisions involving violent criminals made in North Carolina over the preceding three years. From that sample, an equation was computed in which a number of predictors (e.g., prior arrests, age of first arrest, personality test scores) were correlated with the promotion decision (yes/no). Results revealed that the combination of predictors yielded a reliable estimate of the likelihood of promotion.


Before each new decision, the prisoner’s data were to be entered into the equation and the likelihood of promotion calculated. For example, the equation might indicate that the probability of promotion for a particular prisoner was .12. Most decisions of the review board were expected to coincide with the results of the equation. However, when the judgment of the review board and the results of the equation conflicted, the prisoner’s promotion application was to receive further scrutiny. That was the plan, but the equation was soon discarded. Like many decision makers, the review board was convinced of the accuracy of its own subjective opinions. Granted, the developer of a decision aid may be a particularly biased judge of its utility; even so, ignoring this and many other actuarial equations is an instance of automation disuse.
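To make the procedure concrete, the sketch below shows how an actuarial aid of this kind might be implemented. Beck et al. (1997) do not report their coefficients here, so the logistic form, the predictor weights, and the .50 decision threshold are illustrative assumptions rather than the fitted North Carolina equation.

```python
import math

# Hypothetical weights for illustration only; a real equation would be
# estimated from the three-year database of past promotion decisions.
WEIGHTS = {
    "intercept": -1.5,
    "prior_arrests": -0.20,      # more prior arrests -> lower probability
    "age_first_arrest": 0.05,    # later first arrest -> higher probability
    "personality_score": 0.03,   # e.g., a standardized test score
}

def promotion_probability(prior_arrests, age_first_arrest, personality_score):
    """Combine the predictors into a predicted probability of promotion."""
    z = (WEIGHTS["intercept"]
         + WEIGHTS["prior_arrests"] * prior_arrests
         + WEIGHTS["age_first_arrest"] * age_first_arrest
         + WEIGHTS["personality_score"] * personality_score)
    return 1.0 / (1.0 + math.exp(-z))   # logistic link maps z to (0, 1)

# Flag cases in which the board's tentative judgment conflicts with the
# equation, so the promotion application receives further scrutiny.
p = promotion_probability(prior_arrests=6, age_first_arrest=17,
                          personality_score=40.0)
board_says_promote = True
equation_says_promote = p >= 0.50
if equation_says_promote != board_says_promote:
    print(f"Equation estimate is {p:.2f}; flag the case for further review.")
```

In this hypothetical case the equation yields a probability of about .34 while the board leans toward promotion, so the application would be flagged for additional review; as the text notes, the board abandoned the aid before such checks became routine.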


Tragedy at Martha’s Vineyard. John F. Kennedy Jr. obtained a private pilot’s license in April 1998, which permitted him to fly under good visibility. By July 16, 1999, he had gained more than 300 hr of flying time and was working on, but had not yet finished, an instrument rating. That evening, as his plane approached Martha’s Vineyard, no horizon was visible over the open water. A haze also obscured Kennedy’s flight path. Without spatial guideposts, pilots frequently have difficulty determining whether the plane is ascending, descending, or turning. According to the National Transportation Safety Board (2000), Kennedy became disoriented and lost control of the aircraft. The plane, carrying him, his wife, and his sister-in-law, hit the water nose-down, descending at a rate of over 1400 m/min.
Pilots experiencing disorientation are taught to disregard their senses and concentrate on the readings of their navigation instruments. Had Kennedy been a more experienced pilot, he likely would have centered his attention on the instruments and regained control of the situation. A lack of instrument training led to automation disuse and tragedy.
Misuse of Automation
Pilots and overreliance. Deciding between automated and manual control is often a question of balance. Operators are vulnerable to disuse if they discount the benefits of automation. On the other hand, persons who place too much dependence on automation are susceptible to misuse. Parasuraman and Riley (1997) describe the case of a small plane attempting to land during a nighttime snowstorm. Apparently, the pilot lacked confidence in his ability to control the aircraft manually. Relying excessively on the automatic pilot, he failed to monitor his airspeed and crashed short of the runway.

More experienced pilots are not immune to automation misuse. In October 1998, a Boeing 737 heading for Denver received instructions to begin a descent to 19,000 ft (“Automation Errors,” 1999). The captain attempted to use the flight management system to reduce the plane’s altitude, but the aircraft did not react in time. After the controller once more ordered the plane to descend, the captain took manual control of the aircraft. The 737 and another plane had a close call. Later, the captain attributed his mistake to an overreliance on automation.


A second case of automation misuse occurred the same month as an ATR-72 commuter airliner approached Dallas-Fort Worth Airport at 5000 ft (“Automation Errors,” 1999). The pilots’ traffic-collision avoidance radar advised them to climb to avoid an oncoming aircraft. When they put the plane in a steep climb, the controller immediately told them, “Don’t climb! There’s crossing traffic above you at 6000 ft.” Relying on their automation, the pilots continued the climb, and the ATR-72 emerged from the clouds only 550 ft from an MD-80 jet.
Misuse of automation and international conflict. In 1983, Sukhoi Su-15 fighters shot down a Korean Airlines passenger jet over Soviet airspace, killing 269 persons. The US government accused the Soviets of callously killing civilians, and the Soviets contended that the flight was on a spy mission (Church, 1983). It now appears (“KAL 007 - ICAO conclusions,” 2000) that incorrect use of the inertial navigation system caused the aircraft to fly off course. Misuse of automation may also have contributed to the fratricide of two Army Blackhawk helicopters over the No-Fly Zone in 1994. In that incident, two American F-15 pilots shot down the Blackhawks, killing all 26 people on board. Investigators (Lacayo, 1994; Thompson, 1995) found that the Blackhawks had set their “friend-or-foe” system to a frequency of 42. The F-15s’ friend-or-foe systems were set to 52, so their computers did not recognize the Blackhawks as friends. The helicopter pilots sent the wrong signal because they were not told to change to frequency 52 when passing from Turkish to Iraqi airspace.
The failure of the Air Force to furnish the Blackhawk pilots with both frequencies certainly contributed to the tragedy. However, that mistake was not the sole causal factor; the fratricide was due to multiple breakdowns. An Airborne Warning And Control System (AWACS) crew could have prevented the attack had they alerted the F-15 pilots that Blackhawks were in the area. The fighter pilots must also assume responsibility. After making visual contact, they mistook the Blackhawks for Iraqi helicopters. Furthermore, the F-15 pilots did not follow the rules of engagement.

Ironically, automated safeguards may have set the stage for the fratricide that they were designed to prevent. Crew complacency is a potentially negative effect of automation, resulting in an overreliance on machines (Parasuraman, Molloy, & Singh, 1993; Singh, Molloy, & Parasuraman, 1993a; Thackray & Touchstone, 1989). The F-15 pilots may have put too much faith in their automated devices to distinguish American from enemy aircraft. When they made visual contact, the fact that they had not received a friendly signal or a warning from the AWACS may have predisposed them to “see” the helicopters as Iraqi and to fire their missiles.


Response Errors


Disaster at Sea. Some accidents are not due to misuse or disuse but result from the inability of operators to execute the responses required by manual or automated controls. One such response error occurred on May 23, 1939, onboard the USS Squalus. The Squalus, the Navy’s newest and most advanced submarine, was taking a test dive about five miles southeast of the Isles of Shoals, NH. As the Squalus submerged, all operations appeared to be normal. Then, at a depth of 50 ft, water began to rush through the submarine. Before the sea completely filled the sub, sailors sealed off the three forward compartments. There seemed little hope for the men as the submarine came to rest under 243 ft of water. The Terrible Hours (1999) by Peter Maas chronicles the astonishing rescue of 33 crewmembers by “Swede” Momsen and his divers.
A Court of Inquiry concluded that a malfunction of the engine-induction valve led to the sinking of the submarine. But how had that happened? No conclusive evidence was ever found. Momsen believed that the main induction valve was closed at the start of the dive and inadvertently opened after the Squalus submerged. In his view, the levers for the hull valves were positioned close together in a potentially confusing fashion. Poor system design may have led a seaman to pull the wrong lever.
Squalus Revisited. It seems incredible that a sailor could inadvertently pull the wrong switch and sink a submarine. Nonetheless, similar accidents continue to occur. On January 12, 1997, a freight train consisting of three locomotives and 75 hopper cars was moving through Kelso, CA (“LAX 97 FR 004,” 2000). The engineer unintentionally activated the multiple-unit engine shutdown switch, cutting off the diesels and the train’s dynamic braking power. The train rapidly accelerated beyond the 20-mph speed limit and left the tracks at 72 mph. Sixty-eight cars derailed, but thankfully there were no injuries.

The crash of a China Airlines A-300 in April 1994 (“Aircraft Accident,” 2001) was also due to operator error. As the Airbus was making its approach to the airport in Nagoya, Japan, the first officer struck the wrong switch on the autopilot. The plane automatically accelerated and began an emergency climb. The crew tried to reduce power and bring the nose down, but the computers pushed the plane higher. Eventually, the nose reached an angle of 53 degrees. In an attempt to save the plane, the crew applied full power. Despite their efforts, airspeed dropped to 90 mph, far below the speed needed for a safe approach. The tail struck the ground short of the runway, killing 264 of the 271 persons on board.



One of the most frequently cited response errors was the crash of American Airlines Flight 965 near Cali, Colombia, in December 1995 (Phillips, 1999). The pilots accepted the controller’s offer to land on a different runway at Alfonso Bonilla Aragon International Airport. In a rush to descend, they programmed the wrong abbreviation into the flight management system. The plane changed its direction and flew into a 9000-ft mountain; 159 of the 163 persons on board were killed. Following the crash, the National Transportation Safety Board stressed that pilots should program the full names of radio beacons into their flight management systems and check those fixes against navigational coordinates.