Chapter 7: Software and the Challenge of Flight Control


This chapter will appear in a forthcoming book (2013), Space Shuttle Legacy: How We Did It/What We Learned, to be published by the AIAA and edited by Roger Launius, James Craig, and John Krige.


1 Charles Fishman, “They Write the Right Stuff,” Fast Company, December 1996.


2 Tarandeep Singh, “Why NASA Space Shuttle Software Never Crashes: Bug-free NASA Shuttles,” The Geeknizer, July 17, 2011.


3 James Tomayko, “Computers in Spaceflight: The NASA Experience,” NASA Contractor Report CR-182505, 1988.


4 http://www.ibiblio.org/apollo/Gemini.html


5 Arthur L. Slotkin, Doing the Impossible: George E. Mueller and the Management of NASA’s Human Spaceflight Program, Springer-Praxis, Chichester, U.K., 2012.


6 Software “discrepancy” is the common NASA terminology for a software bug or error.


7 GAO, “NASA Should Implement Independent Oversight of Software Development,” GAO/IMTEC-91-20, 1991.

8 In 1992, an NRC committee studying the NASA Shuttle software process was told that the yearly cost for the flight software development contractors was approximately $60 million. Operation of the Shuttle Avionics Integration Lab (SAIL), which is used to test the flight software, required approximately $24 million per year. This total does not include the costs for the Space Shuttle Main Engine software and other support contractors. See An Assessment of Space Shuttle Flight Software Development Processes, Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes, Aeronautics and Space Engineering Board, National Research Council, 1993.


9 User notes are provided to the astronauts to help them work around known software limitations and errors. Waivers are decisions on the part of the Shuttle program to recognize a condition, such as a known software error, as an acceptable risk. A condition that receives a waiver is thus set aside, sometimes to be fixed at a later date when time and resources become available, but is not considered sufficient cause to hold up a flight. The excessive use of waivers received attention after the Columbia loss.


10 R.M. Mattox and J.B. White, “Space Shuttle Main Engine Controller,” Report NASA-TP-1932 M-360, NASA, Nov. 1, 1981.


11 Gene D. Carlow, “Architecture of the Space Shuttle Primary Avionics Software System,” Communications of the ACM, Vol. 27, No. 9, Sept. 1984, pp. 926–936.


12 A Titan carrying a Milstar satellite was lost in 1999 when a typographical error in a load tape, which had inadvertently never been tested, led to an incorrect attitude value being used by the flight control software. See J.G. Pavlovich, Formal Report of the Investigation of the 30 April 1999 Titan IV B/Centaur/TC-14/Milstar, U.S. Air Force.


13 A time-sliced system allocates a predefined period of time to the execution of each task; a task left unfinished when its period expires is suspended, and execution moves on to the next time slice.
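
For illustration only, the following minimal sketch in C shows one way a time-sliced executive of this kind can be structured; the task names, work units, and slice size are invented for this example and are not taken from the Shuttle software.

    /* Illustrative time-sliced loop: each task gets a fixed slice of work per
       cycle; anything left unfinished is suspended and resumes on the task's
       next slice. Not actual flight software. */
    #include <stdio.h>

    #define NUM_TASKS   3
    #define SLICE_UNITS 2              /* work units allowed per slice */

    typedef struct {
        const char *name;
        int remaining;                 /* work units still to perform */
    } Task;

    int main(void) {
        Task tasks[NUM_TASKS] = { {"guidance", 5}, {"navigation", 3}, {"display", 4} };
        int pending = NUM_TASKS;
        while (pending > 0) {
            for (int i = 0; i < NUM_TASKS; i++) {
                if (tasks[i].remaining == 0)
                    continue;          /* task already complete */
                int done = tasks[i].remaining < SLICE_UNITS ? tasks[i].remaining
                                                            : SLICE_UNITS;
                tasks[i].remaining -= done;   /* run this task for one slice */
                printf("%s ran %d unit(s), %d left\n", tasks[i].name, done,
                       tasks[i].remaining);
                if (tasks[i].remaining == 0)
                    pending--;         /* task finished within this slice */
            }
        }
        return 0;
    }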


14 J.R. Garman, “Software Production Facility: Management Summary Concepts and Schedule Status,” NASA Data Systems and Analysis Directorate, Spacecraft Software Division, February 10, 1981, p. 12.


15 Compare this with the approximately 5,000,000 lines of code on commercial aircraft today and 15-20 million lines in military aircraft. The ISS has 2.4 million lines of code on board.


16 In the preface to the language specification document, the name “HAL” is described as being in honor of the Draper Laboratory’s Dr. J. Halcombe Laning.


17 HAL/S Language Specification, United Space Alliance, 2005.


18 F.H. Martin, HAL/S: The Avionics Programming System for the Shuttle, AIAA, 315, 1977.


19 The Approach and Landing Tests (ALT) were a series of taxi and flight trials of the prototype Space Shuttle Enterprise, conducted in 1977 to test the vehicle’s flight characteristics both on its own and when mated to the Shuttle Carrier Aircraft, prior to the operational debut of the Shuttle system. In January 1977, Enterprise was taken by road from the Rockwell plant at Palmdale, California, to the Dryden Flight Research Center at Edwards Air Force Base to begin the flight test phase of the program.


20 Frederick P. Brooks, Jr., The Mythical Man-Month, Addison-Wesley, 1975.


21 William A. Madden and Kyle Y. Rone, “Design, Development, Integration: Space Shuttle Primary Flight Software System,” Communications of the ACM, Vol. 27, No. 9, Sept. 1984, pp. 914–925.


22 For example, impacts to the Mission Control Center, the Launch Processing System, procedures and training, or flight design requirements.


23 J. Christopher Hickey, James B. Loveall, James K. Orr, and Andres L. Klausman, “The Legacy of Space Shuttle Flight Software,” AIAA Space 2011 Conference, Long Beach, California, Sept. 27–29, 2011.

24 John C. Knight and Nancy G. Leveson, “Experimental Evaluation of the Assumption of Independence in Multiversion Software,” IEEE Transactions on Software Engineering, Vol. SE-12, No. 1, 1986, pp. 96–109.


25 John C. Knight and Nancy G. Leveson, “A Reply to the Criticisms of the Knight and Leveson Experiment,” ACM Software Engineering Notes, January 1990.


26 Nancy Leveson, Safeware: System Safety and Computers, Addison-Wesley Publishing Company, 1996.


27 Nancy Leveson, Engineering a Safer World, MIT Press, 2012.


28 John R. Garman, “The ‘Bug’ Heard ’Round the World,” ACM Software Engineering Notes, October 1981, pp. 3–10.


29 Stubs are used in place of modules that have not yet been developed. They act as procedures and return default values so that the software can execute before all of the procedures have been written and their interfaces checked.
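
As a purely illustrative sketch in C (the function names and default return value below are hypothetical and are not taken from the Shuttle software), a stub might stand in for an unwritten module like this:

    /* Illustrative stub: a placeholder that returns a fixed default so the
       calling code can be exercised before the real module exists. */
    #include <stdio.h>

    /* Stub standing in for a sensor-processing module not yet written. */
    static double read_altitude_stub(void) {
        return 0.0;                     /* safe default value */
    }

    /* Caller exercised against the stub; later it can be given the real
       implementation without changing this code. */
    static void guidance_cycle(double (*read_altitude)(void)) {
        double alt = read_altitude();
        printf("guidance cycle saw altitude %.1f\n", alt);
    }

    int main(void) {
        guidance_cycle(read_altitude_stub);
        return 0;
    }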


30 Keller, T.W. 1993, “Maintenance Process Metrics for Space Shuttle Flight Software” Forum on Statistical Methods in Software Engineering, National Research Council, Washington D.C., October 11-12, 1993.


31 Shuttle flight software errors are categorized by the severity of their potential consequences without regard to the likelihood of their occurrence. Severity 1 errors are defined as errors that could produce a loss of the Space Shuttle or its crew. Severity 2 errors can affect the Shuttle's ability to complete its mission objectives, while severity 3 errors affect procedures for which alternatives, or workarounds, exist. Severity 4 and 5 errors consist of very minor coding or documentation errors. In addition, there is a class of severity 1 errors, called severity 1N, which, while potentially life-threatening, involve operations that are precluded by established procedures, are deemed to be beyond the physical limitations of Shuttle systems, or are outside system failure protection levels.
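
Purely as an illustration, the severity categories described in this note could be expressed as a C enumeration; the identifiers and descriptive strings below are invented for this sketch and are not drawn from NASA's actual tooling.

    /* Illustrative encoding of the Shuttle flight-software error severity
       categories described above. Not actual NASA code. */
    #include <stdio.h>

    typedef enum {
        SEVERITY_1,    /* could produce loss of the Shuttle or its crew */
        SEVERITY_1N,   /* severity 1 consequence, but the triggering operation is
                          precluded by procedures, physical limits, or failure
                          protection levels */
        SEVERITY_2,    /* could affect the ability to complete mission objectives */
        SEVERITY_3,    /* affects procedures for which workarounds exist */
        SEVERITY_4,    /* very minor coding or documentation error */
        SEVERITY_5     /* very minor coding or documentation error */
    } DiscrepancySeverity;

    static const char *describe(DiscrepancySeverity s) {
        switch (s) {
        case SEVERITY_1:  return "loss of vehicle or crew possible";
        case SEVERITY_1N: return "severity 1 consequence, precluded in practice";
        case SEVERITY_2:  return "mission objectives affected";
        case SEVERITY_3:  return "workaround exists";
        default:          return "minor coding or documentation error";
        }
    }

    int main(void) {
        printf("Severity 1N: %s\n", describe(SEVERITY_1N));
        return 0;
    }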


32 Letter to Administrator, NASA, from Chairman, House Committee on Science, Space, and Technology, March 31, 1988.


33 Post-Challenger Evaluation of Space Shuttle Risk Assessment and Management, Aeronautics and Space Engineering Board, National Research Council, January 1988.




34 An Assessment of Space Shuttle Flight Software Development Processes, Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes, Aeronautics and Space Engineering Board, National Research Council, 1993.


35 Stan A. Smith and Michael A. Cusumano, Beyond the Software Factory: A Comparison of Classic and PC Software Developers, Massachusetts Institute of Technology, Sloan School WP #3607-93/BPS, September 1, 1993.


36 H. Hecht, “Investigation of Shuttle Software Errors,” SoHar Incorporated, study prepared for Polytechnic University, Brooklyn, New York, and the Langley Research Center, Hampton, Virginia, under NASA Grant NAG1-1272, April 1992.


37 NASA, “Shuttle System Failure Case Studies: STS-126,” NASA Safety Center Special Study, April 2009.


38 Anne Broache, “IRS Trudges on with Aging Computers,” CNET News, April 12, 2007, http://news.cnet.com/2100-1028_3-6175657.html


39 Mark Lewyn, “Flying in Place: The FAA’s Air Control Fiasco,” Business Week, April 26, 1993, pp. 87, 90.


40 Dan Eggen and Griff Witte, “The FBI’s Upgrade That Wasn’t,” The Washington Post, August 18, 2006, http://www.washingtonpost.com/wp-dyn/content/article/2006/08/17/AR2006081701485.html


41 Kirk Johnson, “Denver Airport Saw the Future. It Didn’t Work,” New York Times, August 27, 2005.

