“Organizational Behavior and Disaster: A Study of Conflict at NASA”

Write a two- to three-page (8.5” x 11”, one-inch margins, 11-point Arial, double-spaced) summary review of the material presented in the article as it impacts project management, both execution and quality. The intent of this assignment is a critical analysis of the material: what did you learn relative to the major project management issues, and what is YOUR conclusion as to whether or not YOU agree with the authors’ conclusions? It should NOT be a summary of the history of NASA or a collection of quotations. I’m interested in what you think.

Suggested discussion points include:

How and why was the triple constraint compromised? Why did competent technical staff who knew of issues that could jeopardize the success of the project hesitate to speak up? Do you believe “groupthink” is limited to NASA, or could it happen elsewhere? As project manager or technical lead of your own project, if you discover the project is going off track and the team is reluctant to speak up, what could you do?

ORGANIZATIONAL BEHAVIOR AND DISASTER: A STUDY OF CONFLICT AT NASA
ROBERT D. DIMITROFF, ChevronTexaco
LU ANN SCHMIDT, Exxon Mobil Global Services Company
TIMOTHY D. BOND, Transportation Industry Manufacturer

ABSTRACT
This paper examines how groupthink led to conflict in the National Aeronautics and Space Administration (NASA), with a focus on the Challenger and Columbia shuttle tragedies. We will show that, although there were technical causes of the accidents, there are deeper root causes that constitute a recurring thread. Throughout NASA’s history, there have been budgetary and scheduling constraints. In an attempt to meet these externally imposed restrictions, management has unconsciously and repeatedly fallen into the psychological tendencies of groupthink. A bulletproof attitude amongst NASA officials was a direct cause of the Challenger accident. Management began tolerating increasing amounts of “acceptable flight risks.” Management compromised safety, one of the quality components of the project management “triple constraint” of schedule, budget, and quality. As a result of this disdain for managing quality, the second accident occurred with a chilling sense of déjà vu. We will examine the root causes of the pressure on management, as well as the traps of conflict that have befallen management.

Keywords: risk management; constraints; conflict resolution; groupthink
©2005 by the Project Management Institute. Vol. 36, No. 1, 28-38. ISSN 8756-9728/03

Introduction

NASA has had an illustrious history of space exploration since its inception in 1958. Throughout that history, however, there have been several failures along with the many successes. The Challenger and Columbia accidents, the most tragic of those failures, are the focus of this paper. Groupthink-related conflict is evident at NASA, as exhibited by the actions taken leading up to these incidents. Political pressures related to cost and schedule are historically at odds with engineering quality and are, thus, detrimental to flight safety.

The project management profession uses the term “triple constraint,” whereby the project scope is constrained by cost, schedule, and quality. Quality can be further broken down into components such as risk, safety, and controls. In A Guide to the Project Management Body of Knowledge (PMBOK® Guide), the Project Management Institute identifies nine Project Management Knowledge Areas, including ones concerned with schedule, cost, quality, risk, scope, and human resources (Project Management Institute, 2000). When managing a project, one must understand the interaction between these Knowledge Areas and be able to manage tradeoffs when a change occurs. A change in one area typically impacts one or more of the other Knowledge Areas. Thus, in reality, the “triple constraint” represents the management of tradeoffs among the various Knowledge Areas (a toy illustration of this tradeoff appears at the end of this introduction). The political pressures faced by NASA helped lead to tradeoff decisions that favored cost and schedule over the quality components of risk and safety. Compounding these tradeoffs was a sense of invulnerability, and thus complacency, fashioned by a string of successful missions.

A summary of key events in NASA history is provided to build an appreciation for the management culture at NASA. Over time, NASA developed a culture built upon success that bred complacency, which ultimately led to groupthink. This conflict-related behavior was identified as one of the root causes of both shuttle crashes. We then look at the Challenger and Columbia incidents in more detail. These reviews are based on the published government reports and are focused on the management behavior seen throughout both incidents. When looking at management actions, haunting similarities between the two cases can be seen. Groupthink and its components of collective rationalization, pressure on others, self-censorship, invulnerability, and fear of separation are at the heart of the key technical problems that led to each accident.
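The tradeoff logic above can be made concrete with a small model. The sketch below is purely illustrative and is not from the paper, the PMBOK® Guide, or NASA; the class, the numbers, and the warning threshold are all invented for the example. It simply shows how, once cost and schedule are frozen from above, further schedule compression is silently absorbed by the quality margin.

```python
from dataclasses import dataclass

@dataclass
class TripleConstraint:
    """Toy model of the triple constraint (hypothetical numbers).

    Scope is treated as fixed, so squeezing one leg of the triangle
    must be absorbed somewhere else.
    """
    cost: float      # budget, arbitrary units
    schedule: float  # duration, arbitrary units
    quality: float   # quality/safety margin, 0.0 to 1.0

    def compress_schedule(self, factor: float, cost_is_fixed: bool) -> None:
        """Cut the schedule by `factor` (0.2 means 20% faster)."""
        self.schedule *= 1.0 - factor
        if cost_is_fixed:
            # With the budget frozen, quality is the only leg left to give.
            self.quality -= factor
        else:
            # Otherwise the speed-up is paid for in cost.
            self.cost *= 1.0 + factor

    def check_quality(self, floor: float = 0.5) -> None:
        if self.quality < floor:
            print(f"WARNING: quality margin {self.quality:.2f} is below "
                  f"{floor:.2f} -- renegotiate scope, cost, or schedule.")

project = TripleConstraint(cost=100.0, schedule=12.0, quality=0.9)
for _ in range(3):  # three successive rounds of schedule pressure
    project.compress_schedule(0.2, cost_is_fixed=True)
    project.check_quality()
```

The point of the toy is the branch inside compress_schedule: when cost and schedule are both dictated from above, quality becomes the only variable left, which is exactly the pattern the paper traces at NASA.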


In our analysis, we will show how both shuttle incidents are related to NASA culture, as exhibited throughout the history of the organization. A sense of invulnerability, based on technological success, bred complacency and, ultimately, groupthink types of behavior that placed program cost and schedule goals ahead of crew safety. Recommendations are then provided to indicate how future accidents can be avoided. These are geared towards management behavior and do not address technological solutions.

History

Throughout the history of NASA, safety issues were compromised during the decision-making process. The following is a summary of key events in NASA history, organized to show how years of success led to a sense of invulnerability, a classic symptom of groupthink (Janis, 1971, p. 44). This timeline is also presented to show that NASA regularly faced political pressures related to budgets and schedules, and correspondingly reacted to such politics. Lastly, this legacy shows the good (the commercial and technological successes resulting from various missions) along with the bad—tragic accidents.

NASA began operations on October 1, 1958, almost a year after the Soviet Union successfully launched Sputnik 1, the world’s first satellite, on October 4, 1957. Garber (2003) cites the following:

That launch ushered in new political, military, technological, and scientific developments. While the Sputnik launch was a single event, it marked the start of the space age and the U.S.-U.S.S.R. space race.…The Sputnik launch also led directly to the creation of the National Aeronautics and Space Administration (NASA). In July 1958, Congress passed the National Aeronautics and Space Act (commonly called the “Space Act”), which created NASA as of October 1, 1958.

Due to Cold War implications, NASA started off in the middle of a political pressure cooker. The Sputnik launch raised fears that the Soviets could soon launch ballistic missiles capable of delivering nuclear weapons. Hence, the race was on. Throughout its history, NASA has been a focal point for politicians bent on achieving lofty goals—while also instituting budget cuts, accelerating schedules, and clamoring for the next breakthrough. It is this history of political, budgetary, and schedule pressures that conflicted with acceptable risks, flight quality, and safety. These conflicts ultimately led to disaster for both Challenger and Columbia.

After NASA’s formation, early technological success was the norm. Pioneer 1 was the first NASA launch, on October 11, 1958. During this period, NASA exhibited a corporate personality of dominance. Such a personality is defined as “entrepreneurial, aiming for high goals, thriving on the challenge, aggressive and quick paced” (Davita, 2003, p. 28). In December, an Atlas rocket placed a communications satellite into orbit, allowing President Eisenhower to transmit a Christmas greeting—the first voice sent via space.

In March of 1959, Pioneer 4 became the first vehicle to fly by the moon. Later that year, in May, NASA proved its capability to launch a spacecraft (the Jupiter Bioflight 2 Test/Ionosphere mission) from Cape Canaveral and recover the vehicle in the Atlantic Ocean. Two monkeys had been placed aboard the vehicle, demonstrating NASA’s early commitment to minimizing human safety risks during testing (Launius & Fries, 2003).

In June of 1959, research pilot Scott Crossfield made the first unpowered glide flight in the X-15 hypersonic plane. This flight was part of the X-15 joint research program between NASA, the Air Force, the Navy, and North American Aviation, Inc. The X-15 program, which lasted until October 1968, contributed in important ways to the development of the Space Shuttle, including information from flights to the edge of space in 1961-63, in “what many consider to have been the most successful flight research effort in history” (Launius & Fries, 2003). This program showed a commitment by NASA to leverage resources with the Armed Forces and key aerospace contractors. In addition to advancing the technical knowledge needed for the space race, such programs were also famous for producing commercial products.

On May 25, 1961, President John F. Kennedy initiated the Apollo program, which was geared towards landing on the moon, in his Special Message to the Congress:

I believe this Nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to earth. No single space project in this period will be more impressive to mankind, or more important for the long-range exploration of space; and none will be so difficult or expensive to accomplish (Kennedy, 1961).

This speech summarized the political and technological challenges of the space race, as well as the potential sacrifices needed to reach the moon. It not only placed pressure directly on NASA, but also established the public reality of a massive spending program that required additional taxation. This pressure was realized at a meeting on June 7, 1962, when NASA leaders argued over an earth-orbit versus lunar-orbit rendezvous. After more than six hours of discussion, those in favor of an earth-orbit rendezvous finally gave in to the lunar-orbit rendezvous mode, saying, “Its advocates had demonstrated adequately its feasibility and that any further contention would jeopardize the President’s timetable” (Launius & Fries, 2003). This was an early indication of how political and schedule pressures impacted technological (and thus quality and risk) decisions. Pressure and stress are some of the key elements of groupthink (Janis, 1971, p. 44); the decision-making exhibited here was an early sign of groupthink behavior associated with President Kennedy and his administration.

A couple of early flight problems occurred during the early 1960s, despite the continued success of the NASA space program. The first occurred when the hatch blew off prematurely from astronaut “Gus” Grissom’s Mercury spacecraft.


He was hoisted to safety upon landing in the Atlantic, but nearly drowned in the incident. The flight was still viewed as a “success,” since NASA technicians were able to find solutions to many of the problems associated with suborbital flights. In February 1962, John Glenn became the first American to circle the earth. Radio stations transmitted messages from the astronaut across the United States and the United Kingdom (BBC News, 2003). Despite an autopilot failure and a faulty heat shield, he was able to safely guide his craft back to earth and received a hero’s welcome. These early failures highlighted the risks of the space program. NASA, however, exhibited a capability to learn from these problems and to create substantial knowledge and growing confidence within the close-knit space community.

The early to mid 1960s was a period of continued space program success. The Mercury, Gemini, and Saturn programs all had multiple successful launches during this time. Many of these missions tested and confirmed the technology required to land on the moon. Benefits from the space program were also realized as the first commercial communication satellites were launched.

Then, on January 27, 1967, NASA suffered its first serious setback in the race to the moon. Apollo 1, then named Apollo-Saturn (AS) 204, caught fire during a launch simulation, killing astronauts Gus Grissom, Ed White, and Roger Chaffee. These were the first deaths directly attributable to the U.S. space program. A review board was formed to investigate the accident. In addition to its findings on the technological causes, the board also noted several organizational factors that contributed to the accident (Garber, 2003). As an outcome of this report, Congress established the Aerospace Safety Advisory Panel (ASAP). The charter of this organization was to:

Review safety studies and operations plans … review, evaluate, and advise on those program activities, systems, procedures, and management policies that contribute to risk and provide identification and assessment for management. Priority will be given to those programs that involve the safety of human flight. (NASA, 2003)

As a result of these findings and the oversight provided by the ASAP, NASA established and followed rigorous procedures aimed at protecting the safety of launch crews. After the Apollo 1 accident, the Apollo program went into hiatus until the spacecraft could be redesigned. Apollo flights 2 through 6 were used to test various aspects of the program, such as the propulsion system, launch pad, and spacecraft, and the program returned to flight status with Apollo 7 in October 1968 (Launius & Fries, 2003). The flights of Apollo 4, Apollo 7, and Apollo 8 in 1967 and 1968 proceeded to show that NASA had conquered some of the technical and safety challenges, allowing the race to the moon to continue. The flight of Apollo 8 was described heroically as:

an enormously significant accomplishment coming at a time when American society was in crisis over Vietnam, race relations, urban problems, and a host of other difficulties. And, if only for a few moments, the nation united as one to focus on this epochal event. (Launius & Fries, 2003)

Such a description of this flight shows how the space program galvanized the U.S. public in a time of strife and created a string of heroes. Apollo 9 and 10 were also successful, leading to the first lunar landing. Apollo 11 lifted off on July 16, 1969. Four days later, Neil Armstrong became the first human to set foot on the moon, adding to the legacy of NASA. Later that year, a special task force met to plot a new course for the space program, including the development of a space station and a reusable space shuttle (Launius & Fries, 2003).

The flight of Apollo 13 again created heroes. This time they were located on the ground as well as in the air. The following reference to NASA’s technological genius is indicative of the building culture of invulnerability. Invulnerability, also a symptom of groupthink (Janis, 1971, p. 44), has its roots in the success of the program, nationalistic pride, and heroism. According to Launius and Fries (2003):

The flight of Apollo 13 was one of the near disasters of the Apollo program.… The near disaster served several important purposes for the civil space program—especially prompting reconsideration of the propriety of the whole effort while also solidifying in the popular mind NASA’s technological genius.

Throughout the 1970s, NASA continued with a string of technological successes that also resulted in substantial commercial benefits. For instance, flight research by NASA drastically improved commercial airline fuel efficiency, as well as flight controls. This flight research led to the eventual flight of the first space shuttle. Additional benefits were realized by the launching of Landsat and other satellites. Among the successes, NASA also overcame technical problems that developed during the Skylab program, developing solutions to ensure the success of Skylab experiments and to allow for safe reentry of the vehicle. Technical know-how again was demonstrated as a factor in mitigating problems and further contributed to a sense that all failure could be overcome.

In 1977, the first space shuttle orbiter, Enterprise, took to the air. Again, NASA enjoyed success, and the shuttle flights increased. Space Shuttle Columbia first flew on April 12, 1981, and Space Shuttle Challenger first flew on April 4, 1983. Throughout the early and mid 1980s, NASA conducted many successful shuttle flights. These missions proved commercially successful, as many new satellites were placed into orbit. Flights became virtually routine, and the proud agency continued going “where no man had gone before.”

The reality of the risks associated with space exploration was poignantly reinforced on January 28, 1986, when the Space Shuttle Challenger exploded 73 seconds into its flight, killing all seven astronauts.


NASA created the Office of Safety, Reliability, Maintainability and Quality Assurance in response to the accident, which closely parallels the formation of ASAP after Apollo 1. The string of successes, accompanied by the need to overcome periodic failures, had finally caught up with NASA. The natural response was to retrench and focus on improving quality and safety, while reducing risk. NASA’s personality had changed from one of dominance to one of a critical thinker, defined as “striving for more stability, setting higher standards, and systematic thinking” (Davita, 2003, p. 31).

After a short hiatus, shuttle flights began again, along with more success—both technological and commercial. Multiple missions were conducted through the remainder of the 1980s and 1990s without incident. One of the “successes” was the launching of the Hubble Space Telescope in 1990. Here again, NASA exhibited extraordinary capabilities in overcoming the technical problems with Hubble, ultimately resulting in many astronomical discoveries by the telescope. Throughout the 1990s, NASA also experienced success launching various space probes. Galileo was launched to explore Jupiter; once again, technical problems were overcome to allow the mission to continue. The Mars Observer was sent to explore the “Red Planet”; despite its collection of significant data, this probe did not complete its mission because contact with the spacecraft was lost. However, since this was an unmanned vehicle, the failure went largely unrecognized.

The Space Shuttle continued to fly in support of the new International Space Station, “the most complex space project ever.” Many shuttle missions were flown over a five-year period beginning in late 1998 to support the station. These flights were very successful in their tasks, and also had political implications by showing the U.S. ability to work with Russia towards a common goal. Unfortunately, this pattern of success mirrored previous history, where confidence led to complacency and the recurring sense of invulnerability. On February 1, 2003, the Space Shuttle Columbia broke up upon reentry, killing all seven astronauts aboard. Since then, NASA has once again grounded flights and has refocused on the quality, risk, and safety aspects of the program.

The rest of NASA history is yet to be made. As we look at the causes of the two shuttle accidents, each resulting in a total loss of life and equipment, there will be an ensuing discussion of the causes behind the accidents and whether NASA is following the same flawed operating procedure. There were specific recommendations from both committees established after the shuttle accidents. Some of the recommendations put forth by the Rogers Commission after the Challenger accident may not have been implemented in totality. If so, could the lack of implementation of these recommendations have been a cause of the Columbia accident?

Groupthink

Before delving into the Challenger and Columbia accidents, we will first look at the concept of “groupthink.” Irving Janis coined this term in 1971 after his analysis of the Bay of Pigs invasion, Vietnam, Korea, and Pearl Harbor.

He noted that a common thread tied together incidents where the participants had a “desperate drive for consensus at any cost that suppresses dissent among the mighty in the corridors of power” (Janis, 1971, p. 43). We will examine several components of groupthink observed in the Challenger and Columbia incidents. These include:

Invulnerability – The “group shares an illusion of invulnerability” that causes them to “become over-optimistic and willing to take extraordinary risks.”

Collective Rationalization – The group begins to “ignore warnings” and “construct rationalizations in order to discount warnings” that current information may conflict with past decisions.

Pressure on Others – The group applies “direct pressure to any individual who … expresses doubts about any of the group’s shared illusions.”

Self-Censorship – People in the group do not want to disagree with the “group consensus” and “keep silent about their misgivings.”

Unanimity – The group has an “illusion of unanimity” because people with different views do not speak up, and thus the group feels that everyone is “in full accord with what the others are saying.” (Janis, 1971, pp. 44-45)

We will also examine the concept put forth by Jerry B. Harvey in his “Abilene Paradox” paper involving the “management of agreement” (Harvey, 1974). In this case, the participants in the decision went along with the others so as not to be a “stick-in-the-mud,” and to avoid being different from the group. This ties directly to a “fear of separation” seen in groups, which was also noticed by Wilfred Bion (Rioch, 1970, p. 7). We will examine these symptoms of groupthink, along with the concept of fear of separation from the group, to investigate their effects on management actions for both Challenger and Columbia.

The Challenger Accident

On January 28, 1986, the space shuttle Challenger exploded. Most people who are familiar with the facts say administrative causes of the accident included poor communication, pressures to launch, political influence, and differing views of risk among various managers. The Presidential Commission on the Space Shuttle Challenger Accident was formed to investigate the accident and reported a number of findings. The primary cause of the explosion was apparently a combustion gas leak through the right solid rocket motor aft field joint that weakened or penetrated the external tank; this, in turn, led to subsequent failure. A quick summary of the findings is useful in order to proceed. The findings include a few interesting facts that will eventually be discussed in more detail.


While there were many observations made in the report, the majority of the problems were not due to launch pad procedural violations or out-of-tolerance parts; the primary failure resulted from the reduced resiliency of the O-rings in cold weather, and one key finding concerned behavior indicative of a dysfunctional organization. The Presidential Commission on the Space Shuttle Challenger Accident (a.k.a. the Rogers Commission) made recommendations addressing the technical specifications of the faulty solid rocket motor joint and seal. There were also recommendations made as to the shuttle management structure: the project managers for the various elements of the shuttle program felt more accountable to their center management than to the shuttle program organization (Rogers Commission, 1986). Other recommendations included the formulation of a policy to both impose and remove constraints, a revamping of the management structure, and a variety of lesser recommendations.

Groupthink—Invulnerability

In his book Vital Lies, Simple Truths: The Psychology of Self-Deception, Dan Goleman cites a passage credited to Irving Janis:

The leader does not deliberately try to get the group to tell him what he wants to hear … nevertheless, subtle constraints, which the leader may force inadvertently, prevent a member from fully exercising his critical powers and from openly expressing doubts when most others in the group appear to have reached consensus. (Goleman, 1985, p. 181)

Janis points out that a leader may unknowingly constrain a group; groupthink is a natural outcome when a group faces pressure from above. Is it possible that some of these subtle constraints also existed at NASA, resulting in groupthink?

With the Challenger explosion, one has to look at some of the key reasons that led to the disaster. As documented by the commission, there was not necessarily conflict at NASA or at Morton Thiokol, but more a hesitancy to “raise a red flag.” Questions were raised, but the flag was not waved high enough. During the course of the investigation, it was discovered that NASA and Morton Thiokol had vigorously debated the wisdom of operating the shuttle in the cold temperatures predicted for the next day. The investigation also revealed a NASA culture that gradually began to accept escalating risk, and a safety program that was largely silent and ineffective (CAIB, 2003, p. 25). The argument between Morton Thiokol and NASA regarding launch-time temperatures should have resulted in an investigation. It appears the decision to launch in colder temperatures was brought about by feelings of invulnerability due to the success of the Apollo program, as well as the fact that they had flown previously in somewhat similar conditions. NASA management still pushed to meet its schedule, and subconsciously relegated safety and quality to a minor role.

During the pre-flight meeting, Morton Thiokol and NASA management came to see the cold temperature problem as an acceptable flight risk—a violation of a design requirement that could be tolerated (CAIB, 2003, p. 100). Taking into account the argument between the contractor and NASA the night before the launch, as well as the organizational attributes of NASA’s culture, any discussion raised as to the safety of the mission should have been dealt with in a much more proactive manner. The commission placed fault with both NASA and Morton Thiokol. Problem areas identified by the commission after the incident included: certification that the O-rings would function properly in the colder temperatures, willingness to accept escalated risk (due to the success of the Apollo program), and failure to implement corrective action at the O-ring test prior to the Challenger flight.

Groupthink—Collective Rationalization and Pressure on Others

After Morton Thiokol engineers warned about the O-ring problem, NASA discounted their opinion and urged them to reconsider. Morton Thiokol then held an internal conference, reversed its opinion, and announced that the Challenger was flight ready. Here we can see the effects of groupthink: Morton Thiokol collectively rationalized its earlier decision and reversed it, all due to pressure and rationalization from NASA. The end result was that Morton Thiokol agreed there was no additional concern for flight safety. Verifiable documentation is available that sums up the Challenger incident thus: “The decision to launch the Challenger was flawed.” Communication failures, incomplete and misleading information, and poor management judgment all figured in a decision-making process that permitted, in the words of the commission, “internal flight safety problems to bypass key shuttle managers” (CAIB, 2003, p. 100).

In another instance of what could be termed “collective rationalization,” Steven P. Feldman uses a citation from a study done by K. E. Weick, where Weick cites the work of Diane Vaughn:

The teleconference participants were the medium by which the invisible hand of NASA’s rules, beliefs, and procedures converted [risk] uncertainty to certainty. The system knew that the Challenger should not be launched under conditions that were this far outside the experience base. But this dispersed knowledge could not be assembled credibly, because the people who ran the teleconference were far removed from the murky technology, shims, improvisation, and tacit understanding that engineers used to make the shuttle fly. (Feldman, 2000, pp. 474-490)

Groupthink—Self-Censorship and Unanimity

One of the factors that contributed to the decision to discount the warning from the Morton Thiokol engineers was the homogeneous structure of the leadership at NASA. There was a commonality amongst NASA personnel, who were technically educated with advanced degrees in the hard sciences.


When a group of people have similar backgrounds, it creates a culture based on common attributes; in this case, similar educational levels within NASA contributed to the homogeneity of the decision-making group. This also contributed to the cohesiveness of the group, which Janis mentions when he cites Marvin Shaw in his article on groupthink. Shaw states that “high-cohesive groups are much more effective than low-cohesive groups in achieving their goals” (Janis, 1986). Janis also held that the “superglue” of solidarity that bonds people together often causes their mental processes to get stuck.

Fear of Separation

In addition to groupthink, the engineers at Morton Thiokol also suffered from a fear of separation. This became apparent when they revised their initial warnings about O-ring failure at the behest of NASA management. They wanted to remain part of the team, as well as to continue receiving their fees for the ongoing support required. Even though the launch decision was political in nature, the fear of separation, as discussed by Harvey, also played a role in the revision of the warnings. Harvey discusses the notion that “action anxiety” occurs when people fear the “negative fantasies” they believe will befall them if they follow their beliefs about what is right (Harvey, 1974). This is actually a fear of separation from the peer or work group. It appears that the fear of separation also contributes to the acceptance of groupthink. This was corroborated in a detailed analysis by Diane Vaughn (as described by Feldman) when, after much research, she came to the conclusion that there was a “normalizing of deviance”: all danger signals were interpreted as acceptable risks. The process by which this is explained is “institutional theory” (Feldman, 2000, p. 477). NASA deemed the risks to be acceptable, while the individuals who voiced their initial concerns were comforted by the acceptance of the group. This theory states that group acceptance is a large factor when dealing with controversial issues. The need to be accepted causes certain people to alter their opinion to be part of the group, thereby increasing the chances of jointly arriving at an erroneous decision.

The Columbia Accident

Ignoring Project Management Constraints as a Root Cause

While the Challenger accident resulted from groupthink during a very short management meeting, leading to a decision to proceed with the flight in cold temperatures, the Columbia accident decisions took place over several days with a much larger cast of characters. According to the Columbia Accident Investigation Board (CAIB):

The physical cause of the loss of Columbia and its crew was a breach in the Thermal Protection System on the leading edge of the left wing, caused by a piece of insulating foam which separated from the left bipod ramp section of the External Tank at 81.7 seconds after launch, and struck the wing in the vicinity of the lower half of Reinforced Carbon-Carbon panel number 8. During re-entry this breach in the Thermal Protection System allowed superheated air to penetrate through the leading edge insulation and progressively melt the aluminum structure on the left wing, resulting in a weakening of the structure until increasing aerodynamic forces caused loss of control, failure of the wing, and breakup of the orbiter. (2003, p. 9)

However, the Columbia accident’s root cause can be directly tied to management’s unwillingness to understand the tradeoffs among the triple constraints of schedule, cost, and quality. The project manager’s primary job is to balance these three objectives while still meeting scope, making acceptable tradeoffs and “integrating all aspects of the project” (Meredith & Mantel, 2000, p. 4). In the Columbia program, NASA continued the historical pattern of succumbing to political pressures to reduce budgets and accelerate schedules, while believing that quality would not suffer. Unfortunately, in this case, without enough funding or schedule relief, quality was severely compromised. This reduction in quality increased overall risks, so that safety also suffered, with catastrophic consequences. CAIB referenced this theme throughout its extensive 2003 accident analysis, stating:

NASA had conflicting goals of cost, schedule, and safety. Safety lost out as the mandates of an “operational system” increased the schedule pressure. Scarce resources went to problems that were defined as more serious, rather than to foam strikes or O-ring erosion. (p. 200)

Following the Challenger incident, the White House replaced the NASA administrator in 1992 with Daniel S. Goldin, who served until 2001. Goldin tried to follow the principles of quality guru W. Edwards Deming and, in doing so, made frequent changes to procedures in search of the “faster, better, cheaper” approach, which became the new NASA motto. This method, in the name of cost cutting and efficiency, threw out many of the checks and balances developed over time (CAIB, 2003, pp. 106-107). Goldin believed that operating “faster” and “cheaper” would not result in a lowering of quality and safety. He is quoted as saying in 1994, “When I ask for the budget to be cut, I’m told it’s going to impact safety on the Space Shuttle … I think that’s a bunch of crap” (CAIB, 2003, p. 106).

This attitude carried forward after Goldin was replaced by Sean O’Keefe in 2001. O’Keefe admitted that he was “not a rocket scientist,” but rather that his expertise was in the management of large government programs. His appointment was an explicit acknowledgement by the new Bush administration that NASA’s primary problems were managerial and financial (CAIB, 2003, p. 115). Under O’Keefe, management was driven to meet a predetermined schedule, because he had promised Congress that NASA would deliver on the “Core Complete” date of February 19, 2004. The Core Complete date was the deadline for delivering the final U.S. part of the International Space Station (CAIB, 2003, p. 131). The Board discovered that:


At first glance, the Core Complete configuration date seemed noteworthy but unrelated to the Columbia accident. However, as the investigation continued, it became apparent that the complexity and political mandates surrounding the International Space Station Program, as well as Shuttle Program management’s responses to them, resulted in pressure to meet an increasingly ambitious launch schedule.… If this goal was not met, NASA would risk losing support from the White House and Congress for subsequent Space Station growth. (CAIB, 2003, p. 131)

With the intense focus on schedule, management began a pattern of ignoring or suppressing conflict that would jeopardize the schedule. During the initial discussions of Columbia’s foam strike, it was apparent to the Board that “most of the Shuttle Program’s concern about Columbia’s foam strike were not about the threat it might pose to the vehicle in orbit, but about the threat it might pose to the schedule” (CAIB, 2003, p. 139). So, at the time of the Columbia launch, management had systematically, over the previous 10 years, removed the checks and balances implemented after Challenger, reduced costs, and focused on the schedule. This left the quality leg of the project management triangle as the only flexible constraint. In order to meet the other two legs of the triangle, management unconsciously compromised quality, resulting in lower safety and increased risk.

Groupthink—Invulnerability

The symptom of invulnerability emerges when the group becomes “over-optimistic and willing to take extraordinary risks” (Janis, 1971, p. 44). NASA management and culture had reached a mindset of resistance to externally imposed changes, persisting in the internal belief that NASA was still a “perfect place,” and that NASA was “alone in its ability to execute a program of human space flight” (CAIB, 2003, p. 102). Managers continued to “maintain their view of the organization” and “they lost their ability to accept criticism, leading them to reject the recommendations of many boards and blue-ribbon panels” (CAIB, 2003, p. 102). Even while discussing the foam strikes, management “turned the experience of failure into the memory of success” (CAIB, 2003, p. 181).

Groupthink—Collective Rationalization and Pressure on Others

NASA management also began to enter the realm of collective rationalization. They began to “construct rationalizations in order to discount warnings … that, taken seriously, might lead the group members to reconsider their assumptions each time they recommit themselves to past decisions” (Janis, 1971, p. 44). In addition, they began to “apply pressure to any individual who … expresses doubts” (Janis, 1971, p. 74). CAIB (2003) also noted that:

External criticism and doubt, rather than spurring NASA to change for the better, instead reinforced the will to “impose the party line vision on the environment, not to reconsider it”…. This in turn led to “flawed decision making, self deception, introversion and a diminished curiosity about the world outside the perfect place.” (p. 102)

Management began to suppress information that did not agree with its own perception of the world, as noted by the Board:

Communication did not flow effectively up to or down from Program managers. As it became clear during the mission that managers were not as concerned as others about the danger of the foam strike, the ability of engineers to challenge those beliefs greatly diminished. Managers’ tendency to accept opinions that agree with their own dams the flow of effective communications. (CAIB, 2003, p. 169)

In fact, even before any analysis of the foam strike took place, the “Shuttle Program managers officially shared their belief that the strike posed no safety issues, and that there was no need for a review to be conducted over the weekend” (CAIB, 2003, p. 142). Other evidence of suppression occurred in 1995 when, due to budget cuts, the parallel contractor and government engineering teams were reduced to one team. This prompted a senior engineer, Jose Garcia, to risk his job by sending a letter to President Clinton, bypassing the NASA hierarchy, and stating:

Historically NASA has employed two engineering teams …, one contractor and one government, to cross check each other and prevent catastrophic errors … although this technique is expensive, it is effective, and it is the single most important factor that sets the Shuttle’s success above that of any other launch vehicle…. Anyone who doesn’t have a hidden agenda or fear of losing his job would admit that you can’t delete NASA’s checks and balances system of Shuttle processing without affecting the safety of the Shuttle and crew. (CAIB, 2003, p. 108)

Unfortunately, this warning was not heeded, and the operations contract was awarded to a sole contractor. At that point, there was no counterbalancing group to offset any mistakes made by the contractor. The management team used collective rationalization to sustain the belief that its preconceived ideas were correct, and pressured those who did not agree.

Groupthink—Self-Censorship and Unanimity

After the Columbia liftoff, during discussions about the foam strike, the management team began to seek consensus and rejected any evidence that did not fit its preconceived notions. They had decided that the foam condition was of no immediate consequence, but might be a maintenance problem later. Their focus on schedule drove them to ignore possible current danger to the mission in orbit.


According to Janis, they showed the effects of self-censorship, where they began to “keep silent about their misgivings and even minimize to themselves the importance of their doubts” (Janis, 1971, p. 74). They also began to seek the “illusion of unanimity within the group … any individual who remains silent during … discussion is in full accord with what the others are saying” (Janis, 1971, p. 74).

Immediately after the foam strike, three separate requests from engineers were made to obtain military satellite imaging of the Columbia to look for damage. These requests, however, “were ultimately dismissed by the Mission Management Team,” and the denials were “then interpreted by the Debris Assessment Team as a direct and final denial of their request for imagery” (CAIB, 2003, p. 152). In fact, the management team had already begun self-censorship. As CAIB (2003) explained:

[They] resisted new information. Early in the mission it became clear that the Program was not going to authorize imaging because … images were not needed … evidence indicates that Program leaders decided the foam strike was merely a maintenance problem long before any analysis had begun. (p. 181)

Management continued to self-censor by focusing “on the information that tended to support the expected or desired result at that time … believing the safety of the mission was not at risk, managers drew conclusions that minimized the risk of delay” (CAIB, 2003, p. 200). In addition, management did not probe or question for more information. Instead, the Chair of the Mission Management Team, Linda Ham, actively turned down requests for imaging, citing budget concerns about spending the resources; yet she later “publicly stated she did not know of the Debris Assessment Team member’s desire for imaging” (CAIB, 2003, p. 153). Most likely, she unconsciously blocked out any actions that would delay the next launch, for which she would be the Launch Integration Manager.

All of these activities combined to produce an atmosphere of fear, where concerns could not bubble up to the correct managers. In fact, “The engineers found themselves in the unusual position of having to prove that the situation was unsafe—a reversal of the usual requirement to prove a situation is safe” (CAIB, 2003, p. 169). The conclusion that the foam breach posed no risk to the shuttle was further given the illusion of unanimity by the ineffective safety board. During the meetings where people from the safety board were present, they were:

… passive and did not serve as a channel for the voicing of concerns or dissenting views.… they were merely party to the analysis process and conclusions instead of an independent source of questions and challenges.… The highest-ranking safety representative at NASA headquarters deferred to Program managers when asked for an opinion on imaging of Columbia. The safety manager he spoke to also failed to follow up. (CAIB, 2003, p. 170)

In these incidents, the management team insulated itself from reality by self-censoring facts that did not fit its ideas, and also created an illusion of unanimity by not having any alternate views presented forcefully by the safety board.

Fear of Separation

Fear of separation, as described by Harvey (1974), was again seen at NASA in several incidents surrounding the Columbia accident. In one case, the Debris Assessment Team, when asked why they did not push their concerns about damage further, replied that “by raising contrary points of view about Shuttle mission safety, they would be singled out for possible ridicule by their peers and managers” (CAIB, 2003, p. 169). In another incident, engineer Rodney Rocha composed a long e-mail with his concerns and reasons for requesting imaging of the foam strike area. His comments were prescient:

In my humble technical opinion, this is the wrong (and bordering on irresponsible) answer from the SSP and Orbiter not to request additional imaging help from any outside source. I must emphasize (again) that severe enough damage (3 or 4 multiple tiles knocked out down to the densification layer) combined with the heating and resulting damage to the underlying structure at the most critical location … could present potentially grave hazards…. Remember the NASA safety posters everywhere around stating, “If it’s not safe, say so?” Yes, it’s that serious. (CAIB, 2003, p. 157)

However, he did not send this e-mail. When questioned why, he said that “he did not want to jump the chain of command … he would defer to management’s judgment on obtaining imagery” (CAIB, 2003, p. 157). In both of these cases, good people who had valid points, and valid reasons for raising them to higher management, were suppressed. They did not fight harder because they did not want to be separated from their peer group, or to be considered troublemakers by management.

The Safety Advisory Panel

The Columbia Accident Investigation Board was very critical of the lack of checks and balances between NASA and the Safety and Mission Assurance organization in the Shuttle program. It noted that the safety organization, “due to a lack of capability and resources independent of the Shuttle program, was not an effective voice … safety personnel present … were largely silent during the events leading up to the loss of Columbia” (CAIB, 2003, p. 192). Following this scathing criticism, nine safety experts quit in September 2003, noting that they felt “a very big sense of frustration” because NASA was not receptive to the changes they recommended (CBS News, 2003). Again, we can see that individuals were trying to make changes, but, due to the safety panel’s lack of independence from NASA itself, their concerns were suppressed or ignored by NASA management.


Despite the fact that an independent safety organization had been recommended by the Rogers Commission 17 years earlier, “little to no progress has been made toward attaining integrated, independent, and detailed analyses of risk to the Space Shuttle system” (CAIB, 2003, p. 193).

Analysis: Comparison of Challenger and Columbia

NASA, as a governmental entity, is subject to regular political pressures. It was formed to help the U.S. compete in the space race with the Soviet Union. Throughout its history, we have seen that political pressures were commonplace and were usually linked with related schedule and budgetary constraints. Despite these hurdles, NASA was able to conquer the challenges and achieve mission objectives. The pride and ego created by these successes helped NASA rationalize its tolerance of risk and quality tradeoffs against cost and schedule.

In both the Challenger and Columbia tragedies, budget and schedule were constrained. Management was trying to meet a schedule, which in its eyes became a more important factor than flight safety. In the case of Challenger, there was a willingness to take a certain amount of “acceptable” risk. This risk-tolerant attitude can be attributed to NASA’s proclivity for solving technical problems “on the fly.” Thus, the risk of the forecasted low temperature was ignored. The desire to maintain flight schedules, not the O-rings, dominated the thought processes of the decision-makers. For Columbia, schedule pressures to meet the space station timeline were again at the root of decision-making.

Symptoms of groupthink were also present in both cases. For Challenger, this happened in the one meeting held to decide the flight readiness of the shuttle at low temperatures. For Columbia, the groupthink manifested itself in the Mission Management Team. This team had a set of preconceived notions about the lack of flight danger after foam strikes and thus inadvertently suppressed information that would likely have led to a different conclusion.

Organizational structure was also a common thread between the accidents. The bureaucratic structure created an impediment to the receipt of important data. Self-censorship within the organization resulted in inaccurate information for decision-making. Management could have overcome these challenges by probing for answers, or by playing devil’s advocate. In the end, this lack of communication created an atmosphere where the engineers were afraid to speak out due to a fear of separation from their group.

In analyzing these two incidents, it becomes evident that the culture of NASA is a central factor in both cases. History has shown that NASA placed great emphasis on safety in the early years. As time progressed, the string of unprecedented successes, the ability to overcome technological challenges, and the heroic labels placed on the program led to complacency and a sense of invulnerability. Ultimately, NASA management began to trade away this safety emphasis by reducing quality in response to schedule and cost constraints.

Recommendations

Utilizing the advantage of hindsight, we have developed recommendations for both NASA and other organizations to help avoid the problems of groupthink that can lead to tragic outcomes.

Independent Quality Review

At NASA, the supposedly independent safety board was actually ineffective at providing checks and balances to the operational organization. This problem was supposed to be fixed after Challenger, but the independent safety board was gradually dismantled as the pressures of cost became too great. An independent organization needs to promote “minority opinions” and “bad news” in order to ensure that important information is not suppressed (CAIB, 2003, p. 181). In many companies, experts from other work groups perform such audits. These reviews are conducted to provide a diverse opinion of the work; this type of process challenges work-team assumptions and minimizes the potential for single-minded alternatives. The independent reviewer should not receive budgetary funding from the group being reviewed; the reviewer must be truly independent and impartial in order to give a “cold eyes” review. Such a review should also examine the state of affairs of the organization. Even when things are going well and everything is running smoothly, management should stay vigilant for areas where problems may be festering. (Fortunately, companies usually rally around problem-solving situations.) When things are going well, people have a tendency to become complacent. At such times, situations can arise that shake the foundation of the organization if people “let their guard down”; this is precisely what happened at NASA, where a pervasive groupthink tendency prevented concerns about O-rings and foam strikes from being recognized.

Open Communications

Organizations need a way for lower-level employees to call a “hot line” or use other means of reaching management directly with important information. Management must not penalize people who use this direct line, and should also encourage multiple other channels of communication to ensure that it has correct information. When working on extremely critical issues, Janis recommends that management set up multiple groups under different leadership to work on the same problem (Janis, 1971, p. 76). He also recommends assigning a devil’s advocate to take opposing positions during meetings (Janis, 1971, p. 76).

Project Renegotiation

Organizations should perform a complete analysis whenever any of the three constraints commonly used in project management—cost, quality, and schedule—is at risk. For the entire duration of NASA’s space exploration, schedule was the critical driver. This started with the U.S.-Soviet Cold War and the associated space race. More recently, it was political factors that resulted in schedule pressures up to, and including, the Columbia disaster. As noted in the Columbia section, groupthink was definitely a prime cause of the accident.


If any of the triple constraints is in danger of making a project fail, a viable risk management plan must be put into place; such a plan failed to materialize throughout the age of space exploration. In such a situation, the scope of the project should be renegotiated, instead of ignoring the other two constraints—which would also be affected by the failure of any one constraint. For example, if the schedule were slipping, holding it fixed anyway would be expected to produce corresponding failures in quality and cost. Complete renegotiations must be adopted to avoid future disasters (a minimal sketch of such a renegotiation trigger follows these recommendations).

Leadership

The Apollo 1 fire resulted in death, and ASAP was formed as a result of the findings of the Apollo 1 Review Board. Nineteen years later, the Challenger explosion resulted in death. Another 17 years later, the Columbia breakup resulted in death. Recommendations similar to the Apollo 1 findings were made as a result of the Challenger and Columbia investigations. Safety concerns were at the heart of these investigations; however, safety issues continue to occur. Many unmanned missions have also ended in failure, with loss of equipment; NASA’s history is dotted with failures alongside its successes. What is at issue is how NASA management behaves regarding safety. The ultimate indicator of NASA’s safety behavior is the resignation of the safety panel in September of 2003: panel members resigned because management would not respond to their concerns. Our recommendations herein are geared towards addressing this behavior. We concur with the need for a safety panel. We additionally feel there must be a way to ensure that the organization does not fall into the groupthink trap again. Until top management puts primary emphasis on safety (quality) instead of cost or schedule, there will continue to be incidents with loss of life and/or equipment. Cost or schedule can be mitigated by further negotiation, but safety should not be compromised. Within NASA, there was always the intent to prevent accidents from happening, but, due to cost and/or schedule pressures, safety took a back seat.

Elimination of Group Behavior

In both the Challenger and Columbia disasters, there were multiple occurrences of the prime components of groupthink: a sense of invulnerability, collective rationalization, pressure on others, self-censorship, unanimity, and fear of separation. Steps must be taken to eliminate these aspects of group behavior. One step might be to remove all cost and schedule pressures until safety concerns have been addressed to the fullest. A revision to the mission statement or scope might use wording such as: “Obtain the best design first; schedule and cost are secondary to design. The design needs to have safety as attribute #1.”
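To make the Project Renegotiation recommendation concrete, the following is a minimal sketch of the kind of gate it implies. Nothing here comes from the paper or from NASA practice; the field names, the 10% tolerance, and the review flow are all invented for illustration.

```python
# Hypothetical "renegotiation trigger" for the triple constraint.
# All names and thresholds are invented for illustration only.

def breached_constraints(planned: dict, actual: dict,
                         tolerance: float = 0.10) -> list:
    """Return the legs (cost, schedule, quality) that have drifted
    from plan by more than `tolerance` (10% by default)."""
    breached = []
    for leg in ("cost", "schedule", "quality"):
        drift = abs(actual[leg] - planned[leg]) / planned[leg]
        if drift > tolerance:
            breached.append(leg)
    return breached

def review(planned: dict, actual: dict) -> None:
    breached = breached_constraints(planned, actual)
    if breached:
        # The recommendation's point: do not quietly absorb the breach
        # in the remaining legs -- reopen scope with the sponsor.
        print(f"Constraints breached: {breached}. "
              "Renegotiate scope before proceeding.")
    else:
        print("All constraints within tolerance.")

# Example: schedule compressed 25% against plan; quality margin down 30%.
review(planned={"cost": 100.0, "schedule": 12.0, "quality": 1.0},
       actual={"cost": 104.0, "schedule": 9.0, "quality": 0.7})
```

The design choice worth noting is that the gate reports every breached leg rather than the first one found: a slipping schedule is treated as a scope problem, not as license to squeeze the quality leg silently.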

Conclusions

In summary, this examination of NASA and the Challenger and Columbia accidents offers important lessons to all organizations. We find that it was not solely the technical problems that were at fault, but, instead, management’s refusal to acknowledge the triple project management constraints of budget, schedule, and quality. When cost and time pressures were introduced, NASA failed to understand the tradeoffs associated with the lowering of quality, which resulted in lower safety and higher risk. When one of the constraints is changed, one must manage the tradeoffs that will occur. In addition, management easily fell into a groupthink mode, which colored their decisions and caused them to reject information that did not match preconceived ideas. In this example from NASA, as well as in many other corporate examples (such as Enron), quality (which includes safety, risk, and controls) was compromised to meet a budget or schedule. This refusal to believe in the constraints is often seen at project reviews, where the project executive fixes the schedule and budget, yet also wants to retain all product features. Unless managed properly, it is not possible to have “faster, better, cheaper,” as advocated by Mr. Goldin at NASA.

NASA has accomplished many great things over the last four and a half decades. The failures that resulted in the deaths of astronauts cast a heavy shadow over this success. This study makes it apparent that management must always be vigilant about groupthink and work to prevent it by improving communication and by encouraging alternate opinions. A lack of awareness could easily turn into a terrible tragedy, as it has multiple times at NASA. With revised priorities and practices, we strongly feel that NASA can continue its legacy of going where no man has gone before…safely.

References

BBC News. (2003, February 20). On this day: 1962: American spaceman rounds the world. Retrieved November 6, 2003, from http://news.bbc.co.uk/onthisday/hi/dates/stories/february/20/newsid_2552000/2552161.stm

CBS News. (2003, September 23). NASA safety panel members quit. Retrieved November 6, 2003, from http://www.cbsnews.com/stories/2003/10/03/tech/main576526.shtml

Columbia Accident Investigation Board (CAIB). (2003, August). National Aeronautics and Space Administration (NASA) report, Vol. 1. Retrieved November 6, 2003, from http://www.caib.us/news/report/default.html

Davita, S. F. (2003). Uncommon sense. Washington, DC: The George Washington University.

Feldman, S. P. (2000, December). Micro matters—The aesthetics of power in NASA’s flight readiness review. The Journal of Applied Behavioral Science, 36(4), 474.

Garber, S. (2003, February 21). Sputnik and the dawn of the space age. NASA History Web. Retrieved November 6, 2003, from http://www.hq.nasa.gov/office/pao/History/sputnik/

Goleman, D. (1985). Vital lies, simple truths: The psychology of self-deception. New York: Simon and Schuster.

Harvey, J. B. (1974, Summer). The Abilene paradox: The management of agreement. Organizational Dynamics, 3(1).

Janis, I. (1971, November). Groupthink. Psychology Today, 5(6), 43-46, 74-76.


Janis, I. (1986, September). Groupthink: Psychological studies of policy decisions. New York: Houghton Mifflin.

Kennedy, J. (1961, May 25). Special message to Congress on urgent national needs. Retrieved November 6, 2003, from http://www.jfklibrary.org/j052561.htm

Launius, R., & Fries, C. (2003, August 11). Chronology of defining events in NASA history. Retrieved November 6, 2003, from http://www.hq.nasa.gov/office/pao/History/Definingchron.htm

Meredith, J. R., & Mantel, S. J. (2000). Project management: A managerial approach. New York: John Wiley & Sons.

NASA. (2003, February 25). NASA Aerospace Safety Advisory Panel report. Retrieved November 6, 2003, from http://www.hq.nasa.gov/office/codeq/asap/charter.htm

Project Management Institute. (2000). A guide to the project management body of knowledge (PMBOK® guide). Retrieved November 6, 2003, from http://www.pmi.org/prod/groups/public/documents/info/pp_pmbokguide2000excerpts.pdf

Rioch, M. J. (1970, February). The work of Wilfred Bion on groups. Psychiatry: Journal for the Study of Interpersonal Processes, 33, 7.

Rogers Commission. (1986, February 3). Report of the Presidential Commission on the Space Shuttle Challenger accident. Retrieved November 6, 2003, from http://science.ksc.nasa.gov/shuttle/missions/51-l/docs/rogers-commission/table-of-contents.html

BOB DIMITROFF received a Bachelor of Science degree in Mining Engineering in 1980 from the University of Pittsburgh, and a Master of Business Administration in 1985 from California State University, Bakersfield. He is currently working towards a Master of Science in Project Management from The George Washington University, with an expected graduation date in 2006. Bob obtained his Professional Engineer’s license in Mechanical Engineering from the State of California in 1993. He also earned Project Management Professional (PMP®) certification from the Project Management Institute in 1998. He has worked 24 years in the oil and gas business for ChevronTexaco, and is currently working as a project professional in their liquefied natural gas business.

LU ANN SCHMIDT received a Bachelor of Science degree in Chemical Engineering in 1976 from the University of Texas at Austin, a Master of Business Administration in 1981 from the University of Houston, and will complete a Master of Science in Project Management from The George Washington University in 2005. Lu Ann earned PMP® certification in 2001. She has been employed at Exxon Mobil Global Services Company for 28 years, and is currently an Information Systems project manager.

TIM BOND received his degree in Manufacturing Engineering in 1981. In 2001, he received his Master of Business Administration from City University. He is currently working towards a Master of Science in Project Management from The George Washington University, with an expected graduation date in 2006. Tim obtained his Certification in Manufacturing Engineering in 1987, and Certified Engineering Manager accreditation in 2003. He also completed accreditation as a Stanford Certified Project Manager in 2004. Tim has worked 25 years in the transportation industry for a variety of major manufacturers, and is presently working as a Large Scale System Integrator for his current employer.
