Question



ENMA 480: ETHICS AND PHILOSOPHY FOR ENGINEERING APPLICATIONS

Chairman Rogers of the Challenger Accident Investigation asked Bob Lund: “Why did you change your decision [that the seals would not hold up] when you changed hats?” What might motivate you, as a midlevel manager, to go along with top management when told to “take off your engineering hat and put on your management hat”?

Applying the engineering-as-experimentation model, what might responsible experimenters have done in response to the question?

Solutions

Expert Solution

I've given the example below as per Chapter 4 (John Kenneth Reyes):

Let us apply this discussion of engineering as social experimentation to the explosion of the space shuttle Challenger, and by extension the space shuttle Columbia. The Columbia and its sister ships, Challenger, Discovery, and Endeavour, were delta-wing craft with a huge payload bay (Figure 4-1). Early, sleek designs had to be abandoned to satisfy U.S. Air Force requirements when the Air Force was ordered to use the National Aeronautics and Space Administration (NASA) shuttle instead of its own expendable rockets for launching satellites and other missions. As shown in Figure 4–2, each orbiter has three main engines fueled by liquid hydrogen and liquid oxygen; the propellants are carried in an immense, external, divided fuel tank, which is jettisoned when empty. During liftoff the main engines fire for approximately 8.5 minutes, although during the first 2 minutes of the launch much of the thrust is provided by two booster rockets. These are of the solid-fuel type, each burning a one-million-pound load of a mixture of aluminum powder, ammonium perchlorate, and iron oxide.



The casing of each booster rocket is approximately 150 feet long and 12 feet in diameter. It consists of cylindrical segments that are assembled at the launch site. The four field joints use seals composed of pairs of O-rings made of vulcanized rubber. The O-rings work in conjunction with a putty barrier of zinc chromate.



The shuttle flights were successful, although not as frequent as had been hoped. NASA tried hard to portray the shuttle program as an operational system that could pay for itself. But aerospace engineers intimately involved in designing, manufacturing, assembling, testing, and operating the shuttle still regarded it as an experimental undertaking in 1986. These engineers were employees of manufacturers, such as Rockwell International (orbiter and main rocket) and Morton-Thiokol (booster rockets), or they worked for NASA at one of its several centers: Marshall Space Flight Center, Huntsville, Alabama (responsible for the propulsion system); Kennedy Space Center, Cape Kennedy, Florida (launch operations); Johnson Space Center, Houston, Texas (flight control); and the office of the chief engineer, Washington, D.C. (overall responsibility for safety, among other duties).



After embarrassing delays, Challenger’s first flight for 1986 was set for Tuesday morning, January 28. But Allan J. McDonald, who represented Morton-Thiokol at Cape Kennedy, was worried about the freezing temperatures predicted for the night. As his company’s director of the solid-rocket booster project, he knew of difficulties that had been experienced with the field joints on a previous cold-weather launch when the temperature had been mild compared to what was forecast. He therefore arranged a teleconference so that NASA engineers could confer with Morton-Thiokol engineers at their plant in Utah.



Arnold Thompson and Roger Boisjoly, two seal experts at Morton-Thiokol, explained to their own colleagues and managers as well as the NASA representatives how on launch the booster rocket walls bulge, and the combustion gases can blow past one or even both of the O-rings that make up the field joints (see Figure 4–2).15 The rings char and erode, as had been observed on many previous flights. In cold weather the problem is aggravated because the rings and the putty packing are less pliable then. But only limited consideration was given to the past history of O-ring damage in terms of temperature. Consideration of the entire launch temperature history indicates that the probability of O-ring distress is increased to almost a certainty if the temperature of the joint is less than 65°F.16
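To make the temperature claim above concrete, the sketch below shows one standard way such a launch-temperature history can be analyzed: fitting a logistic model of distress probability against joint temperature. The per-flight records in the code are hypothetical placeholders, not the actual flight data, and the fitting routine is a minimal illustration rather than the Commission's own analysis.

```python
import math

# (joint temperature in deg F, 1 = O-ring distress observed, 0 = none)
# HYPOTHETICAL records for illustration only -- not the actual flight history.
flights = [(53, 1), (57, 1), (58, 1), (63, 1), (66, 0), (67, 0), (68, 0),
           (69, 0), (70, 1), (70, 0), (72, 0), (73, 0), (75, 1), (75, 0),
           (76, 0), (78, 0), (79, 0), (81, 0)]

MEAN_TEMP = sum(t for t, _ in flights) / len(flights)  # center temps so the fit is stable

def p_distress(a, b, temp):
    """Logistic model: P(distress | temp) = 1 / (1 + exp(-(a + b*(temp - MEAN_TEMP))))."""
    return 1.0 / (1.0 + math.exp(-(a + b * (temp - MEAN_TEMP))))

# Fit intercept a and slope b by plain gradient ascent on the log-likelihood.
a, b, rate = 0.0, 0.0, 5e-4
for _ in range(100_000):
    a += rate * sum(y - p_distress(a, b, t) for t, y in flights)
    b += rate * sum((y - p_distress(a, b, t)) * (t - MEAN_TEMP) for t, y in flights)

# Probability of distress at a few temperatures of interest (36 F was the launch temperature).
for temp in (36, 53, 65, 75):
    print(f"{temp:>2} F: estimated P(distress) = {p_distress(a, b, temp):.2f}")
```

With any data resembling the actual record, the fitted curve rises sharply as the joint temperature falls below the mid-60s, which is the pattern the paragraph describes.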



The engineering managers, Bob Lund (vice president of engineering) and Joe Kilminster (vice president for booster rockets), agreed that there was a problem with safety. The team from Marshall Space Flight Center was incredulous. Because the specifications called for an operating temperature of the solid fuel prior to combustion of 40°F to 90°F, one could surely allow lower or higher outdoor temperatures, notwithstanding Boisjoly’s testimony and recommendation that no launch should occur at less than 53°F. They were clearly annoyed at facing yet another postponement.



Top executives of Morton-Thiokol were also sitting in on the teleconference. Their concern was the image of the company, which was in the process of negotiating a renewal of the booster rocket contract with NASA. During a recess Senior Vice President Jerry Mason turned to Bob Lund and told him “to take off your engineering hat and put on your management hat.” It was a subsequent vote (of the managers only) that produced the company’s official finding that the seals could not be shown to be unsafe. The engineers’ judgment was not considered sufficiently weighty. At Cape Kennedy, Allan McDonald refused to sign the formal recommendation to launch; Joe Kilminster had to. Accounts of the Challenger disaster tell of the cold Tuesday morning, the high seas that forced the recovery ships to seek coastal shelter, the ice at the launch site, and the concern expressed by Rockwell engineers that the ice might shatter and hit the orbiter or rocket casings.17 The inability of these engineers to prove that the liftoff would be unsafe was taken by NASA as an approval by Rockwell to launch.



The countdown ended at 11:38 AM. The temperature had risen to 36°F. As the rockets carrying Challenger rose from the ground, cameras recorded puffs of smoke that emanated from one of the field joints on the right booster rocket. Soon these turned into a flame that hit the external fuel tank and a strut holding the booster rocket. The hydrogen in the tank caught fire; the booster rocket broke loose, smashed into Challenger’s wing, and then into the external fuel tank. At 76 seconds into the flight, by the time Challenger and its rockets had reached 50,000 feet, it was totally engulfed in a fireball. The crew cabin separated and fell into the ocean, killing all aboard: Mission Commander Francis (Dick) Scobee; Pilot Michael Smith; Mission Specialists Gregory Jarvis, Ronald McNair, Ellison Onizuka, Judith Resnik; and “teacher in space” Christa McAuliffe.



Why was safe operation of the space shuttle not stressed more? First of all, we must remember that the shuttle program was indeed still a truly experimental and research undertaking. Next, it is quite clear that the members of the crews knew that they were embarking on dangerous missions. But it has also been revealed that the Challenger astronauts were not informed of particular problems such as the field joints. They were not asked for their consent to be launched under circumstances that experienced engineers had claimed to be unsafe and without any safe escape mechanism (safe exit) available should things go wrong. The reason for the rather cavalier attitude toward safety is revealed in the way NASA assessed the system’s reliability. For instance, recovered booster rocket casings had indicated that the field-joint seals had been damaged in many of the earlier flights. The waivers necessary to proceed with launches had become mere gestures. Richard Feynman made the following observations as a member of the Presidential Commission on the Space Shuttle Challenger Accident (called the Rogers Commission after its chairman): “I read all of these [NASA flight readiness] reviews and they agonize whether they can go even though they had some blow-by in the seal or they had a cracked blade in the pump of one of the engines . . . and they decide ‘yes.’ Then it flies and nothing happens. Then it is suggested . . . that the risk is no longer so high. For the next flight we can lower our standards a little bit because we got away with it last time . . . It is a kind of Russian roulette.”18



Since the early days of unmanned space flight, approximately 1 in every 25 solid-fuel rocket boosters had failed. Given improvements over the years, Feynman thought that 1 in every 50 to 100 might be a reasonable estimate now. Yet NASA counted on only one crash in every 100,000 launches.
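A rough calculation shows how far apart these figures are. The sketch below assumes, purely for illustration, that the 1-in-25 and 1-in-100 estimates apply per booster and that each mission flies two boosters; NASA's figure is taken per launch, as quoted. None of these modeling choices come from the text itself.

```python
# Per-booster failure estimates quoted above; NASA's figure is already per launch.
booster_estimates = {
    "historical solid-fuel record (~1 in 25)": 1 / 25,
    "Feynman's improved estimate (~1 in 100)": 1 / 100,
}
nasa_per_launch = 1 / 100_000

MISSIONS = 100  # assumed campaign length, just to compare expected losses

for label, p_booster in booster_estimates.items():
    # Either of the two boosters failing dooms the mission.
    p_mission = 1 - (1 - p_booster) ** 2
    print(f"{label}: per-mission risk {p_mission:.3%}, "
          f"expected losses in {MISSIONS} flights ~ {p_mission * MISSIONS:.1f}")

print(f"NASA management figure (1 in 100,000): per-mission risk {nasa_per_launch:.3%}, "
      f"expected losses in {MISSIONS} flights ~ {nasa_per_launch * MISSIONS:.4f}")
```

Under these assumptions the historical record implies several expected losses in a hundred flights, while the management figure implies essentially none, which is the gap Feynman was pointing to.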



Another area of concern was NASA’s unwillingness to wait out risky weather. When serving as weather observer, astronaut John Young was dismayed to find his recommendations to postpone launches disregarded several times. Things had not changed much by March 26, 1987, when NASA ignored its devices monitoring electric storm conditions, launched a Navy communications satellite atop an Atlas-Centaur rocket, and had to destroy the $160 million system when it veered off course after being hit by lightning. The monitors had been installed after a similar event involving an Apollo command module eighteen years before had nearly aborted a trip to the moon.



Veteran astronauts were also dismayed at NASA management’s decision to land at Cape Kennedy as often as possible despite its unfavorable landing conditions, including strong crosswinds and changeable weather. The alternative, Edwards Air Force Base in California, is a better landing place but necessitates a piggyback ride for the shuttle on a Boeing 747 home to Florida. This costs time and money.



In 1982 Albert Flores had conducted a study of safety concerns at the Johnson Space Center. He found its engineers to be strongly committed to safety in all aspects of design. When they were asked how managers might further improve safety awareness, there were few concrete suggestions but many comments on how safety concerns were ignored or negatively affected by management. One engineer was quoted as saying, “A small amount of professional safety effort and upper management support can cause a quantum safety improvement with little expense.”19 This points to the important role of management in building a strong sense of responsibility for safety first and schedules second.



The space shuttle’s field joints are designated criticality 1, which means there is no backup. Therefore a leaky field joint will result in failure of the mission and loss of life. There are 700 items of criticality 1 on the shuttle. A problem with any one of them should have been cause enough to do more than just launch more shuttles without modification while working on a better system. Improved seal designs had already been developed, but the new rockets would not have been ready for some time. In the meantime, the old booster rockets should have been recalled.
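The arithmetic behind this concern can be made explicit with a simple reliability calculation. The sketch below assumes, purely for illustration, that the 700 criticality-1 items fail independently and share the same per-mission reliability; neither assumption is stated in the text.

```python
# Assumed: 700 criticality-1 items, independent failures, identical per-item reliability r.
# The mission avoids a criticality-1 failure only if every item survives: P = r ** 700.
N_ITEMS = 700

for r in (0.999, 0.9999, 0.99999):
    p_no_failure = r ** N_ITEMS
    print(f"per-item reliability {r}: P(no criticality-1 failure) = {p_no_failure:.4f}")
```

Even a per-item reliability of 99.9% leaves roughly even odds of at least one criticality-1 failure over a mission under these assumptions, which is why a known weakness in any such item warranted more than continued launches without modification.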



In several respects the ethical issues in the Challenger case resemble those of other such cases. Concern for safety gave way to institutional posturing. Danger signals did not go beyond Morton-Thiokol and Marshall Space Flight Center in the Challenger case. No effective recall was instituted. There were concerned engineers who spoke out, but ultimately they felt it only proper to submit to management decisions.



One notable aspect of the Challenger case is the late-hour teleconference that Allan McDonald had arranged from the Challenger launch site to get knowledgeable engineers to discuss the seal problem from a technical viewpoint. This tense conference did not involve lengthy discussions of ethics, but it revealed the virtues (or lack thereof) that allow us to distinguish between the “right stuff” and the “wrong stuff.” This is well described by one aerospace engineer as arrogance, specifically, “The arrogance that prompts higher-level decision makers to pretend that factors other than engineering judgment should influence flight safety decisions and, more important, the arrogance that rationalizes overruling the engineering judgment of engineers close to the problem by those whose expertise is naive and superficial by comparison.”20 Included, surely, is the arrogance of those who reversed NASA’s (paraphrased) motto “Don’t fly if it cannot be shown to be safe” to “Fly unless it can be shown not to be safe.”



In a speech to engineering students at the Massachusetts Institute of Technology a year after the Challenger disaster, Roger Boisjoly said: “I have been asked by some if I would testify again if I knew in advance of the potential consequences to me and my career. My answer is always an immediate yes. I couldn’t live with any self-respect if I tailored my actions based on potential personal consequences as a result of my honorable actions.”





Discussion Questions from Chapter IV



1. On June 5, 1976, Idaho’s Teton Dam collapsed, killing eleven people and causing $400 million in damage. The Bureau of Reclamation, which built the ill-fated Teton Dam, allowed it to be filled rapidly, thus failing to provide sufficient time to monitor for the presence of leaks in a project constructed with less-than-ideal soil.7 Drawing on the concept of engineering as social experimentation, discuss the following facts uncovered by the General Accounting Office and reported in the press.



a. Because of the designers’ confidence in the basic design of Teton Dam, it was believed that no significant water seepage would occur. Thus sufficient instrumentation to detect water erosion was not installed.



b. Significant information suggesting the possibility of water seepage was acquired at the dam site six weeks before the collapse. The information was sent through routine channels from the project supervisors to the designers and arrived at the designers the day after the collapse.



c. During the important stage of filling the reservoir, there was no around-the-clock observation of the dam. As a result, the leak was detected only five hours before the collapse. Even then, the main outlet could not be opened to prevent the collapse because a contractor was behind schedule in completing the outlet structure.



d. Ten years earlier the Bureau’s Fontenelle Dam had experienced massive leaks that caused a partial collapse, an experience the bureau could have drawn on.



2. Research the collapse of the Interstate 35W Bridge in Minneapolis on August 1, 2007, which killed 13 people and injured 100 more. In light of the social experimentation model, discuss its causes and whether it could have been prevented.



3. Debates over responsibility for safety in regard to technological products often turn on who should be considered mainly responsible, the consumer (“buyer beware”) or the manufacturer (“seller beware”). How might an emphasis on the idea of informed consent influence thinking about this question?



4. Thought models often influence thinking by effectively organizing and guiding reflection and crystallizing attitudes. Yet they usually have limitations and can be misleading to some degree. With this in mind, critically assess the strengths and weaknesses you see in the social experimentation model. One possible criticism you might consider is whether the model focuses too much on the creation of new products, whereas a great deal of engineering involves the routine application of results from past work and projects. Another point to consider is how informed consent is to be measured in situations where different groups are involved, as in the construction of a garbage incinerator near a community of people having mixed views about the advisability of constructing the incinerator.



5. A common excuse for carrying out a morally questionable project is, “If I don’t do it somebody else will.” This rationale may be tempting for engineers who typically work in situations where someone else might be ready to replace them on a project. Do you view it as a legitimate excuse for engaging in projects that might be unethical? In your answer, comment on the concept of responsible conduct developed in this section.



6. Another commonly used phrase, “I only work here,” implies that one is not personally accountable for the company rules because one does not make them. It also suggests that one wishes to restrict one’s area of responsibility within tight bounds as defined by those rules. In light of the discussion in this section, respond to the potential implications of this phrase and the attitude it represents when exhibited by engineers.



7. Threats to a sense of personal responsibility are not unique to engineers, nor are they more acute for engineers than for others involved with engineering and its results. The reason is that, in general, public accountability also tends to lessen as professional roles become narrowly differentiated. With this in mind, critique each of the remarks made in the following dialogue. Is each remark true, or partially true? What needs to be added to make it accurate?



Engineer: My responsibility is to receive directives and to create products within specifications set by others. The decisions about what products to make, and their general specifications, are economic in nature and are made by management.



Scientist: My responsibility is to gain knowledge. How the knowledge is applied is an economic decision made by management, or else a political decision made by elected representatives in government.



Manager: My responsibility is solely to make profits for stockholders.



Stockholder: I invest my money for the purpose of making a profit. It is up to our boards and managers to make decisions about the directions of technological development.



Consumer: My responsibility is to my family. Government should make sure corporations do not harm me with dangerous products, harmful side effects of technology, or dishonest claims.



Government Regulator: By current reckoning, government has strangled the economy through overregulation of business. Accordingly, at present on my job, especially given decreasing budget allotments, I must back off from the idea that business should be policed, and urge corporations to assume greater public responsibility.



8. Mismatched bumpers: What happens when a passenger car rear-ends a truck or a sports utility vehicle (SUV)? The bumpers usually ride at different heights, so even modest collisions can result in major repair bills. (At high speed, with the front of the car nosed down under braking, occupants of convertibles have been decapitated on impact, left without any protection from the bumpers.) Ought there to be a law?



9. Chairman Rogers asked Bob Lund: “Why did you change your decision [that the seals would not hold up] when you changed hats?” What might motivate you, as a midlevel manager, to go along with top management when told to “take off your engineering hat and put on your management hat”? Applying the engineering-as-experimentation model, what might responsible experimenters have done in response to the question?



10. Under what conditions would you say it is safe to launch a shuttle without an escape mechanism for the crew? Also, discuss the role of the astronauts in shuttle safety. To what extent should they (or at least the orbiter commanders) have involved themselves more actively in looking for safety defects in design or operation?



11. Examine official reports to determine to what extent the Columbia explosion can be ascribed to technical and/or management deficiencies. Was there a failure to learn from earlier events? (Search the Web under “CAIB Report”: the Columbia Accident Investigation Board Report.)

