Engineering For Success By Building on Failure
Talk of the Nation [13 min 8 sec]
 
April 13, 2012
In a new book, To Forgive Design: Understanding Failure, engineer Henry Petroski chronicles disasters from the sinking of the Titanic to the destruction of space shuttles Challenger and Columbia. Petroski discusses why these accidents are often caused by factors other than a design flaw.

Copyright © 2012 National Public Radio®. For personal, noncommercial use only. See Terms of Use. For other uses, prior permission required.

IRA FLATOW, HOST:

Hey, buddy, you want to buy a bridge? It wasn't built to spec, but I can give you a good price. Of course, we're talking about the Brooklyn Bridge, which, as you'll learn if you read Henry Petroski's new book, was built with inferior parts that did not quite live up to the standards the engineer had called for. Yet there it stands today, because the designer was wise enough to know that stuff happens.

My next guest says that it's important to look at structural failures, whether we are talking about the sinking of the Titanic, a space shuttle disaster, a smartphone malfunction - look at them in a larger context, as a system that includes people who both maintain and use the structure.

Dr. Henry Petroski is the author of the new book "To Forgive Design: Understanding Failure." He's professor of civil engineering and history at Duke University. He joins us from Durham, North Carolina. Welcome back to SCIENCE FRIDAY, Henry. It's always good to have you.

HENRY PETROSKI: Thank you, Ira, it's always good to be here.

FLATOW: You must be getting a lot of questions about the 100th anniversary of the sinking of the Titanic, about whose fault it really was.

PETROSKI: Well, that's very difficult to pin down to one or two people. This is a system. This is a big ship, a big piece of machinery going out into waters that are dangerous, with a lot of people onboard, with insufficient lifeboats. There are so many dimensions to the Titanic story, and I think that's one of the reasons we keep hearing new things about it. And we sometimes change our minds about what we think. One thing seems to me to be sure, and that is that the ship was marketed as unsinkable.

And as we know, that was simply not true at all. The chances of hitting an iceberg were slim. Let's just say, for the sake of argument, that the chance of hitting an iceberg was one in a million. And everybody may have known that, at least implicitly. But that doesn't tell you when an iceberg is going to be hit. It could be hit on the first of those million sailings or the last. Things like probability are funny. They don't give us very precise ideas about what's going to happen or when it's going to happen.
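
That point can be made concrete with a little arithmetic. Below is a rough sketch in Python using Petroski's hypothetical one-in-a-million figure (an illustration, not a real estimate of iceberg risk): a fixed per-voyage probability says nothing about which voyage the strike will come on, even though the cumulative risk over many voyages keeps growing.

```python
# A sketch of Petroski's point, using his hypothetical one-in-a-million figure.
p = 1e-6  # assumed chance of striking an iceberg on any single voyage

# The risk on the next voyage is the same whether the ship has sailed safely
# once or a million times -- a safe record predicts nothing about "when":
print(p)  # risk on voyage 1
print(p)  # risk on voyage 1,000,001, even after a million safe crossings

# What probability *does* tell us is the cumulative exposure over many voyages:
for n in (1_000, 100_000, 1_000_000):
    print(f"P(at least one strike in {n:,} voyages) = {1 - (1 - p) ** n:.4f}")
    # -> 0.0010, 0.0952, 0.6321
```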

In the case of the Titanic, the fact is that there was some overconfidence, some hubris, involved on the part of the captain, who had the ship trying to break a speed record while he was going through waters that were dangerous - known to be dangerous. He had been warned about icebergs. So it was a concatenation of all these things that came together - some chance, some deliberate.

FLATOW: And, you know, a lot of this seems to be the theme of your book...

PETROSKI: Yes.

FLATOW: ...that there are a lot of things going on here, you know...

PETROSKI: Yes.

FLATOW: ...as the title says "To Forgive Design: Understanding Failure." What other great...

PETROSKI: Well, the...

FLATOW: ...well, give me some examples of other great failures where we have to understand both the design and the failure.

PETROSKI: Well, you were talking with the people up in the space station. I talked about NASA failures with the space shuttle, and these are familiar. These are examples that are not unlike the Titanic, actually. The Challenger was not an accident that went unforeseen. The engineers warned the managers that it was a little too cold to launch that ship on that day with complete confidence that it would return. And they were proven right. The seals had been leaking. The engineers knew that, and they expected that they would leak on that day, too.

The Columbia, which came back in 2003 and disintegrated upon re-entry - there were also warnings about that: questions of foam flying off the external tank and hitting the shuttle as it launched from Earth. The engineers, again, said, well, you know, some of that debris has hit the shuttle's wing, and we really should investigate to see whether it's been badly damaged and whether we'll have to repair it.

But, again, basically, management overruled the engineers out of overconfidence. The difference between the perspectives of managers and engineers with regard to safety and failure is very, very interesting. Before the shuttle missions took off, the engineers were asked what they thought the likelihood was of a failure of the kind that we now know happened. The engineers said, oh, about one in 100. The managers, on the other hand, predicted one in 100,000. That's quite a difference. And we know that the engineers were proven right.
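
A quick back-of-the-envelope check shows how far apart those two estimates really are. The sketch below uses the shuttle program's actual total of 135 missions flown, two of which were lost; the probabilities are the estimates quoted above.

```python
# Compare the engineers' and managers' risk estimates quoted above,
# over the 135 missions the shuttle program actually flew (2 were lost).
flights = 135

for label, p in [("engineers", 1 / 100), ("managers ", 1 / 100_000)]:
    expected = flights * p             # expected number of losses
    p_any = 1 - (1 - p) ** flights     # chance of losing at least one shuttle
    print(f"{label}: expect {expected:.3f} losses, "
          f"P(at least one loss) = {p_any:.2%}")

# engineers: expect 1.350 losses, P(at least one loss) = 74.25%
# managers : expect 0.001 losses, P(at least one loss) = 0.13%
# Two orbiters were in fact lost -- consistent with the engineers' number.
```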

FLATOW: This is SCIENCE FRIDAY from NPR. I'm Ira Flatow, talking with Henry Petroski, author of "To Forgive Design: Understanding Failure." Tell us the Brooklyn Bridge story that I just happened to mention. I thought it was also fascinating - about Roebling, who built a safety factor into the bridge, right?

PETROSKI: That's right. What responsible engineering does is specify the quality of the materials that go into a structure like the Brooklyn Bridge. Well, in the case of that bridge, the Roeblings owned their own wire-making factory, and they would have liked to provide the wire for the bridge's cables, because then they would have had a high level of confidence that it was high quality. But on the basis of a business decision, the board of directors said, no, no, you can't use your own wire.

You're the engineer; it's a conflict of interest. So the contract for the wire went to someone else, whom Roebling warned was not a good producer of wire. Well, everything seemed to be going fine until one day, after many deliveries of these reels of wire, it was discovered that some bad wire seemed to be getting into the bridge's cables. And how was that happening, when every shipment of wire was tested before it was passed on to be put into the bridge?

Well, it turned out that the wire supplier not only had bad workmanship but also bad morals. The rejected wire was snuck into the construction site, and it found its way into the bridge. Well, Roebling - this was Washington Roebling - faced a crucial decision at this point. What would he do? Would he take all the bad wire out? That would not only cost time and money, it would also be very dangerous for the workers.

What he decided to do was estimate, to the best of his knowledge, how much bad wire was actually in the bridge already. Then he added additional high-quality wire beyond what was originally designed to be in the bridge, and he completed the project that way. To this day, that bad wire is in the bridge.
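
Roebling's fix is, at bottom, a safety-factor calculation. Here is a minimal sketch of the reasoning in Python; the numbers are illustrative (a safety factor of six is often cited for the bridge's cables, but the bad-wire fraction here is purely hypothetical, since the transcript gives no figures).

```python
# Sketch of Roebling's reasoning with illustrative numbers. The safety factor
# of six is the figure often cited for the Brooklyn Bridge's cables; the
# bad-wire fraction is a made-up number -- the transcript gives no figures.
load = 1.0               # working load the cables must carry (normalized)
safety_factor = 6.0      # design strength = 6x the working load
bad_fraction = 0.10      # hypothetical estimate of bad wire already in place

design_strength = load * safety_factor

# Treat the estimated bad wire as if it contributes no strength at all,
# then add enough extra good wire to restore the full design strength:
surviving_strength = design_strength * (1 - bad_fraction)
extra_wire_strength = design_strength - surviving_strength
print(f"extra good wire needed: {extra_wire_strength:.2f} "
      f"(i.e., {bad_fraction:.0%} more wire than originally designed)")

# Even before the fix, cables with 10% dead wire would still carry
# several times the working load -- which is why the bridge stands today:
print(f"margin with the bad wire left in: {surviving_strength / load:.1f}x")
```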

FLATOW: So if you're going to buy it, beware.

PETROSKI: Get a discount.

(SOUNDBITE OF LAUGHTER)

FLATOW: We're talking with Henry Petroski, author of "To Forgive Design: Understanding Failure." And I've got a minute before the break. Henry, tell us why, after two decades of book writing, you wrote a sequel to "To Engineer Is Human." This is "To Forgive Design."

PETROSKI: Well, in the first book I concentrated on the mechanical reasons why things fail. In this book, I wanted to concentrate on larger questions of systems. Everything is embedded in a system, and that system includes people - some of whom are not the best of actors. So I wanted to tell that story. The knee-jerk reaction when a plane has an accident or something happens is: oh, it was a bad design. That's not always true. In fact, in more cases than not, it turns out not to be true. It turns out that the operators or the inspectors or some person involved wasn't...

FLATOW: All right.

PETROSKI: ...holding up his end of the deal.

FLATOW: We'll talk about more stuff from his book "To Forgive Design: Understanding Failure" with Henry Petroski. Our number, 1-800-989-8255. You can tweet us, @scifri. Stay with us. We'll be right back. I'm Ira Flatow. This is SCIENCE FRIDAY from NPR.

(SOUNDBITE OF MUSIC)

FLATOW: You're listening to SCIENCE FRIDAY. I'm Ira Flatow, talking to the venerable Henry Petroski, author of "To Forgive Design: Understanding Failure." That's his new book. And, Henry, you know, we've talked about the Titanic a bit, but you have a really interesting take on an aspect no one talks about when we discuss the sinking. And that is: what would have happened if the Titanic had not sunk?

PETROSKI: Yes. That's a very interesting thought experiment, I think. If the Titanic had not sunk - if, in fact, it had reached New York and then gone back and forth across the Atlantic many times - the likely result, in my opinion, is that competing steamship companies would have wanted to better the Titanic. They would have wanted to build larger ships, faster ships. They would have wanted to build them more economically, to make more profit. They would probably have used thinner and thinner steel over time. They might have put in fewer rivets. They might have wanted to get rid of lifeboats altogether, because, after all, the Titanic was unsinkable. We're following the design of the Titanic, only we're making it bigger and better.

Eventually, chances are, one of those ships would have struck an iceberg or had some other kind of incident in the ocean. And since it had all the inherent flaws of the Titanic, it would have sunk - and, because it was bigger, probably with a greater loss of life.

This is what happens with cycles of success and failure. When we have a success, a prolonged period of success, we tend to become more complacent. We tend to become overconfident that we're doing it right, that we've got it figured out finally. And then, of course, a failure occurs and wakes us up out of our dream. The failure, the wakeup call then causes us to look more closely at what we've been doing, and we discover that in fact we haven't been building perfect machines or systems. We've been building them with inherent flaws.

FLATOW: Is there one system - bridge, tunnel, anything - that's waiting to fail, that you can warn us about, that's overdue?

PETROSKI: I think the history of bridges is very interesting. Over the past century and a half or so, there has been a major bridge failure about every 30 years. So right now we're looking ahead to about the year 2025 or 2030 - not much more than a decade from now. If things proceed as they have in the past, we can expect some kind of big surprise. It will be a bridge type that hasn't failed before. It will be something that seemingly comes out of the blue. But then in retrospect, looking at it and fitting it into the pattern, we will say, oh, we should have seen that coming.

FLATOW: So it will be a combination of human error and design error?

PETROSKI: Yes. Generally, that's right. You can almost say that a design error is a human error because after all it's we humans who do the designing.

FLATOW: Yeah, I recall - I covered Three Mile Island, that nuclear accident back in 1979. And the investigation showed such a combination of design and human errors there that...

PETROSKI: Yes. That's fairly typical. Most systems - most machines and structures - are designed to be somewhat robust, so that if some little thing goes wrong, the whole thing doesn't fall apart all of a sudden or blow up or anything like that. But then when humans react to this small irregularity, they sometimes make it much worse.

FLATOW: Henry, I want to thank you very much for taking time to be with us today. A fascinating book. It's "To Forgive Design: Understanding Failure." It talks about all kinds of engineering designs and famous failures, and Henry Petroski's unique way of looking at them and explaining them. Thank you, Henry. Good luck with the book.

PETROSKI: Thank you, Ira. Thank you. Bye-bye.

Copyright © 2012 National Public Radio®. All rights reserved. No quotes from the materials contained herein may be used in any media without attribution to National Public Radio. This transcript is provided for personal, noncommercial use only, pursuant to our Terms of Use. Any other use requires NPR's prior permission. Visit our permissions page for further information.

NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.
