
In this lecture, we revisited a few things from past lectures in a little more detail and added to them. We learned about biases and how they can affect designs. Disasters like the Challenger and the Titanic show that human contribution is a major factor in causing something to go wrong. With the Titanic, the captain did not believe the ship would sink because he had never been on a ship that sank; he was biased because he decided that since it had never happened to him, it could not happen. The Challenger disaster likewise happened because people believed the launch would be fine despite the weather. Availability bias is another thing that can sometimes be hard to notice: after a disaster we have plenty of information, while during or even before it, the thing that went wrong could easily go unnoticed.

A large part of this lecture was about risk. We took the equation we originally learned for risk and added complexity to it. Complexity turns the equation into something very hard to predict with. Because of its simplicity, the risk equation only considers parts that fail independently. However, most engineered objects do not work independently, which means that if one thing fails, it can cause other things to fail in a chain reaction.
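The gap between the simple risk equation and a system with dependencies can be sketched in code. This is a rough illustration, not the equation from the lecture: it assumes risk is computed as failure probability times consequence, and the component names, probabilities, and costs below are made-up example values. The cascading version adds the cost of every part that transitively depends on the failed part, ignoring the small chance of two independent failures overlapping.

```python
def expected_loss_independent(p, cost):
    # Simple risk model: sum of (failure probability x consequence)
    # for each part, treated as if parts fail in isolation.
    return sum(p[part] * cost[part] for part in p)

def downstream(part, depends_on):
    # Collect everything that (transitively) depends on `part`,
    # since those parts fail too when `part` fails.
    hit, stack = set(), [part]
    while stack:
        x = stack.pop()
        for child, deps in depends_on.items():
            if x in deps and child not in hit:
                hit.add(child)
                stack.append(child)
    return hit

def expected_loss_cascading(p, cost, depends_on):
    # Same probabilities, but a failure now drags its whole
    # dependency chain down with it, inflating the consequence.
    total = 0.0
    for part in p:
        chain = {part} | downstream(part, depends_on)
        total += p[part] * sum(cost[c] for c in chain)
    return total

# Hypothetical example: a cheap seal whose failure takes out
# an expensive booster that depends on it.
p = {"seal": 0.02, "booster": 0.001}
cost = {"seal": 1, "booster": 100}
depends_on = {"booster": ["seal"]}

print(expected_loss_independent(p, cost))          # 0.12
print(expected_loss_cascading(p, cost, depends_on))  # 2.12
```

Even though no individual probability changed, the expected loss jumps by more than an order of magnitude once the chain reaction is accounted for, which is why the independent-parts equation underestimates risk in coupled systems.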
