It is commonly believed that poor interface design merely causes user confusion and inconvenience. In fact, bad design can be life-threatening in serious cases.
Unclear expression results in loss of crew
On August 21, 2017, a U.S. Navy warship collided with a Liberian-flagged tanker, killing 10 crew members and injuring 48 others. The immediate cause of the accident was improper operation by the crew; a major contributor to that improper operation, however, was the user interface on the warship's bridge.
Before the collision, the captain found the auto-pilot function of the warship's navigation system too difficult to use and switched to manual mode. He did not know that this switch placed the system in an emergency mode that eliminated the steering change-over protection: in this state, any crew member at another station could take over steering of the warship. As the collision became imminent, members of the stern and bridge stations each attempted to seize directional control from the other. Control was repeatedly switched back and forth between the bridge and stern stations, and the warship eventually veered off course and collided with the oil tanker.
In regular mode, directional control can only be transferred between two stations when both parties press a confirmation button. Manual mode, however, carries a hidden condition that defied the crew's expectations: another station can take over directional control without warning. Had they understood this, they would not have switched to manual mode so readily.
Introducing the concept of a "mode" into interface design easily causes confusion. What, for example, does "manual succession mode" mean? What are the conditions and restrictions on using it? Designers may find "mode" a convenient label that spares them longer textual descriptions, but users may not understand what the mode means at all.
Using options instead is a better approach. The redesign in the image expresses the specific condition as an option displayed directly on the interface, for example "Allow any station to control steering", and replaces the drop-down with a switch. When the switch is green, the condition in the option is active.
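The contrast can be sketched in code. In the hypothetical TypeScript below (all names are illustrative, not taken from any real ship system), an opaque mode enum is expanded into explicit, individually labeled options, so the dangerous hidden condition becomes a visible switch rather than a silent side effect:

```typescript
// Hypothetical sketch. Mode and option names are invented for illustration.

// Opaque "mode" design: the label hides what actually changes.
type SteeringMode = "auto" | "manual" | "backup-manual";

// Explicit-option design: each consequence is a visible, labeled switch.
interface SteeringOptions {
  autoPilot: boolean;
  // Surfaced directly instead of being a hidden side effect of a mode:
  allowAnyStationToControlSteering: boolean;
  requireConfirmationOnTransfer: boolean;
}

// Expanding each mode into its underlying options makes hidden
// conditions (like unguarded control transfer) visible to the operator.
function expandMode(mode: SteeringMode): SteeringOptions {
  switch (mode) {
    case "auto":
      return {
        autoPilot: true,
        allowAnyStationToControlSteering: false,
        requireConfirmationOnTransfer: true,
      };
    case "manual":
      return {
        autoPilot: false,
        allowAnyStationToControlSteering: false,
        requireConfirmationOnTransfer: true,
      };
    case "backup-manual":
      // The dangerous hidden condition, now stated explicitly:
      return {
        autoPilot: false,
        allowAnyStationToControlSteering: true,
        requireConfirmationOnTransfer: false,
      };
  }
}
```

An interface built this way would render each boolean as its own clearly worded toggle, rather than asking the operator to memorize what each mode implies.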
Had the interface incorporated these design changes, it would have been more intuitive. The crew might then have been more likely to notice the pitfalls of manual mode, avoiding the improper operation and the serious accident that followed.
Three Mile Island Accident
On March 28, 1979, a partial meltdown at the Three Mile Island nuclear power plant in eastern Pennsylvania triggered the worst nuclear accident in U.S. history. Fortunately, it had no discernible impact on the health of plant workers or the public.
The incident started when a relief valve was not closed properly, but instruments in the control room indicated that it was closed. As a result, plant workers did not realize that cooling water was pouring out of the valve.
According to the U.S. Nuclear Regulatory Commission, "Other instruments available to the reactor operator did not provide sufficient information as coolant flowed from the main system through the valves. There was no instrument available to show how much coolant was covering the core. As a result, the nuclear plant staff believed the core was properly immersed. Without proper coolant flow, the nuclear fuel overheated to the point that the long metal tubes holding the nuclear fuel pellets ruptured and the fuel pellets began to melt."
In this case, the main design problem was a lack of "state visibility". Imagine driving a car that runs out of fuel without indicating it, or finishing an online purchase without being able to see whether the order actually went through: either makes for a bad experience. When the same failure occurs in the operating system of a nuclear power plant, it can cause far greater damage.
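One way to honor state visibility in software is to display the measured state rather than the commanded state, and to flag any disagreement between the two instead of hiding it. A minimal, hypothetical TypeScript sketch (the field and function names are illustrative, not from any real plant control system):

```typescript
// Hypothetical sketch: separates what the operator asked for
// from what a sensor actually reports.

interface Valve {
  commandedClosed: boolean; // what the operator commanded
  sensedClosed: boolean;    // what a position sensor actually measures
}

// A status display should reflect measured state, and loudly flag
// any mismatch between command and reality.
function valveStatus(v: Valve): string {
  if (v.commandedClosed !== v.sensedClosed) {
    return "FAULT: commanded " + (v.commandedClosed ? "closed" : "open") +
           " but sensed " + (v.sensedClosed ? "closed" : "open");
  }
  return v.sensedClosed ? "closed" : "open";
}
```

A display driven only by `commandedClosed`, as in the Three Mile Island control room, would have reported the relief valve as closed while coolant was still escaping through it.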
How to avoid life-threatening design failures?
Poor user experience design can be deadly, with dramatic consequences for our lives and our environment. And because modern technology now fills every aspect of daily life, unsuccessful design can cause ever more serious consequences. So, how can this be avoided?
1. Focus more on users/players
Conduct user research: go out and watch how people actually use the product, and gather more feedback about it. It's a good idea for designers to talk to users in person, asking for their opinions and feedback to better understand how they think.
Insisting on testing also helps. Testing is the only way to create a truly "foolproof" design, and the only way to avoid the dangers that can arise from a poorly designed user interface.
2. Build a macro perspective
If we want to understand how a product will really affect the environment and society, we have to think through questions like these: How will our product change users' habits? What does our product or service ask of its users? How will our product affect society as a whole? By asking ourselves these questions often, we can build a broader perspective and reach a comprehensive understanding of our products.
Certainly, these are not a fixed set of questions, and many more can be added; the answer to each may be subjective. Most importantly, through this simple but solid process we can quickly get to the core of a product, and from that starting point designers will think more constructively.
3. Change attitudes
The painter László Moholy-Nagy once stated that "designing is not a profession, but an attitude. Designing has many connotations... It includes the integration of all the psychological effects that come from technical, social and economic needs, from needs on a physical level, and also from materials, shapes, colors, and spaces."
Truly understanding our users and every aspect of the product, and eliminating dangerous or harmful design, essentially requires us to establish the right attitude. We must first set aside our ingrained beliefs and biases and learn to listen to users and players. We must keep questioning existing work, accept our mistakes, and learn from them. In this way, our products will be less likely to threaten the public and more likely to provide a better experience to those who use them.