
Invisible Bridges: Interface Value

Thursday, June 25, 2020

Author: Guru Madhavan

Being at sea is perilous. Any safety feature on a ship can backfire. Lew had known that well since his Navy days—he spent long deployments in the Western Pacific during the Vietnam conflict. Later, he worked in corporate public relations, building a career in crisis management. Then he found himself in a crisis.

In March 2019 Lew, then in his 70s, and his wife set off for Norway on a 13-day cruise, In Search of the Northern Lights. The Viking Sky was fairly small by cruise ship standards—on this trip it carried 915 passengers and 458 crew members—but that’s how Lew wanted it. “Those big vessels are not a very pleasant experience. You’re herded around like cattle,” he told me.

The Viking Sky, built in Italy in 2017, was a steel ship, 228 meters long, 47,800 tons, and powered by four diesel generators. The itinerary was a routine tour circuit: the ship would leave Bergen, motor up the Norwegian coast to north of the Arctic Circle, then return via Stavanger to England.

Everything went according to plan. Aurora Borealis – check. Tirpitz Museum – check. Sami camp – check. Reindeer sleigh ride – check.

Late in the morning on March 23, the next to last day of the cruise, Lew was relaxing in his cabin. The next several minutes changed his life.

The ship began rolling violently, tilting almost 45 degrees. Waves rose up to 30 feet. Lew was thrown across the room. Then he hurtled head-first into a chair against the far wall. “I heard a kind of sickening crunch in my neck.” He felt woozy. “I don’t know whether I blacked out but the next thing I vividly remember is I kind of rolled over onto my back and I couldn’t move my right arm. It was paralyzed.... I couldn’t feel anything.”

The Viking Sky had sailed into Hustadvika, a region notorious for dangerous storms and hollow breaking seas. Rough waters were expected there. But something else had happened.

During the violent movements, the automated sensors on the generators picked up low levels of lubricant oil. Assuming depletion, the sensors sent alarms and automatically tripped the engines, causing a blackout.[1] Propulsion was lost. The captain couldn’t restart the engines and, with the ship unresponsive to its controls, couldn’t steer it through the dangerous conditions. Tables and pianos skidded. Furniture and people collided. Ceiling panels fell. Icy sea water poured in. Passengers screamed for help. Distress videos were tweeted.

One of Lew’s fellow passengers later recalled: “I grabbed my wife but I couldn’t hold on. She was thrown across the room, and then she got thrown back again….” He added: “I did not have a lot of hope. I knew how cold that water was…. You would not last very long. That was very, very frightening.”[2]

The captain made a mayday call around 2:00 pm. The waves and winds ruled out the option of lifeboats. Fortunately, rescue helicopters arrived promptly. Over the next 20 hours, one person was hoisted up every couple of minutes, until 479 passengers had been evacuated. The next day the conditions calmed. The ship, with the remaining passengers, was towed to the Norwegian port town of Molde.

Lew was already in intensive care.

The Viking Sky was a near disaster. The sensors worked to specification—if the oil level was lower than a threshold, they were supposed to turn off the power to protect the engine. But the oil was actually within the specified levels: the rolling of the ship caused the sensors to be briefly exposed to air, leading to their false reading.

Think of a fuel gauge wrongly “assuming” that a car going uphill is running out of fuel and, without knowing the operating context, automatically shutting down both the driver’s controls and the car. This kind of narrow logic proved harmful to the ship and its passengers.
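
To make the failure mode concrete, here is a minimal sketch in Python (using hypothetical names, thresholds, and readings, not the ship’s actual control logic) that contrasts a context-blind low-oil trip with one that tolerates transient readings:

    # Hypothetical illustration, not the Viking Sky's actual control code.
    OIL_LOW_THRESHOLD = 0.28  # assumed fraction of sump capacity

    def naive_trip(oil_level):
        # Context-blind logic: a single low reading shuts down the engine.
        return oil_level < OIL_LOW_THRESHOLD

    def debounced_trip(readings, window=5):
        # Context-aware logic: require a sustained run of low readings,
        # so one momentary air exposure during heavy rolling cannot
        # black out the ship.
        recent = readings[-window:]
        return len(recent) == window and all(r < OIL_LOW_THRESHOLD for r in recent)

    # Heavy rolling exposes the sensor to air for a single sample:
    samples = [0.62, 0.61, 0.25, 0.60, 0.59]
    print(naive_trip(samples[2]))   # True  -> engines trip, blackout
    print(debounced_trip(samples))  # False -> engines keep running

Even a debounce like this is no substitute for a crew override; the point is only that a single out-of-context reading should not be able to black out a ship.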

Most failures in engineering—and by extension, organizations—share this quality: a local safety function doesn’t “know about” the broader safety strategy; a unit by design protects only itself, or serves only the adjacent component, without appreciating the overall system.

Imagine the catastrophe if the human nervous system worked this way. Our bodies send signals that switch and tune a multitude of responses, such as fear, fever, pain, nausea, or anxiety. These might be called the face value of safety. Sometimes these are overexpressions of defense honed by natural selection. Evolutionary biologist Randolph Nesse likens these effects to how smoke detectors react to changes in the indoor environment—sometimes they go off even when there is no danger.

We accept certain false alarms because they typically don’t cost much, or anything. A parallel could be drawn to the availability of cheap, overreactive automation. “Birds flee from backyard feeders when any shadow passes overhead,” Nesse notes. “Wearing a seatbelt is unnecessary 999 times out of 1000, but sensible people do it.”[3]
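
Nesse’s principle can be written as a one-line decision rule: respond whenever the probability of danger multiplied by the cost of a miss exceeds the cost of a false alarm. A small illustration in Python, with made-up costs:

    # Smoke detector principle (after Nesse [3]): defend whenever
    #   p_danger * cost_of_miss > cost_of_false_alarm
    # The numbers below are illustrative, not taken from the paper.

    def should_respond(p_danger, cost_miss, cost_false_alarm):
        return p_danger * cost_miss > cost_false_alarm

    # Seatbelts: a crash on any given trip is rare (say 1 in 1,000),
    # but its cost dwarfs the few seconds spent buckling up.
    print(should_respond(p_danger=0.001, cost_miss=100_000, cost_false_alarm=1))  # True

When false alarms are nearly free, the optimal trigger fires on the faintest signal, which is exactly what cheap, overreactive automation does at scale.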

Yet the cost of this convenience may in some cases far exceed its benefits. The preponderance of overresponsive automation could make us psychologically more averse to risk—call it the interface value of safety.

The Viking Sky incident illuminates the consequences of excessive faith in automation. In this supposedly “user-centered” design, the “users”—the ship’s crew—had no control. Protecting the engines was a critical requirement, but there was no option to override the erroneous automatic response that resulted in a calamitous shutdown. Much of the harm and distress could have been avoided.

The Viking Sky crew had every reason to expect that their new ship would continue to perform in bad weather. They had no information suggesting that the rolling of the ship would cause a total loss of power.

This and similar occurrences are not one-off events. Automation has produced unintended acceleration and faulty antilock braking in cars. Aircraft pilots have found themselves unable to control maneuvering after takeoff, or to override a software takeover of the flight controls and prevent a sudden dive. Such “automation surprises” are matters of life and death. The interface value of safety also exposes recurring deficiencies in a design and business culture that ties automation to revenue.

Whether the sensors’ actions in these situations were right or wrong, or whether objective safety is a pure illusion, is worth discussing. Again, the sensors and the “fail-safe” they were connected to performed as intended. The fault was in not imagining a circumstance in which humans would know better than machines. As a result, a single point of failure launched a cascade that took down the entire system and even, in some cases, the people who had trusted those systems.

Responsible engineering recognizes that safety exists beyond official statistics and codes. Every engineered system we engage with has the potential for “behavioral surprises,” even in the most quantifiable environments and even in the absence of human action. Safety can and does thrive at its subjective best, akin to excessive belief in hand sanitizers to curb a viral pandemic. The face value of safety may well depend on its interface value.

When Lew yelled for help after being injured, six crew members stabilized his neck, carried him on a stretcher five floors down from his room for medical attention, and later eight floors up to the helipad for rescue. “These guys really worked their tails off to get me there.”

Lew’s C1 and C2 vertebrae were shattered in the collision, and the supporting ligaments were torn. He endured complex surgery: two titanium rods and several screws were inserted to reconnect his skull and neck. His rehab was intense, his recovery slow. “I cannot turn my head at all, either up-down or sideways,” he said. “That’s what I’ll live with for the rest of my life. I feel very lucky…actually. It could have been much, much worse.” If the crew of the Viking Sky hadn’t been well trained, Lew said, “there’s an almost endless list of worse things that could have happened.” The ship could have run aground on the rocks, resulting in deaths.

It requires a high degree of professionalism and an intense amount of training to operate any ship safely in the ocean, particularly in stormy weather. But, at the same time, the captain and the ship’s officers and crew are relying on the technology to do what it’s supposed to do. And in this case, technology as a support system failed, not the crew. Humans were the backup. “I hold no ill will toward the ship, or the captain, or the cruise line,” Lew said. “I do hold a fair amount of ill will toward the people who made that design decision. I don’t know who they are, but I wish they’d done their job better.”

[1]  Accident Investigation Board Norway. 2019. On the Investigation into the Loss of Propulsion and Near Grounding of Viking Sky, 23 March 2019. Lillestrøm, Norway, November 12.

[2]  Abellan Matamoros C. 2019. Euronews, March 24. https://tinyurl.com/yysm89kp.

[3]  Nesse R. 2005. Natural selection and the regulation of defenses: A signal detection analysis of the smoke detector principle. Evolution and Human Behavior 26(1):88–105, p. 98.

About the Author: Guru Madhavan is the Norman R. Augustine Senior Scholar and director of NAE programs.