Good Design is Human

October 9, 2018

“How do we design systems that support people and humanity, as opposed to just getting the job done?” – Irene Au

From airplane crashes to industrial disasters to medical errors: when things go terribly wrong, why do we blame human beings instead of bad design?

 

TRANSCRIPT

 

KHOI: Welcome to Wireframe, from Adobe. A podcast about good design. And specifically about user experience design — how we shape technology to fit into our lives.

I’m Khoi Vinh, principal designer at Adobe.

And each week on this show, I, along with one of our producers, will explore one aspect of good design. This week, I’ve got Amy Standen here in the studio.

KHOI: Hi, Amy…

AMY: Hi, Khoi…

AMY: OK, Khoi, I want to read something to you.

AMY: So this is my wifi password.

KHOI: Let’s hear it…

AMY: Five lowercase b n two….

KHOI: Poetry!

AMY: It’s terrible, right?

AMY: When I give people this password, it’s… I’m kind of embarrassed by it, as if it’s proof of my inability to figure out the tools in my life.

KHOI: No… That’s not right. You should not have to feel bad about using tech in the wrong way. That’s the designer’s job: to make it easy enough for you to use and for you to get all the benefit out of it. It’s actually a hallmark of bad design when users start blaming themselves for not being able to do it properly.

AMY: Should a designer be aware on some level that there are people out there like me who are just too busy and distracted and bored by their wifi software to do this right?

KHOI: Not just on some level, but on every level. I mean, that is the designer’s job to understand all of the circumstances that might impact how a user interacts with a product or an app or a website. They can’t just design for the best possible case where the user is able to give their design all the attention in the world. That’s just not realistic.

AMY: Designers seem to use this word empathy a lot. Is that what we’re talking about here?

KHOI: Yes it’s about the designer putting him or herself in the shoes of the user, and really trying to understand what their life is like, what they’re trying to achieve by using this particular piece of technology.

OK. So this idea that if I can’t use something, it’s the designer’s fault, not mine? That it’s on them to make things that are easy to use, not on me to figure out how to use them.

This idea really struck me. And when I looked into it, I came across a guy named Cliff Kuang.

CLIFF: For the last 15 years I’ve been a journalist covering design and technology. I’m a UX practitioner myself – by UX I mean user experience.

He’s the author of a book called User Friendly. It’s coming out next spring. And in it, he tells a story that he says really helped shape this idea of empathy in design—what is called human-centered design. That story is of the near-disaster at the Three Mile Island nuclear power plant in 1979.

And I think to most of us, the story of Three Mile Island goes like this: Workers at the plant screwed up and brought this country to the brink of a full-scale nuclear meltdown. That’s how the story is usually told.

But Cliff tells the story in a very different way — a way that has everything to do with design. Here’s Cliff.

CLIFF: So this is March 28, 1979. So picture it as, you know, a really peaceful early spring day… It’s early in the morning, still dark outside. And deep in the bowels of the plant, something has started to go wrong. One of the pipes that supplies water to the reactor has become clogged.

CLIFF: The foreman goes down there. They work on this clog for hours and hours and hours. All of a sudden they start hearing this, like, pressure building in the pipe. Pressure pressure pressure.

CLIFF: They report that it sounds like a locomotive coming down a tunnel. A team of workers has been standing on a platform next to the pipe, trying to get rid of the clog. And when they hear this noise – they know – they have to get out of there, immediately.   

They leap off. The pipe explodes sending boiling water everywhere… The workers race to contain the leak. And it seems to work. But the real problems are just getting started.

Where things start getting difficult is what happens way up above there in the control room.

The control room. This is the heart of Three Mile Island — and it’s the heart of this story too. Because according to Cliff Kuang, everything that really goes wrong at Three Mile Island can be attributed to this room.

More specifically, how this control room was designed.

It’s a medium-sized room, filled wall to wall with control panels. There’s an arc of them, something like 90 feet around, reaching from floor to ceiling. There are buttons everywhere, lights everywhere.

On a normal day, when everything is operating as it should, this is fine. But today is not a normal day. The problems that began below, in the pipes, are now mysteriously snowballing. For some reason no one can quite figure out, pressure is building inside the reactor.

And a chorus of lights and alarms in the control room is going off.

It seems like every single one of those lights – there are some 600 of them in that room – is going off at once.

AMY: Okay, Khoi, what is your impression of that control room?

KHOI: It sounds like the set from a disaster movie from the 70s. It sounds extravagantly complicated and difficult to maneuver.

AMY: Can we call this UI? Can you call a control room a user interface?

KHOI: Yes, yes, you definitely can. It’s the interface for the reactor. It’s the place where you harness the technology, where you interface with the technology.

AMY: So is this an example of bad UI?

KHOI: I think it’s fair to say it sounds like bad design, yeah.

So here’s how Cliff Kuang describes it…

CLIFF: 600 lights. And now your task, as somebody sitting in that control room, is to figure out what went wrong and how do I fix it.

This thing is buzzing and this thing is buzzing and this thing means the opposite of what that thing means. It’s overwhelming. There’s an enormous amount of information that’s suddenly being thrown at you; every single thing is basically saying, I’m important.

It’s impossible, in that sense, to get a sense of: what’s going on here? What do I do first? And the pressure keeps rising and rising, and they don’t know why.

And it’s not like these guys are amateurs. They’re highly trained pros. A lot of them are ex-Navy submarine technicians.

These are smart guys.  They’re cool under pressure. They’re military guys.  Nobody is panicking, right? (PAUSE) At least not initially….

AMY: So, here’s what they do. They start pulling out instruction manuals.

Which, Khoi – on the eve of a nuclear meltdown – is not a good sign.

KHOI: Yeah. I mean, look, if you need a manual to operate a piece of tech, especially software, that means that the design was actually not that great.

You’ve got a whole other level of design going on with that manual – and then there’s a new question of how well that manual was designed.

AMY: Right, because now we’re not just worrying about the design of the control room, we’re worrying about the design of the manual.

KHOI: Maybe the manual is oriented around the way the engineers thought about the system but not the way the operators thought about it? And so the operators, they may need to consult two or three different sections of the manual. Sometimes going back and forth… I mean it sounds like a headache.

AMY: Yeah.

Hours are going by… because instead of being resolved, the problems at Three Mile Island have only escalated.

Inside the core, water levels have become dangerously low. There’s not enough water to keep the nuclear rods cool, so temperatures inside the core are rising.

And meanwhile, the media has gotten wind of the situation. News reports are warning of a possible nuclear meltdown. The town is evacuating.

And inside the control room, an ever-growing team of engineers….. is trying to figure out what the problem is.

CLIFF: It’s hot by now. It’s sweaty. There’s coffee cups strewn everywhere. There’s instruction manuals open. They all are very aware of how close this thing is and how little they have done to solve the problem.

Eventually what happens is somebody who is familiar with the workings of the tank says, like, no, we’ve got to go back. We missed a step.

AMY: We missed a step.

Unbeknownst to the people in the control room, a valve – a really important valve – had become stuck in the open position.

Now, water in the form of steam is escaping through the valve, exposing the nuclear rods in the core, and causing them to overheat. And of course, the engineers had checked on this valve earlier. But a light was on, indicating, they thought, that the valve was closed.

CLIFF: What the engineers did not realize when they were creating this thing was that the light only indicates whether the little bit of circuitry that powers it says the valve is closed – not whether the valve actually is closed.

So this guy comes and he flips the valve so that now it is actually closed.

That makes all the difference in this case.

Temperatures start to cool.

Immediately, disaster averted.

Disaster averted. But the reactor was destroyed and shut down permanently.

The cleanup would take 14 years and cost around a billion dollars.

And even though no one died, and the damage was contained, the crisis at Three Mile Island left a lasting mark on the American psyche. For decades afterwards there would be huge opposition to building any new nuclear power plants.

And everyone wanted someone to blame. In the early days, Cliff says, Americans pointed the finger in two directions. One: the reactor itself. Two: the people hired to operate it.

CLIFF: So there are two types of stories. One is the reactor broke.

AMY: And the other…

CLIFF: The men inside messed up.

AMY: Reactor, guys running it. Those were the two theories about what went wrong at Three Mile Island.

CLIFF: And those twin poles are just not actually reflective of what happened. Because if you were in that control room and you had experienced what those men had experienced? I challenge you to have done anything different in the context and setting they were in.

These guys did the best they could. But between the incredibly confusing control room, and this faulty light on that one crucial valve, Cliff says it’s the design of the plant that let them down.

To designers, coming this close to a nuclear meltdown had made one thing very clear.

Even when human beings go wrong. Even when they don’t use the machine right. Even when how they use the machine and how they understand that machine goes wrong, it’s not their fault.

It’s not their fault.

This is the central tenet of human-centered design. Technology cannot just be high-tech. It has to be designed in such a way that humans can use it – even when we’re stressed out or tired or just liable to make mistakes.

It’s actually making sure that the interfaces, the machines around us, are accepting of human foibles and human limitations. They basically have to present a world comprehensible to us. They have to present a world to us that we can understand, and in many cases understand under extremely trying circumstances.

AMY: Khoi, I feel like we’re back at this idea of empathy.

KHOI: Yeah, I mean, Three Mile Island is an extreme example, but empathy – really being able to put yourself in the shoes of the user and understand their whole state of mind – applies to anything you’re designing for. Virtually anything.

Actually, there’s a little-known human-centered design story here. One of the guys they brought in to analyze what happened at Three Mile Island was a cognitive psychologist named Don Norman.

Don became hugely influential in human-centered design and went on to write a book called “The Design of Everyday Things.” He took his lessons from Three Mile Island with him to the consumer tech world when he went to work at Apple in the early ’90s.

So, in many ways those lessons from TMI are still with us.

AMY: Right, I mean my laptop today is pretty much everything that the control room at TMI was not. I mean it’s really streamlined, it only gives me info I need when I need it. And there’s this one function that I use all the time that I kinda wanted to ask you about Khoi: the undo function.

KHOI: Yeah, undo is a core principle of making software easy to use. It reassures people that they can explore and experiment and figure something out without fear of being penalized for it.

AMY: I remember the computers we all had before Macs were prevalent. And one thing I remember happening is that you would screw something up, and there was no undo function – you kind of just had to reboot the whole thing. And it felt bad. It wasn’t just impractical; on an emotional level, it didn’t feel good. So, I wonder about this emotional experience of using technology. Is that something designers really consciously think about?

KHOI: 100%. As a designer you want the user to be delighted with whatever you design. That’s the goal. And the way you get that is to think about their emotional state, their emotional well-being, even. You try to get them to a state where they’re having fun, where they feel like maybe this technology is giving them superpowers, or really augmenting their abilities.

This is one of the lasting effects of that TMI episode. It’s the idea that design should make tech comfortable for people. It should put people at ease. So you have all these little cues that make people feel like they can explore the tech, they don’t have to worry about being caught in a trap or inadvertently triggering a nuclear meltdown.

AMY: But while human-centered design has permeated the world of consumer technology, not every industry has been as quick to adopt it.

KHOI: And that has had some disastrous consequences. After the break.


KHOI: Welcome back to Wireframe. I’m Khoi Vinh and I’m here in the studio with my producer Amy Standen.

AMY: And we are going to pick up the story in Hawaii.

In January 2018, an employee at the Hawaii Emergency Management Agency mistakenly issued a statewide alert that went straight to residents’ cell phones.

“Ballistic Missile Threat inbound to Hawaii,” it read. “Seek immediate shelter. This is not a drill.”

Hawaiians were already on edge. Weeks earlier, North Korea had conducted an intercontinental ballistic missile test. Experts were saying that Hawaii was within range of a North Korean missile.

But in this case, there was no missile.

This was supposed to be a safety drill. But the guy at the controls clicked on the wrong link.

And you kind of can’t blame him… when you see the screen that he was looking at. Here’s author Cliff Kuang again.

CLIFF: In the first few days after Hawaii, people screenshotted the interface that allowed someone to even send out that message. And what was confusing was… it was hard to tell how important this thing was versus that thing…

AMY: Khoi, I’ve seen this interface. And to me it basically looks like a Word doc with a list of hyperlinked commands. You’ve seen it, right?

KHOI: Yeah, I’ve seen it. What struck me is it looks like Craigslist. It’s a set of blue links in sans serif type, and each link is identical to all the other links. It’s really difficult to tell one from the other, or which one you’re supposed to click on for any given reason.

Exactly.

Except for a few words, the file names for drill versus alert are almost identical. And there wasn’t an effective confirmation step, you know? A message that might have asked:

ARE YOU SURE YOU WANT TO TELL MILLIONS OF PEOPLE THAT A NUCLEAR MISSILE IS COMING?

After the operator sent the wrong message, it took 38 minutes for a correction to go out telling people that this had been a false alarm.

KHOI: 38 minutes. I mean, that’s a lifetime. If you can imagine waiting 38 minutes for anything on your phone or laptop to revert or undo, it’s basically a broken system.

AMY: Yeah people completely panicked. They were running to take shelter in concrete bunkers and calling their family to tell them they loved them. It was a mess. Hawaii’s governor said he would have tweeted earlier but… he couldn’t remember his Twitter password.

So, bad design was everywhere. But – as with Three Mile Island – many people blamed a person instead. Here’s Cliff Kuang again…

CLIFF: It came out that there was a guy who seemed a bit off, who maybe didn’t understand that they were actually doing a drill during this whole time, so he wasn’t supposed to send that message.

This operator at the Hawaii Emergency Management Agency, this user, was interviewed on CNN. He had the network black his face out and distort his voice, so that he could stay anonymous.

CLIFF: Once we found this person to blame – once we could blame someone – nobody talked about the interface anymore, right? And so, this dynamic of blaming users and saying, “Oh, it’s just that guy’s fault. We don’t have to fix this thing,” goes on today. It actually happens all the time.

KHOI: (LAUGHS) That’s the same story, right? That’s what we heard earlier at Three Mile Island. Blame the user!

AMY: Exactly, which is kind of disheartening, right? I mean, there are 40 years between these two events, and I get the sense that some of the same design mistakes are still happening today.

KHOI: I mean, on the one hand, technology that consumers interact with has gotten so much better. We don’t have manuals anymore. Everything is much, much more intuitive. And so, design has really come a long way in that respect. But there’s this whole other sector of technology – the stuff that you and I don’t get to interact with every day. The stuff that runs power plants or emergency centers. That stuff is still pretty rough.

AMY: And that’s exactly what Cliff thinks.

CLIFF: The stuff that really needs to be redesigned is not apps and things like that. It’s the gnarly ugly enterprise software out there that sort of has escaped real scrutiny right? So, let’s say some piece of software for managing a trucking fleet.

Too many of these kinds of tools still aren’t streamlined and friendly and easy to use the way our smartphones and laptops are. They have not yet, in other words, received the benefit of human-centered design.

CLIFF: All these things actually they exist in a “ghetto of inattention.” The greatest opportunities are within those things, this like gnarly machinery of everyday commerce that really powers the world we live in, but that has escaped notice.

IRENE: You know… We could talk about airplane crashes, we could talk about presidential elections, we could talk about medical human factors. These kinds of mistakes happen all the time, in any kind of context where lives are at stake.

AMY: This is Irene Au. She’s a design partner at Khosla Ventures, a venture capital firm in Menlo Park. And before that, she led design teams at Yahoo and Google.

Talking to Irene, you start to feel like you’re hearing a parallel narrative. Almost like a secret design history, where anytime there’s a story in the news about a human mistake, it is very often a design mistake.

One tragic example, Irene says, is a commercial airplane called the Airbus A320.

IRENE: This is like, a notoriously bad design because it has what’s known as a modal interface.

A modal interface is a screen that displays different information, depending on what setting you’ve chosen for it. And Khoi, I’m guessing you’re about as much of a fan of modal interfaces as Irene is?

KHOI: No designer is a fan of modal interfaces. They’re often a way to save on engineering or technology resources, so instead of having two screens you can have just one and rely on the user to flip back and forth between them.

Right, which is exactly how the modal interface on the Airbus A320 worked. One mode showed the descent speed; the other showed the flight path angle. Which made all the difference in 1992, when a pilot, flying through the Vosges mountains in France, misread the modal control.

IRENE: He intended to set the airplane to coast through the mountains on a gentle descent, and instead it sent the plane plummeting down into the mountains, where it crashed.

87 people died. They called it pilot error.

No matter how intuitive our laptops and smartphones get, bad design, she says, is still everywhere.

It’s there when a nurse injects a patient with the wrong drug, which happens to come in a vial that’s nearly identical to the right one. Or on butterfly ballots in the 2000 presidential election, when maybe thousands of voters punched the hole for Pat Buchanan instead of Al Gore.  

All of these cases demonstrate a failure to think through how the technology would fare in the real world, in front of real, mistake-prone humans.

IRENE: Anytime you have people interacting with machines or systems, you need human-centered design. You need to study the context in which people are using the systems or machines, and then design appropriately for the context of use.

Context of use — as in a hot, sweaty control room on the brink of a nuclear meltdown. Or humans who are stressed out, flustered, distracted, or in any other compromised emotional state in which we’re liable to screw things up.

IRENE: Mistakes will be made, and so there needs to be a way for people to recover from mistakes. You have to accept and acknowledge that no matter how great and perceptive the operator is, a mistake will happen. So how can you help them understand what’s going on and recover from that easily?

Sometimes that just means helping people slow down a bit.

IRENE: There are interface techniques to slow people down. A confirmation dialog, or maybe you show a captcha – where you’re presented with an image and you write down the letters and numbers you see in it…

We humans don’t always do what’s best for us. And often, it’s not just about pressing the wrong button. We text and drive. We let the glow of our screens keep us up at night when we should be sleeping. To Irene, that’s why human-centered design must keep evolving. Even on consumer devices like phones and tablets.

IRENE: The future of human-centered design really is about ethics, it’s about the impact on society. It’s about what we think is appropriate and right and just. How do we design our system so that it supports people and humanity, as opposed to just getting the job done?

KHOI: Yeah, I think she’s exactly right here; this is exactly where design is going. Traditionally, human-centered design has been about how we can make these products more acceptable to humans so that they’ll use them. Ultimately what you wanna do as a designer is just make things better… As design becomes more and more critical to making technology succeed, it’s going to be about: are we making things better in the biggest sense of the word?

 The power of good design. That’s what we’ll be exploring in this season of Wireframe. What is good design? And how can it be a force for good in the world?

We’ll go from Three Mile Island to 311, augmented reality to emoji… and we’ll ask the question: Can good design actually be bad for you? That’s all coming up on… Wireframe.

CREDITS

This episode was produced by: Amy Standen, Isabella Kulkarni, Rikki Novetsky, and Abbie Ruzicka.

Devon Taylor is our editor.

Catherine Anderson is our engineer. Keegan Sanford created our show art.

Learn more about the show at adobe dot ly slash wireframe.

You can subscribe to Wireframe on Apple Podcasts, Google Play, Spotify, or wherever you get your podcasts. And leave us a review! We’d love to hear what you think.

I’m Khoi Vinh. Thanks for listening.