Is Good Design Good for You?

November 13, 2018

“I think the knee jerk reaction is to say, if the product does harm it’s the product maker’s fault.” Nir Eyal

The “Like” button was a huge triumph for Facebook, or was it? Now more than ever we’re all wondering if our traditional definition of “good” design is actually in our best interests. In this episode, we take a look at the intersection of ethics and design.
KHOI: Welcome to Wireframe from Adobe, a podcast about good user experience design… (and) how we shape technology to fit into our lives.
I’m Khoi Vinh, principal designer at Adobe. And each week on this show, I, along with one of our producers, explore one aspect of good design.
This week, I’m in the studio with …
RIKKI: One second Khoi, I’m just checking Instagram.
KHOI: Uh, okay …
RIKKI: Just uploaded a new photo, so just got a few likes.
KHOI: By all means, by all means …

RIKKI: Okay. Now I just need to check my email …

KHOI: Okay, okay. Rikki, we got a whole episode in front of us here.

RIKKI: Yeah yeah, I know, I know, I’m just checking a Facebook notification.

KHOI: Okay, Rikki … Actually, you know what? This is an excellent demonstration of PRECISELY what we’re talking about today: Addictive UX.

RIKKI: Yeah, sometimes I can’t help myself …

KHOI: Exactly! Addictive UX has become an ENTIRE industry today, with designers and psychologists working together to figure out HOW to keep users engaged for LONGER.

And so, in THIS episode, we’re going to trace addictive UX to one of its origin points, uncover the mechanics of how this type of user experience actually works, and then discuss a framework for ethical design.

RIKKI: And we’re also gonna try to figure out whether there’s any chance companies will acknowledge the negative effects of addictive UX… and actually change the way they design.

But before any of that — let’s go back to an important year for our story: 2007.

RIKKI: We’re at the Facebook headquarters — which at the time — is in sunny Palo Alto, California. And just to refresh your memory about what’s going on in social media at this moment: MySpace is dominating. But Facebook is closing in. And to catch up and get some new ideas flowing, Facebook is hosting all-night HACKATHONS.

JUSTIN: Hackathons start pretty late in the evening and then they go until sunrise and we just sat around a table literally all programming.

RIKKI: That’s Justin Rosenstein. Justin was an engineer at Facebook back in 2007, when Facebook had recently launched NewsFeed.

Before that, you had to go directly to someone’s profile to see any updates. BUT NOW, users had a timeline with all of this easy access to their friends’ lives. And THAT gave Justin and his colleague Leah Perlman an idea.

JUSTIN: What if we could make it really easy to spread positivity through the system? We really believe in a world in which people build each other up rather than tear each other down. And we thought, what if we could make it one-click easy to spread little bits of love, positivity, and affirmation through the system?

RIKKI: This idea that Justin had became an iconic design. The small but mighty LIKE BUTTON. Which, I learned, originally went by… a different name.

JUSTIN: At the time, when we were doing the hackathon, we called it the awesome button. This is awesome! I think Like was a better choice in hindsight.

RIKKI: Justin and his team stayed up all night, coding. And when morning came, they went out for a pancake breakfast, and demo-ed this new idea to the company. But when they showed it to everyone, there were … crickets. People were skeptical that this LIKE button was a good idea.

JUSTIN: There was a sentiment of, we want to keep the user interface as clean as possible. And it felt kind of like a redundant design element. People said well if someone wants to express that kind of positivity and affirmation they can already leave a comment. Why do we need this whole extra user interface element?

RIKKI: But, Justin knew there was something to this LIKE button. It wasn’t just a lazy person’s way of leaving a comment. It would totally change how people interacted with Facebook.

JUSTIN: We felt pretty strongly that by making it one click easy, by making it this path of least resistance, that it would encourage a certain kind of behavior of leaving more of that positivity and in general just feeling more interactive and like a more interesting rich social space.

KHOI: Even though people could give positive feedback before, now it’s condensed into a single button. When it becomes just one click, you’re much more likely to use that feature.

RIKKI: Yeah, so, Facebook decided to test it out. They released the Like button to a small percentage of users.

JUSTIN: I was really curious, would people use this thing at all. It certainly never dawned on me that it would become the large cultural symbol that it has become today.

RIKKI: Facebook’s gamble — as we all know — paid off really well. They released the Like button to all users.

Justin ended up leaving Facebook to start his own company. But as soon as the Like button was launched, he had a pretty good idea it had changed Facebook… and social media… forever.

JUSTIN: Certainly the data that I was told about from friends was that it was indeed increasing people’s desire to post on the site, desire to consume information on the site and generally leading to more interactivity.

RIKKI: So… the Like button did end up being a way to spread positivity. It made Facebook a public square, where people can give you validation and cheer you on — and that feels great. But it also tapped into a much darker part of human psychology. And the former leaders of Facebook are starting to concede that point.

In an interview earlier this year, Sean Parker — who was founding President of Facebook — said Facebook contains, quote, a “social-validation feedback loop,” and that it exploits “a vulnerability in human psychology.”

At the time he created the Like button, Justin says he never saw that coming.

JUSTIN: It would have been a lot of hubris to be like, “Will this someday be addictive that people will use it too much in their lives?” I don’t think it occurred to anyone even years after the Like button launched, that this could be a source of addiction. I think it would have taken a very sophisticated understanding of human psychology and sociology for it to occur to someone that affirmation can become so addictive that continuing to give people these little hits of dopamine will suck them in more and more to the service.

KHOI: I hear what Justin’s saying. When you’re designing something, you’re thinking, “How do I get people to use this thing?”

You’re not a psychologist. You’re not thinking about dopamine or how your brain chemistry gives you these shots of positive affirmation. Not a whole lot of designers were thinking that way, at least back in 2007.  

RIKKI: Right. DESIGNERS haven’t been thinking about psychology and brain chemistry for all that long … but ADVERTISERS have been.

That’s why the Like button is such a big deal! It gets people to spend more time on the site. So, it bolsters Facebook’s business model — selling your attention to advertisers.

JUSTIN: In the case of social media and really media in general for the most part the person who’s paying the company is an advertiser, and the user is effectively not the customer. They are the product. Their attention is being sold to advertisers. The media producers whether that’s traditional media or social media have a deep incentive financially to maximize the amount of attention that they’re taking away from a consumer and being able to deliver that to advertisers.

RIKKI: As long as the business model is structured this way, companies will continue to keep us on our screens for as long as possible. Facebook and Google even hire researchers with backgrounds in psychology and sociology, just to figure out new ways to hold our attention.

KHOI: Right, this is like when you are scrolling down your newsfeed and a video just starts playing automatically and you didn’t even need to click on it. Or when you get an email that someone else posted a photo of you on Facebook—and you can’t help but log on just to see it.

There’s countless little details like this that just keep you hooked.

RIKKI: After the break, we’re going to look at exactly how UX designers create habit-forming products. And why those habits are so hard to shake…

Stick with us.


KHOI: Welcome back to Wireframe. I’m here in the studio with producer Rikki Novetsky.

RIKKI: Hey Khoi. Okay, I want you to meet a new character — and this guy has a totally different perspective on addictive UX.

NIR: These products, you know, newsflash, are built to be engaging. That’s these companies’ job; they are designed to get you to spend as much time as possible with them.

RIKKI: That is writer and tech consultant Nir Eyal. Nir wrote a book called Hooked: How to Build Habit-Forming Products. And that book is all about the psychology behind how products can dictate our behavior.

So Khoi, in order to demonstrate this 4-step model that Nir came up with, I wanna do something interactive.

Okay, I’m picking up my phone…

KHOI: Okay

RIKKI: Going to your Instagram account. Aaand I see a photo I want to like…! Okay, I’m liking it.

KHOI: Hey! I just got a notification. “Rikki” liked my post!

RIKKI: And THIS is the first step of Nir’s model: the TRIGGER.

NIR: The pings, the dings, the rings, the notifications: the things that give us some piece of information that tells us what to do next.

RIKKI: Okay so now that Instagram has gotten you on their app with that notification, tell me what you’re doing now.

KHOI: I’m scrolling through these photos…. Here’s a photo of an expensive looking hamburger… here are several millennials that look like they worked really hard to look like they just woke up.  

RIKKI: That’s step TWO of the model! Action! Scrolling down a feed, clicking on a video. And Khoi I want to point out — you seemed more intrigued by some photos than others.

KHOI: Yes, some of them were alright – but some are pretty terrible too.

RIKKI: You just brought us right into step three of Nir’s 4-step model, which is the reward. And the reward is of a particular kind, called a variable reward. And that means: when we open Instagram, we don’t know if what we’re gonna see will be interesting or terrible.

NIR: It’s some kind of uncertainty, some bit of mystery, that prompts the user to keep interacting with the product.

KHOI: That variable reward is a pretty interesting trick. It’s the same tactic casinos use with slot machines.

RIKKI: Yeah! And this research comes from an NYU professor named Natasha Schull (Sh-uh-l). She wrote a book called Addiction by Design. And she argues that slot machines are addictive because they get you into this inescapable loop.

Pull a lever, and then you win some money. Pull again, and then you may lose. And it’s this cycle that just goes on and on, because you just never know what’s gonna happen next.

And with Instagram it’s the same thing — you keep on scrolling. And then sometimes it’s this fluffy dog, who you adore. And then sometimes it’s your ex with someone new, which you hate. And then we’re just trapped in this loop and we keep on looking at more and more photos.

KHOI: Okay so resistance is futile. Great. So what’s the fourth step? This should be the one that hooks me, right?

RIKKI: Yes, this is what Nir calls the investment phase, where YOU contribute to the site – you upload a photo or video or even when you like anything at all on the site. Now you’re part of it. You’ll want to see what kind of reactions you get … which means you’ll have to come back.

KHOI: You know this model really shows that every step of Instagram has been thought out by product designers. They really are leveraging psychology to get me to log on – stay on – and keep coming back.

RIKKI: Yeah, and the more I learn about designers so carefully plotting out our behavior, the more addictive UX can start to look a bit sinister.

Or at the very least unhealthy for us. Research out of UC San Diego and Yale shows that Facebook use is negatively correlated with well-being. And that includes mental and physical health, and overall life satisfaction.

But despite all that, Nir does not believe designers should have to change their practices.

NIR: All products are tools. You can use a hammer to build a house or to bash someone’s head in. I think the knee-jerk reaction is to say, well, if the product does harm it’s the product maker’s fault. Well, is it always? Not always. You know, if you use a hammer incorrectly, well, it’s kind of shame on you for not learning how to use that hammer a little bit better.

KHOI: I do see what he’s saying, but that metaphor can also be flipped on its head — a hammer is a tool, but it’s not a tool that contains design tricks to compel you to either smash a head or build a house or get addicted to hammering things.

RIKKI: True. But Nir is giving all the agency and responsibility to the user of the tool. So if you think you’re on Facebook too much, well, then… you should do something about it!

NIR: There is nothing that Zuckerberg can do if you change your notification settings. We cannot talk about these things being addictive when we haven’t taken 10 minutes to turn off the goddamn notification settings! (LAUGHS) If we hold our breath waiting for these companies to change their ways we’re going to suffocate. It just takes too long.

RIKKI: Okay so Khoi, what do you think about Nir’s point here? He’s essentially saying: We’re adults! We should take charge of our tech intake.

KHOI: Well, that’s a pretty high bar. First of all, not everyone on Facebook is an adult — and we can’t count on children and teenagers to withstand addictive technology.

And even if you ARE an adult, it’s just so hard to resist this stuff, especially when your brain chemistry is basically working against you. Plus—that’s just putting the work on the user. I really think this is the designer’s job.

RIKKI: Yeah I hear you. Justin Rosenstein also agrees with you.

JUSTIN: More and more I think we have a responsibility as product designers to develop that deep understanding, that includes remembering that what people do isn’t always exactly the same as what they want to be doing. And that, in the best cases technology helps amplify human will and helps align our attention with our intention.

KHOI: I think that’s so important, what he said. That difference between what a user will do and what they actually want to do, or what they should be doing. This is really where designers should be focusing their attention.

You really have to think about the potential impact of the work that you’re doing as a designer. Even if you can’t always predict it, you need to spend some time really trying to appreciate what it is that you’re building.

RIKKI: When we know that technology can be bad for us… the next question is, what do we do about that? Justin cautions us to stay away from extremes.

JUSTIN: When all technologies seem to have these unintended consequences, I think one attitude you could have toward that is Luddite-ism. Of, well, we don’t know what we’re doing, we’re not competent to develop new technologies as a species, we should just keep things simple and maybe even go back to nature. I think that might be an overreaction. But also unbridled technological optimism — where we just assume everything we build is for the best — is also naive.

KHOI: It’s pretty powerful to hear someone who actually makes this stuff become skeptical of it.

RIKKI: Agreed… and Justin isn’t the only one in Silicon Valley who’s calling this out.

He’s an advisor to the Center for Humane Technology, which is a group that talks about the harm caused by some tech. And they also promote designing new tech that’s better for our mental health.

JUSTIN: As the producers of technology we have a responsibility to think really carefully about how are we creating these tools and what sort of behaviors are we encouraging? But we have the opportunity to develop as a discipline the capacity to redesign these things in a way that brings out the best in human nature rather than brings out distraction.

RIKKI: And Justin’s putting his ideas into action. Since leaving Facebook, he created a new company called Asana that helps teams collaborate and manage their work.

KHOI: Right and Asana is a good example. But… If we want designers to change the way they think about the purpose of technology — we need something more! We need commonly accepted principles to understand what makes a design ethical or unethical.

RIKKI: So I spoke with someone working on just that… Pamela Pavliscak (pav-li-shock).

She’s a design researcher and professor at the Pratt Institute in New York, and studies the emotional layer of design. I caught up with her in her office and we got right into the big question in her field: defining. ethical. design.

PAMELA: My definition of ethical design is design that is based on consent. Just like a human to human relationship would be. You would not have a relationship with another person where they got to control all of your time, and what you saw, and when you saw it, and how much time you spent, without you having any say in it.

RIKKI: This definition of ethical design? It leads us pretty neatly into the social networks that we can’t seem to look away from.

PAMELA: With addictive UX, we don’t have consent. We’re not aware of what our role is in it, and what’s being recorded about us, how we’re being changed by it. We don’t have a lot of say.

KHOI: So when you lose control in the middle of a user experience, that sorta means a designer has taken advantage of your weakness, and they’re exerting a kind of power over you.

RIKKI: And THAT’s what Pamela calls unETHICAL design…

KHOI: Right, and this guideline that Pamela is talking about is bold. She’s being very clear cut about it. Which I think is what we need. Designers tend to think of our work as neutral, as not being inherently good or bad. But what Pamela is saying is that we can actually impact people positively or negatively. And that means we bear some real responsibility here.

RIKKI: Which also means we do need to draw some lines between designers helping and hurting. And for Pamela, the line is pretty clear.

PAMELA: A lot of us know it when we see it or know it when we experience it ourselves that we feel that our time isn’t being respected, that we’re not feeling happy after we use the site. Instead we’re feeling kind of manipulated or distracted or angry. I think we all know when we’re at 2 a.m. on Twitter obsessively like scrolling through tweets and we can’t sleep. We know that’s probably not a good situation.

RIKKI: Okay but here’s the problem: Twitter gets a big chunk of its revenue from ads. So you being on Twitter at 2am is actually pretty good for them — they WANT to keep their users engaged for as long as possible.

So here’s Pamela’s idea — change the business model.

PAMELA: We could measure other things that aren’t engagement in the short term but in fact are long-term relationships. And a lot of companies are starting to look at that. And say, well, OK, let’s look at the lifetime of the relationship and start to try to find things to measure there. Let’s look at other emotional or well-being factors and try to measure that. How well does our experience fit in with these higher motivations that people have for belonging or status?

KHOI: That’s totally right. If products make us miserable, then we just aren’t going to use them in the long run.

So instead of just exploiting users’ distractions, you can help them focus on achieving their LONG-TERM GOALS. That’s how they become lifetime customers who might actually adore what you make.

RIKKI: Yeah, and we’re starting to see companies think that way. Earlier this year, Facebook invited mental health experts to advise the company on new tools that help people manage their Facebook usage.

And there’s another movement afoot. A sort of anti-technology response. Where people just decide the solution to addictive tech is… putting your phone down.

KHOI: Putting down your phone is a good idea. But putting down your phone as a way to cope with addictive user experiences? That’s not an effective long-term solution.

RIKKI: Yeah, Pamela brought that up, too.

PAMELA: Maybe you can’t put down your phone, in a way, because it’s already shaped how you think, and how you view the world and how you handle your relationships.

RIKKI: What Pamela is saying here is, technology is completely woven into our daily lives. And in order to figure out a path forward for ethical design, we need to embrace that simple fact.

Ethical UX design will emerge from the assumption that our phones are necessary. And can be used as a force for good.

KHOI: You know, in many ways, the growth of technology has outpaced the education we get as designers. I mean, it was complicated enough when we had to start designing products for different devices and platforms all at the same time. Now there’s a whole new level of complexity, the idea that these design solutions now have a very real impact on the world at large.

So UX design is no longer just about what flows well, or what is technically beautiful. Because it defines the way we talk to each other now and the way we experience what’s around us.

And that is pretty daunting. But it’s also kind of the make or break moment for what it means to practice design.

The question now is: Are we going to retreat back into our corners, and just focus on what’s directly in front of us, on our screens? Or are we going to rise to the occasion? Answer these challenges. And… actually try to be a force for good in the world.

KHOI: Next week on Wireframe, we’ll talk about inclusive design:

BRUCE: We know that we can never really be other people. So we have to go out and talk to those other people. We believe empathy is engaging with the people that we’re trying to work with.

This episode was produced by Rikki Novetsky, Amy Standen, Isabella Kulkarni, and Abbie Ruzicka.

Rachel Ward is our editor.

Catherine Anderson is our engineer. And Keegan Sanford created our show art.

Learn more about the show at

Subscribe to Wireframe on Apple Podcasts, Google Play, Spotify, or wherever you get your podcasts.

And leave us a review! We’d love to hear what you think.

I’m Khoi Vinh. Thanks for listening.