
SE Radio 528: Jonathan Shariat on Designing to Avoid Worst-Case Outcomes

Jonathan Shariat, coauthor of the book Tragic Design, discusses harmful software design. SE Radio host Jeremy Jung speaks with Shariat about how poor design can kill in the medical industry, accidentally causing harm with features meant to bring joy, what to consider during the product development cycle, industry-specific checklists and testing requirements, creating guiding principles for a team, why medical software often has poor UX, designing for crisis situations, and why employing deceptive design patterns (also known as “dark patterns”) can be bad for products in the long term.



Transcript

Transcript brought to you by IEEE Software magazine.
This transcript was automatically generated. To suggest improvements in the text, please contact [email protected] and include the episode number and URL.

Jeremy Jung 00:00:16 Today I'm talking to Jonathan Shariat. He's the co-author of Tragic Design, the host of The Design Review podcast, and currently a senior interaction designer and accessibility program lead at Google. Jonathan, welcome to Software Engineering Radio.

Jonathan Shariat 00:00:32 Hi Jeremy, thank you so much for having me on.

Jeremy Jung 00:00:34 The title of your book is Tragic Design. And I think that people can take a lot of different meanings from that. So I wonder if you could start by explaining what tragic design means to you.

Jonathan Shariat 00:00:48 For me, it really started with this story that we have in the beginning of the book; it's also online. I originally wrote it as a Medium article, and that's really what opened my eyes to the fact that, hey, design is this kind of invisible world all around us that we actually depend on very critically in some cases. And so this story was about a girl, a nameless girl, but we named her Jenny for the story. In short, she came to the hospital for treatment of cancer and was given her medication, and the nurses who were taking care of her were so distracted with the software they were using to chart, make orders, things like that, that they missed the fact that she needed hydration and wasn't getting it. And because of that, she passed away. And I still remember that feeling of just kind of outrage.

Jonathan Shariat 00:01:43 And we hear a lot of news stories. A lot of them are outraging. They touch us, but some of those feelings stay and stick with you, and for me, that one stuck. I just couldn't let it go, because, and I think a lot of your listeners will relate to this, we get into technology because we really care about its potential. What can it do? What are all the awesome things it could do if we come at a problem and think of all the ways it could be solved with technology? And here it was doing the exact opposite. It was causing problems; it was causing harm. And the design of it, or the way it was built, was failing Jenny. It was failing the nurses too, right? A lot of times we blame that end user.

Jonathan Shariat 00:02:27 So to me, that story was tragic: something that deeply saddened me, was regrettable, and cut short someone's life. That's the definition of tragic. And there are a lot of other examples with varying degrees of tragedy. But as we look at the impact technology has, and the impact we have in creating technologies with such large impacts, we have a responsibility to really look into that, make sure we're doing as good a job as we can, and avoid those outcomes as much as possible. Because the biggest thing I learned in researching all these stories was: hey, these aren't bad people. These aren't people who are clueless and making terrible mistakes. They're me, they're you, people just like you and I, who could make the same mistakes.

Jeremy Jung 00:03:16 I think it's pretty clear to our audience that where there was a loss of life, where someone died, that's clearly tragic, right? So for a lot of things in the healthcare field, if there's a real negative outcome, whether it's death or severe harm, we can clearly see that as tragic. And I know in your book you talk about a lot of other types of, I guess, negative things that software can cause. So I wonder if you could explain a little bit: past death and severe injury, what's tragic to you?

Jonathan Shariat 00:03:58 Yeah. Still in that line of injury and death, the side that most of our day-to-day work will actually touch is physical harm. Like creating the software in a car; I think that's a fairly common one. But also ergonomics, right? Bringing it back to something less impactful, but multiplied over the reach of a product, it can be quite big. If we're designing software in a way that's very repetitive, or everyone's got that scroll-thumb issue, right? Our phones aren't designed well. So there are a lot of ways software can still physically impact you ergonomically, and that can cause you a lot of problems, arthritis and pain. There are other ways that are still really impactful. The other one is by saddening or angering us. That emotional harm is very real, and it oftentimes gets overlooked a little bit, because physical harm is so real to us, but sometimes emotional harm isn't.

Jonathan Shariat 00:04:55 But we talk in the book about the example of Facebook putting together this great feature that takes your most-liked photo and celebrates your whole year by saying: hey, look, here's a year-end review; that's the top photo from the year. They add some great, well-done illustrations behind it, of balloons and confetti and people dancing. But some people had a bad year. Some people's most-liked, most-engaged-with photo got that engagement because something bad happened, and they totally missed this. And because of that, people had a really bad time with this: people who lost their child that year, who lost a loved one that year, whose house burned down. Something really bad happened to them, and here was Facebook putting that photo of their dead child up with balloons and confetti and people dancing around it. And that was really hard for people.

Jonathan Shariat 00:05:54 They didn't want to be reminded of that, and especially not in that way. These emotional harms also come into play around anger. For one, there's a lot of software out there that tries to bring up news stories that anger us, because that equals engagement, but there's also software that uses dark patterns to trick us into purchasing, getting us to forget about that free trial so they can charge us for a yearly subscription and refuse to refund us. If you've ever tried to cancel a subscription, you start to see a company's real colors. So emotional harm and anger is a big one. We also talk about injustice in the book, where there are products that are supposed to be providing justice in very real ways, like voting, or getting people the help they need from the government, or letting people see their loved ones in jail, or you get a ticket unfairly because you tried to read the sign and couldn't understand it. We look at a lot of different ways that design and the software we create can have a very real, negative impact on people's lives if we're not careful.

Jeremy Jung 00:07:04 The impression I get, when you talk about tragic design, is that it's really about anything that could harm a person, whether physically or emotionally, making them angry, making them sad. And I think the most-liked photo example is a great one, because like you said, people may be building something that harms, and they may have no idea that they're doing it.

Jonathan Shariat 00:07:29 Exactly. And I love that story, not to just jump on the bandwagon of saying bad things about Facebook or something. No, I love that story because I can see myself designing the exact same thing: being a part of that product, building it, looking at the specifications the PM put together and the decks we had. I could totally see that happening and just never having the thought, because we're so focused on delighting our users, and we have these metrics and these things in mind. So that's why, in the book, we really talk about a few different processes that need to be part of the product development cycle: to stop, pause, and think about, well, what are the negative aspects here? What are the things that could go wrong?

Jonathan Shariat 00:08:11 What are the negative life experiences that could be a part of this? And you don't need to be a genius to think of every single thing out there. In this example, if they had taken probably one hour out of their entire project, maybe even 10 minutes, they might have come up with: oh, there could be a bad outcome here. So if you have that moment to pause, that moment to say, okay, we have time to brainstorm together about how this could go wrong, or how the negative side of life could be impacted by this feature, that's all it takes. It doesn't necessarily mean you need to do some giant study of the potential impact of this product in all its ways. Really, just having a part of your process that takes a moment to think about that will create a better product and better product outcomes. If you think about all of life's experiences, and Facebook can say, hey, condolences, and show that thoughtfulness, that would have higher engagement, that would have higher satisfaction, right? So they could have created a better outcome by considering these things, and obviously avoided the negative impact to users and the negative impact to their product.

Jeremy Jung 00:09:29 Continuing on with that thought: you're a senior interaction designer and an accessibility program lead. So I wonder, on the projects you work on, and maybe you can give a specific example, how are you ensuring that you're not running up against these problems, where you build something that you think is going to be really great for your users, but in reality it ends up being harmful in specific cases?

Jonathan Shariat 00:09:56 Yeah. One of the best ways is, well, it should be part of multiple parts of your cycle. If you want a specific outcome out of your product development life cycle, it needs to be there from the very beginning and then a few more times. I think programmers will all latch onto this, because they have the worst end of the stick, as does QA: any bad decision or assumption that happened early on with the business team or the PM gets multiplied when they talk to the designer, and then gets multiplied again at hand-off. And it's always the engineer who has to put the final foot down and say: this doesn't make sense, or I think users are going to react this way, or this is the implication of that assumption.

Jonathan Shariat 00:10:44 It's the same thing on our team. We have it at the very early stage, when someone's putting together the idea for the feature or the project we want to work on; it's right there. There's a section about accessibility, and a few other sections about looking out for this kind of negative impact. So right away we can have a discussion about it while we're talking about what we should do and the different implications of implementing it. That's the perfect place for it. Maybe when you're brainstorming about what you should do, it's not the right moment, because you're trying to be creative, right? You're trying to think. But at the very next step, when you're saying, okay, what would it mean to build this? That's exactly where it should start showing up in the team's discussion.

Jonathan Shariat 00:11:29 And it also depends on the risk involved, right? Which is tied to how much time and effort and resources you should put toward avoiding that risk. It's risk management. So if, like my colleagues or some of my friends, you work in the automotive industry, and you're creating software and you're worried it might be distracting, that might demand a lot more time and effort. Same with the healthcare industry; those might need a lot more resources. But if you're building SaaS software for engineers to spin up their resources, the right amount might be different, and it's never zero, because you're still dealing with people and you'll impact them. Maybe that service goes down, and it was a healthcare service that went down. So you really have to think about what the risk is, and then you can map that back to how much time and effort you need to spend on getting it right.

Jonathan Shariat 00:12:22 And accessibility is one of those things where a lot of people think it takes a lot of effort, a lot of resources, to be accessible, and it really doesn't. It's just like tech debt. If you've ignored your tech debt for five years and then someone says, hey, let's fix all the tech debt, nobody's going to be on board for that. Versus addressing it as you go: finding the right level of tech debt you're okay with, deciding when and how you address it, and just better practices. It's the same thing with accessibility. If you're building it in correctly as you go, it's very low effort, and it just creates a better product and better decisions. And it's totally worth it for the increased number of people who can use the product and the improved quality for all users. So yeah, it's kind of a win-win situation.

Jeremy Jung 00:13:11 One of the things you mentioned was that this should all start at the very beginning, or at least right after you've decided what kind of product you're going to build, and that's going to make it much easier than coming in later and trying to make fixes. So I wonder, when you're all getting together and trying to come up with these scenarios, figuring out negative impacts and what kind of accessibility needs you have, who is involved in that conversation? Like, if you have a team of 50 people, who needs to be in the room from the very beginning to start working this out?

Jonathan Shariat 00:13:50 I think it would be the same people who are there for the project planning. On my team, we have our engineering counterparts there (at least the team lead, if there are a lot of them), but whoever would go to the project kickoff should be there. We have everybody: PM, design, engineers, our project manager. Anyone who wants to contribute should really be there, because the more minds you have on this, the more of the potential problems you'll tease out; you have a more diverse set of brains and eclectic life experiences to draw from. And so you'll get closer to that 80% mark, where you can quickly take a lot of those big items off the table right off the bat.

Jeremy Jung 00:14:33 Is there any kind of formal process you follow, or is it more just, people are thinking of ideas, putting them out there, and just having a conversation?

Jonathan Shariat 00:14:43 Yeah. Again, it depends which industry you're in and what the risk is. I previously worked in the healthcare industry, and for us to make sure we got it right, how it was going to impact the patients, especially since it was cancer care and they were using our product to get early warnings of adverse effects, our system for figuring out whether something was going to be an issue was more formalized. In some cases in healthcare, especially if it's a device, or in certain software circumstances where the FDA determines it to be a certain category, you literally have a governmental version of this. And the only reason that's there is because it can prevent a lot of harm, right? So that one is enforced, but there are reasons outside of the FDA to have that same formalized part of your process. The size of it should scale depending on what the risk is.

Jonathan Shariat 00:15:40 So on my team, the risk is actually somewhat low, and it's really just part of the planning process. We have moments when we're brainstorming what we should do and how the feature will actually work, where we talk about what those risks are and call out the accessibility issues, and then we address those. And then as we get ready to ship, we have another formalized part of the process, where we check that accessibility has been taken care of and that everything makes sense as far as the impact to users. So we have those places. But in healthcare it was much stronger: we had to make sure we had tested it, that it's robust, that it's going to work when we think it's going to work. We do user testing, and it has to pass that user testing, things like that, before we're able to ship to the end user.

Jeremy Jung 00:16:27 So in healthcare, you said the FDA actually provides, is it a checklist of things to follow, where you must have done this kind of user testing and you must have verified these things, that's actually given to you by the government?

Jonathan Shariat 00:16:40 That's right. Yeah. It's like a checklist and a testing requirement, and there are also levels there. I've only done the lowest level; I know there are two more levels above that. And again, that's because the risk is higher and higher, and there are stricter requirements, where maybe somebody at the FDA needs to review it at some point. And again, mapping it back to the risk your company has is really important: understanding that is going to help you avoid the bad impact and build a better product. And that's one of the things I'd like to focus on as well, and highlight for your listeners: it's not just about avoiding tragic design. One thing I've discovered since writing the book and sharing it with a lot of people is that doing the exact opposite usually, in a vast majority of cases, ends up being a strategically great thing to pursue for the product and the company.

Jonathan Shariat 00:17:35 Think about that example with Facebook. Okay, you've run into a problem that you want to avoid. But if you actually do a 180 there, and you find ways to engage with people when they're grieving, you find ways to develop features that help people who are grieving, you've created value for your users that you can help build the company off of, right? Because they were already building a bunch of joy features. The same goes for user privacy. We see Apple doing that really well, where they say: okay, we're going to do our ML on device, we're going to let users decide on every permission, and things like that. And that is a strategy. We also see it with something like T-Mobile: when they initially started out, they were one of the nobody telecoms in the world.

Jonathan Shariat 00:18:23 And they said: okay, what are all the unethical, bad things our competitors are doing? They're charging extra fees. They have these weird data caps that are really confusing and don't make any sense. There are contracts you get locked into for many years. And T-Mobile just did the exact opposite of all that, and it became their business strategy, and it worked for them. Now they're the top company. So I think there are a lot of things like that, where you just look at the exact opposite: one, you get to avoid the bad, tragic design, but then, boom, you also see an opportunity that can become a business strategy.

Jeremy Jung 00:18:56 So when you refer to the exact opposite, I guess you're looking at the potentially negative outcomes that could happen. There was the Facebook example of seeing a photo and being reminded of a really sad event, and figuring out: can I build a product around still having that same picture, but recontextualizing it, showing you that picture in a way that's not going to make you sad or upset, but is actually a positive?

Jonathan Shariat 00:19:27 Yeah. I mean, I don't know exactly what the solution was, but one example that comes to mind is that some companies now, before Mother's Day, will send you an email and say: hey, this is coming up. Do you want us to send you emails about Mother's Day? Because for some people that can be very painful. That's very thoughtful, right? That's a great way to show that you care. Think about that Facebook example: if there were a formalized way to engage with grieving, I would use Facebook for that. I don't use Facebook very often, almost not at all, but if somebody passed away, I would engage with my Facebook account. If there's a whole formalized feature around it, where Facebook understands grieving, understands this event, and smooths that process and creates comfort for the community, that's value, and engagement that's worthwhile, versus artificial engagement.

Jonathan Shariat 00:20:20 That kind is engagement for the sake of engagement. Doing it the right way would create a better feeling toward Facebook; I would maybe then spend more time on Facebook. So it's in our mutual interest to do it right. And so it's great to focus on these things to avoid harm, but also to start to see new opportunities for innovation. We see this a lot already in accessibility, where so many innovations have come from just fixing accessibility issues, like closed captions. We all use them on our TVs, in busy, crowded spaces, on videos that have no translation for us, in all sorts of places. SEO is the same thing: you get a lot of SEO benefit from describing your images and making everything semantic, and that also helps screen readers. Different innovations have come about because somebody wanted to solve an accessibility need.
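
To make that point concrete, here is a minimal sketch (the component names and content are invented for illustration, not anything from the episode or the book) of how the same semantic markup serves screen readers and search engines at once:

```tsx
import React from "react";

const submit = () => console.log("submitted");

// Opaque: a clickable div and an undescribed image. A screen reader
// announces nothing useful here, and a crawler has no text to index.
export const Opaque = () => (
  <div onClick={submit}>
    <img src="/chart.png" />
  </div>
);

// Semantic: a real <button> plus descriptive alt text. The markup that
// helps a screen reader user is the same markup that carries SEO value.
export const Semantic = () => (
  <button type="submit" onClick={submit}>
    <img src="/chart.png" alt="Quarterly revenue chart, up 12% from Q1" />
    <span>Submit report</span>
  </button>
);
```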

Jonathan Shariat 00:21:13 And the one I love, I think the most common one, is readability: contrast and text size. Sure, there are some people who won't be able to read it at all, but it also hurts my eyes to read bad contrast and bad text size, so fixing it benefits everyone and creates a better design. And one of the things that comes up so often (as the accessibility program lead, I see a lot of our bugs) is how many issues are caught by our audits and our test cases around accessibility that are simply bad design and a bad experience for everyone. We're able to fix those, and it's just another driver of innovation. There are a ton of accessibility examples, and I think a ton of these other ethical, harm-avoiding examples too, where it's an opportunity area: oh, let's avoid that. But then if you turn around, you can see there's a big opportunity to create a business strategy out of it.
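
For readers who want to see what such contrast audits actually measure, here is a small sketch of the WCAG 2.x contrast-ratio math. The formulas follow the WCAG definition of relative luminance; the sample colors are arbitrary:

```typescript
// Linearize an 8-bit sRGB channel per the WCAG relative-luminance definition.
function linearize(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of an [r, g, b] color.
function luminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio ranges from 1:1 (identical colors) to 21:1 (black on white).
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number],
): number {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// WCAG AA asks for at least 4.5:1 for normal-size body text.
console.log(contrastRatio([119, 119, 119], [255, 255, 255]).toFixed(2)); // 4.48, fails AA
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(2));       // 21.00, passes
```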

Jeremy Jung 00:22:07 Can you think of any specific examples where you’ve seen that, where somebody doesn’t treat it as something to avoid, but actually sees that as an opportunity?

Jonathan Shariat 00:22:17 I think the Apple example is a really good one. From the beginning they saw: okay, in the market there's a lot of abuse of information, and people don't like that. So they created a business strategy around it, and it's become a big differentiator for them. They have ML on the device; they have a lot of these permission settings. Facebook, by contrast, was very much focused on using customer data, a lot of it, without really asking permission. And so once Apple said, okay, now all apps need to show what they're tracking and ask for permission to do it, a lot of people said no, and that caused about $10 billion of loss for Facebook. And Apple advertises on it now: we're ethical, we source things ethically, and we care about user privacy. That's a strong position, right? I think there are a lot of other examples out there, accessibility like I mentioned, and others, but they're kind of overflowing, so it's hard to pick one.

Jeremy Jung 00:23:13 Yeah. And I think what's interesting about that too, with the examples of focusing on user privacy, or trying to be more sensitive around death, and things like that, is that other people in the industry will notice, and then they may start to incorporate those things into their own products as well.

Jonathan Shariat 00:23:33 Yeah, yeah, exactly. With the T-Mobile example, once that worked really, really well and they just ate up the entire market, all the other companies followed suit, right? Now data caps are very rare, surprise fees are a lot rarer, and there are no more long contracts that lock you in, et cetera. A lot of those things have become industry standard, and that improves the environment for everyone, because it becomes a competitive advantage that everybody needs to meet. So yeah, I think that's really, really important. Now, when you're going through your product's life cycle, you might not have the ability to make these big strategic decisions, like, we don't want to have data caps or whatever. But if you're at that Facebook level and you run into that issue, you could say: hmm, what could we do to address this?

Jonathan Shariat 00:24:21 What could we do to help with this, and make it a robust feature? We talk about a lot of these dating apps. One of the problems was abuse: women were being harassed, or after a date didn't go well, things were happening. And a lot of dating apps have differentiated themselves and attracted a lot of that market because they deal with that really well; it's built into the strategy. So it's oftentimes a really good place to start, too, because it's not something we generally think about very well, which means your competitors haven't thought about it very well, which means it's a great place to build product ideas off of.

Jeremy Jung 00:24:59 Yeah, that's a good point, because so many applications now are social media applications, messaging applications, video chat, that sort of thing. When those applications were first built, nobody thought much about what happens if someone sends hateful messages, or sends pictures people really don't want to see, or does abusive things. They just assumed people would be good to each other and it would be fine. But in the last 10 years, pretty much all of the major social media companies have had to figure out: okay, what do I do if someone is being abusive, and what's the process for that? Basically, they all have to do something now.

Jonathan Shariat 00:25:45 Yeah. And that's a hard thing: if that unethical or bad design decision is deep within your business strategy and your company strategy, it's hard to undo. Some companies have to do it very suddenly and deal with the fallout, right? I know Uber had a big part of that, and some other companies, where almost overnight everything comes to a head and they need to deal with it. With Twitter now being acquired by Elon Musk, some of those things are coming to light. But what I find really interesting is that these areas are really ripe for innovation. So if you're interested in a startup idea, or you're working at a startup or about to start one, and there are probably a lot of people out there thinking about side projects right now, a great way to differentiate and win a market against well-established competitors is to ask: okay, what are they doing right now that is unethical and core to their business strategy? Doing that differently is really what will help you win that market. And we see it happening all the time. Those established leaders in the market especially can't pivot like you can. So being able to say, we're going to do this ethically, we're going to build with these tragic designs in mind and do the opposite, that's going to help you gain traction in the market.

Jeremy Jung 00:27:09 Earlier we were talking about how, in the medical field, there is specific regulation, or at least requirements, to try to avoid this kind of tragic design. I noticed you also worked for Intuit before. So for financial services, I was wondering whether there was anything similar, where the government steps in and says: you need to make sure these things happen, to avoid the harms that can come up.

Jonathan Shariat 00:27:35 Yeah, I don't know. I didn't work on TurboTax; I worked on QuickBooks, which is accounting software for small businesses. And I was surprised: we didn't have a lot of those robust processes. We just relied on user feedback to tell us when things were not going well. And I think we should have had them. That was a missed opportunity to show your users that you understand them and care, and to find those opportunity areas. So we didn't have enough of that, and there were things we shipped that didn't work correctly right out of the box. It happens, but it had a negative impact on users. So it's like: okay, what do we do about that? How do we fix it? The more you formalize that and make it part of your process, the more you get out of it.

Jonathan Shariat 00:28:21 And actually, this is a good point to pause on, one that I think will resonate with a lot of engineers listening to this. If you remember, in the book we talk about the Ford Pinto story. The reason I want to talk about that story, and why I added it to the book, is that I think it captures the thing engineers deal with the most, and designers too, which is: okay, we see the problem, but we don't think it's worth fixing. That's what we're going to dig into here. So hold on for a second while I explain some history about this car. The Ford Pinto, if you're not familiar, is notorious because it was designed, built, and shipped with a known problem: if it was rear-ended, even at a pretty low speed, it would burst into flames, because the gas tank would rupture.

Jonathan Shariat 00:29:10 And then, oftentimes, the doors would get jammed, so it became a death trap of fire that caused many deaths and a lot of injuries. In an interview, the CEO at the time said it almost destroyed Ford; very seriously, it would have brought the whole company down. During the design of it (design in the engineering sense), they found this problem, and the engineers came up with their best solution: a rubber block. And the cost was, I forget how many dollars, let's say it was like $9, or say $6. But this was back then, and the margin on these cars was very, very thin. Having the lowest price in the market was very important for winning it, because the customers were very price sensitive. So the legal team looked at some recent cases that had put a dollar value on a life and started to work out: here's how many people would sue us, and here's how much it would cost to settle all of those.

Jonathan Shariat 00:30:11 And then: here's how much it would cost to add the fix to all the cars. And they found it was cheaper to just take the lawsuits. I think why this is so important is because of the two things that happened afterward. One, they were wrong: it affected a lot more people, and the lawsuits were for a lot more money. And two, after all this blew up and was about to destroy the company, they went back to the drawing board, and what did the engineers find? A cheaper solution. They were able to rework that rubber block, get it under the margin, and hit the mark they wanted. There's a lot of focus on the first part, because it's so unethical: putting a value on life, doing that calculation, and deciding we're willing to have people die. In some industries it's really hard to get away with that, but it's also very easy to slip into it.
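
To make the shape of that calculation concrete, here is a hedged sketch of the expected-cost arithmetic being described. Every number below is a hypothetical placeholder, not a figure from the actual Ford memo:

```typescript
// All inputs are hypothetical, chosen only to show the structure of the
// cost-benefit argument described above, not the historical figures.
const vehicles = 12_500_000;          // units shipped (hypothetical)
const fixCostPerVehicle = 9;          // the rubber block, in dollars (hypothetical)
const expectedLawsuits = 200;         // predicted incidents (hypothetical)
const settlementPerLawsuit = 200_000; // predicted payout per suit (hypothetical)

const costOfFix = vehicles * fixCostPerVehicle;                 // $112,500,000
const costOfLawsuits = expectedLawsuits * settlementPerLawsuit; // $40,000,000

// The spreadsheet says "skip the fix". As the story shows, it was wrong on
// both inputs (far more incidents, far larger settlements), and it priced
// the reputational damage at zero.
console.log(costOfFix > costOfLawsuits); // true
```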

Jonathan Shariat 00:31:03 And it's very easy to get lulled into this sense of: oh, we'll just crunch the numbers and see how many users it affects, and we're okay with that. Versus when you have principles, and you have a hard line, and you care more than the numbers say you should, and you really push yourselves to create something more ethical and safer, avoiding tragic design: there's a solution out there. You actually get to innovation; you actually get to solving the problem. Versus when you just rely on the cost-benefit analysis: it's going to take an engineer a month to fix this, and blah, blah, blah. If you have those values, if you have those principles, and you say, no, we're not okay shipping this, then you'll find: okay, there's a cheaper way to fix this; there's another way we could address this. And that happens so often. I know a lot of engineers deal with that, hearing: this is not worth our time to fix, this is not worth our time to fix. That's why you need those principles: because oftentimes you don't see the solution, but it's right there, just outside the edge of your vision.

Jeremy Jung 00:32:13 Yeah. I mean, with the Pinto example, I'm just picturing, obviously there wasn't JIRA back then, but you can imagine somebody filing an issue saying: hey, when somebody hits the back of the car, it's going to catch on fire. And someone going: well, how do I prioritize that? Right? Is this a medium ticket? Is this a high ticket? And it just seems insane, right? That you could make the decision: oh no, this isn't that big an issue; we can move it down to low priority and ship it.

Jonathan Shariat 00:32:39 Yeah, exactly. And that's really what principles do for you, right? They help you make the tough decisions. You don't need a principle for an easy one. And that's why, in the book, I really encourage people to come together as a team and come up with their guiding principles. That way it's not a discussion point every single time; it's: hey, we've agreed that this is something we care about, this is something we're going to stop and fix. One of the things I really like about my team at Google is that product excellence is very important to us. There are certain things we're okay with letting slip and fixing in the next iteration, and we make sure we actually do that. So it's not that we always address everything, but because it's one of our principles, we care more.

Jonathan Shariat 00:33:23 We take on more of those tickets, more of those things, and make sure they're fixed before we ship. And it shows the end user that this company cares and has quality. You need a principle to guide you through those difficult things that aren't obvious on a decision-to-decision basis but strategically get you somewhere important. It's like design debt or technical debt: should we optimize this one chunk of code? Nah. But group that together with a hundred of those decisions and, yeah, it's going to slow down every single project from here on out. So that's why you need those principles.

Jeremy Jung 00:34:02 So in the book, there are a few examples of software in healthcare. And when you think about principles, you would think that generally everybody on the team would be on board: whatever patient is involved, we want to give them good care, we want them to be healthy, we don't want them to be harmed. Given that, and given that you interviewed multiple people in the book and have a few different case studies, why do you think that medical software in particular seems to have such poor UX, or so many issues?

Jonathan Shariat 00:34:44 Yeah, that's a complicated topic. I would summarize it with maybe three different reasons. One, which I think is maybe a driving factor behind some of the others, is that in the way the medical industry works, the person who purchases the software is not the end user. It's not like you have doctors and nurses voting on which software to use. So oftentimes it's more of a sales deal that then just gets pushed out, and the hospital has to commit: the software is very expensive, and in the early days especially it needed to be installed and maintained, and there needed to be training. So there was a lot of money to be made in that software, and the investment from the hospital was large. They can't just say: actually, we don't like this one; we're going to switch to the next one.

Jonathan Shariat 00:35:35 So once it's sold, it's really easy to just keep that customer, and there's very little incentive to really improve the product unless you're selling a new feature. So there are a lot of feature add-ons, because vendors can charge more for those, but much less improving of the experience. I think there's also generally a lot less understanding of design in that field. And because there are traditions of doing things certain ways, they end up putting a lot of the pressure and the responsibility on the individuals at the end. You've heard recently of the nurse who made a medication error and is going to jail for it. Oftentimes we blame that end person, so the nurse gets all the blame, or the doctor gets all the blame. Well, what about the software that made it confusing? What about the medication that looks exactly like this other medication?

Jonathan Shariat 00:36:27 Or what about the pump tool where you have to type everything in very specifically? The nurses are very busy. They're doing a lot of work, they have 12-hour shifts, they're dealing with lots of different patients and a lot of changing conditions, and on top of that they have to worry about typing something in a specific way. And yet when those problems happen, what's the response? Nobody goes and redesigns the devices; it's more training, more training, more training, and people can only absorb so much training. So I think that's part of the problem: there's no desire to change, and the wrong person gets blamed. And lastly, I think it is starting to change. Because the government is pushing healthcare records to be more interoperable, meaning I can take my health records anywhere, and a lot of the power lies where the data is, I'm hoping that as the government and people and initiatives push these big companies like Epic to be more open, things will improve. One, because they'll have to keep up with their competitors, and two, because more competitors will be out there improving things. I think the know-how is out there. But as long as there's no incentive to change, no turnover in systems, and the blaming of the end user, we're not going to see a change anytime soon.

Jeremy Jung 00:37:51 That's a good point. It seems like even though you have all these people who may have good ideas and may want to do a startup, if you've got all these hospitals already locked into a very expensive system, where's the room to get in there and make that change?

Jonathan Shariat 00:38:09 Yeah.

Jeremy Jung 00:38:10 And another thing you talk about in the book is how, when you're in a crisis situation, the way a user interacts with something is very different. I wonder if you have any specific examples in software where that can happen?

Jonathan Shariat 00:38:29 Yeah. Designing for crisis is a very important part of every piece of software, because it might be hard for you to imagine being in that situation, but it definitely will still happen. One example that comes to mind: let's say you're working on cloud software, like AWS or Google Cloud. There are definitely use cases and user journeys in your product where somebody would be very panicked, right? If you've ever been on call when something goes south and it's a big deal, you don't think right. When we're in crisis, our brains go into a totally different mode, that fight-or-flight mode, and we don't think the way we normally do. It becomes really hard to read and comprehend, very hard, and we might not make the right decisions. So think about that. Let's say, going back to the cloud software: say you're working on that.

Jonathan Shariat 00:39:21 Are you relying on the user reading a bunch of text about this button? Or is it very clear, from the way you've crafted the exact button copy, how big the button is, and where it sits in relation to the other content, what exactly it does? Is it going to shut down the instance now, or do it after a delay, or whatever? All those little decisions are really impactful, and when you run them through the furnace of a user journey built around a really urgent situation, you'll start to see problems in your UI that you hadn't noticed before, or problems in the way you're implementing things, because you're seeing it from a different angle. That's one of the great things about the approaches in the book around thinking about how things go wrong, or designing for crisis: they make you think of new use cases, which makes you think of new ways to improve your product. The improvement you make so that something is obvious enough to do in a crisis helps everyone, even when they're not in a crisis. That's why it's important to focus on those things.
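
As one possible translation of that advice into code (a sketch only; the names and prompts are invented, not any particular cloud vendor's API), a destructive action can state exactly what it will do and slow a panicked operator down by requiring an exact retype of the resource name:

```typescript
import * as readline from "node:readline/promises";
import { stdin, stdout } from "node:process";

async function confirmDestroy(instanceName: string): Promise<boolean> {
  const rl = readline.createInterface({ input: stdin, output: stdout });
  // Plain, unambiguous copy: what happens, to what, and that it is final.
  console.log(`This will PERMANENTLY DELETE instance "${instanceName}".`);
  console.log("There is no undo. Backups are not affected.");
  const answer = await rl.question("Type the instance name to confirm: ");
  rl.close();
  // Requiring an exact retype catches "wrong instance" mistakes that a
  // stressed operator clicking through a generic OK dialog would miss.
  return answer.trim() === instanceName;
}

confirmDestroy("prod-db-1").then((ok) =>
  console.log(ok ? "Confirmed. Deleting..." : "Aborted: name did not match."),
);
```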

Jeremy Jung 00:40:24 And for someone who is working on these products, it’s kind of hard to trigger that feeling of crisis if there isn’t actually a crisis happening. So I wonder if you can talk a little bit about how you try to design for that when it’s not really happening to you. You’re just trying to imagine what it would feel like.

Jonathan Shariat 00:40:45 Yeah, you're never fully going to be able to, so some of it has to be simulated. One of the things we're able to simulate is what we call cognitive load, which is one of the things that happens during a crisis, but which also happens when someone's very distracted. They might be using your product while multitasking, with a toddler constantly pulling on their arm while they're trying to get something done in your app. One method that has been shown to help test for that is the foot-tapping method. When you're doing user research, you give the user a second, manageable task to do on the side, like tapping their feet and hands, while they also work on finishing the task you've given them, and you can build up those extra things they have to do.

Jonathan Shariat 00:41:34 That's one way to simulate cognitive load. Another is really just listening to users' stories and finding the ones where, okay, this user was in crisis. Great, let's talk to them and interview them about it, if it was fairly recent, say within the past six months. But sometimes you just have to run through it and do your best with those black swan events. Even if you could simulate it yourself, put yourself into the exact position and panic, which you can't really do, that would still only be your experience, and you wouldn't know all the different ways other people could experience it. So at some point you're going to need to extrapolate a little, from what you know to be true and from user testing and things like that, and then wait for real data.

Jeremy Jung 00:42:25 You have a chapter in the book on design that angers, and there were a lot of examples in there of things that are just annoying or make you upset while you're using software. For our audience, I wonder if you could share a few of your favorites, the ones that really stand out.

Jonathan Shariat 00:42:44 My favorite one is Clippy, because I remember growing up writing documents and having Clippy pop up. And obviously, just like everybody else, I hated it. As a little character it was fun, but when you were actually trying to get some work done, it was very annoying. And then I remember, a while later, reading an article about how much work the teams had put into Clippy. If you think about it now, it involved a lot of what we're playing with in AI just now: natural language processing, understanding what type of thing you're writing, coming up with contextualized responses. It was pretty advanced, very advanced for the time, plus adding animation triggers to all of that. And they had done a lot of user research.

Jonathan Shariat 00:43:29 I was like: what, you did research, and you still got that reaction? And I love that example. Oh, and by the way, I love how they took Clippy out and highlighted that as one of the features of the next version of the Office software. But I love that example because I see myself in it. Here you have a team doing something technologically amazing, doing user research, and putting out a very polished product, but totally missing the point. A lot of products do that; a lot of teams do that. And why is that? Because they're putting the business's needs, or the team's needs, first and the user's needs second. And whenever we do that, whenever we put ourselves first, we become a jerk, right? If you're in a relationship and you're always putting yourself first, that relationship is not going to last long, or it's not going to go very well.

Jonathan Shariat 00:44:27 And yet we do that in our relationship with users, where we're constantly going: okay, what does the business want? The business wants users to not cancel here, so let's make it very difficult for people to cancel. And that's a great way to lose customers; that's a great way to create dissonance with your users. Whereas if you're focused on what we need to accomplish for the users, and you work backwards from there, you'll lower your chances of missing it, of getting it wrong, of angering your users. And sometimes you just have to be very real with yourselves and your team. I think that's really hard for a lot of teams, because we don't want to look bad. But what I've found is that those are the people who actually get promoted. If you look at the managers and directors, those are the people who can be brutally honest, right?

Jonathan Shariat 00:45:14 Who can say: I don't think this is ready; I don't think this is good. I've done that in front of our CEO and the like, and I've always had really good responses, along the lines of: we really appreciate that you can call that out. You can just call it like it is: hey, given what we're seeing from users, maybe we shouldn't do this at all. At Google, that's one of the criteria we have for our software engineers and our designers: being able to spot the things we should stop doing. And I think that's really important to the development of a senior engineer, being able to say: hey, I would want this project to work, but in its current form it's not good. Being able to call that out is very important.

Jeremy Jung 00:45:55 Do you have any specific examples where there was something that was like very obvious to you, but to the rest of the team or to a lot of other people, it wasn’t?

Jonathan Shariat 00:46:04 Yeah, here's an example. It was early in my career, and I finally got to lead a whole project: we were redesigning our business microsite. I was assigned two engineers and another designer, and I got to lead the whole thing. I was like: this is my chance, right? And we had a very short timeline as well. I put together all these designs, and one of the things we'd aligned on at the time as really cool (give me a little bit of leeway here, because this was like 10 years ago) was this design for the contact form that was essentially a kind of ad-lib. It looked like a letter, and you'd be addressing it to our company.

Jonathan Shariat 00:46:49 And it asked for all the things we wanted to get out of you, like your company size and your team, so that our sales team could then reach out to you. I designed it and showed it to the team, and everybody loved it. My manager signed off on it; all the engineers signed off on it. Even though we had a short timeline, they were like: we don't care, that's so cool, we're going to build it. But as I put it through that test of, does this make sense for what the user wants? The answer just kept coming back no. So I had to go back in, pitch everybody, and argue with them about not doing the cool idea I had wanted to do. Eventually they came around, and once we launched, that form performed really well.

Jonathan Shariat 00:47:31 And I think about: what if users had had to go through that really wonky thing? The whole point of the website was to get people to this contact form; it should be as easy and as straightforward as possible. So I'm really glad we did that. And I can think of many, many more situations where we had to be brutally honest with ourselves: this isn't where it needs to be, or this isn't what we should be doing. We can avoid a lot of harm that way too, by saying: I don't think this is what we should be building right now.

Jeremy Jung 00:47:59 So in the case of this form, was it that you had a bunch of dropdowns or selections where you'd said, okay, these are the types of information I want to get from the person filling out the form, as a company, but you weren't looking so much at how annoying it would be for the person actually filling it out? Was that kind of it?

Jonathan Shariat 00:48:19 Yeah, exactly. Their experience would've been: they come to the bottom of the page, or to Contact Us, and it's a letter to our company. We were essentially putting words in their mouth, because they were filling out the letter, and you have to read it and then understand what each part of the page is asking you. Versus a form, which is very easy and well known: bam, you're on this page, you're interested, so let's get you in there. So we decided against the letter. We also had to say no to a few other things, and we said yes to some things that were great, like responsive design: making sure our website worked in every single use case, which was not a hard requirement at the time, but was really important to us. It ended up helping us a lot, because we had a lot of business people who were on their phones, on the go, who wanted to check in, fill out the form, do a bunch of other things, and learn about us.

Jonathan Shariat 00:49:10 So that sales microsite did really well, because I think we made the right decisions in all those areas, and those general principles helped us say no to the right things. Even though the letter was a really cool thing, and it probably would've looked great in my portfolio for a while, it just wasn't the right thing to do for the goal we had.

Jeremy Jung 00:49:29 So did it end up being more like just a text box, you know, a standard contact-us fill-in?

Jonathan Shariat 00:49:34 Yeah. With usability, if something is familiar, it's tired; everybody does it. But that means everybody knows how to use it. So usability constantly has that problem: innovation is less usable. Sometimes the trade-off is worth it, because you want to attract people with the innovation, and they'll get over that hump with you because the innovation is interesting. So sometimes it's worth it; I'd say most times it's not. You have to figure out when it's time to innovate and when it's time to do what's tried and true, and on a business microsite, I think it's time to do what's tried and true.

Jeremy Jung 00:50:14 So in your research for the book and all the jobs you’ve worked previously, are there certain mistakes or just UX things that you’ve noticed that you think that our audience should know about?

Jonathan Shariat 00:50:29 I think "dark patterns" are one of the most common tragic design mistakes that we see, because again, you're putting the company first and your user second. If you go to darkpatterns.org, you can see a great list; there are a few other sites that have nice lists of them, and Vox Media did a nice video about dark patterns as well. So the idea is gaining a lot of traction. But, you know, things like trying to cancel your Comcast service or your Amazon service: it's very hard. I think I wrote this in the book: I researched the fastest way to remove your Comcast account. I prepared everything, and I did it through chat, because that was the fastest way. And not to mention, finding chat was very, very hard for me, even though I went in knowing: okay, I have to find chat, I'm going to do it through chat.

Jonathan Shariat 00:51:18 It took me a while to even find the chat. And once I finally found it, from that point to having them actually delete my account was about an hour, and I knew what to do going in, knew exactly what to say to keep them from bothering me. That's on purpose. They've made it that way on purpose, because it's easier to just say: fine, I'll take the discount you're throwing in my face at the last second. It's almost become a joke now that you have to cancel your Comcast every year to keep the cost down. And Amazon too: the option to delete your account is so buried, and they do that on purpose. And a lot of companies will do things like make it very easy to sign up for a free trial while hiding the fact that they're going to charge you for a year, hiding the fact that they're automatically going to bill you,

Jonathan Shariat 00:52:05 and not reminding you when it's about to expire, so they can catch you having forgotten about the billing subscription. Or, if you've ever gotten Adobe software, they're really bad with that. They trick you into what looks like a monthly subscription, but you've actually committed to a year, and if you want to cancel early, they'll charge you something like 80% of the year, and it's really hard to contact anybody about it. So it happens quite often. The more you read about those different patterns, the more you'll start to see them everywhere. And users are really catching on to a lot of these things and responding to them in a very negative way. We recently looked at a case study where a company had a free trial with the standard free-trial kind of design, and their test variant really just focused on: hey, we're not going to scam you.

Jonathan Shariat 00:52:55 If I had to summarize the entire direction of the second version, it was: cancel any time; here's exactly how much you'll be charged, and it'll be on this date; and five days before that, we'll remind you to cancel; all that stuff. It ended up performing about 30% better than the other one. And the reason is that people have been burned by that trick so often that every time they see a free trial, they go: forget it, I don't want to deal with all this trickery; I don't even care enough to try the product. Versus: hey, we're not going to trick you; we really want you to actually try the product, and we'll make sure that if you don't want to move forward, you have plenty of time and plenty of chances. People respond to that now. It's what we talked about earlier in the show, doing the exact opposite. This is another example of that.
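
A minimal sketch of the mechanics behind that transparent variant (the field names and the five-day window are assumptions for illustration, not the case study's actual design):

```typescript
interface Trial {
  startedAt: Date;
  lengthDays: number;
  priceCents: number;
}

// The date the first charge lands: trial start plus trial length.
function billingDate(t: Trial): Date {
  const d = new Date(t.startedAt);
  d.setDate(d.getDate() + t.lengthDays);
  return d;
}

// Send the heads-up a few days before billing instead of staying quiet.
function reminderDate(t: Trial, daysBefore = 5): Date {
  const d = billingDate(t);
  d.setDate(d.getDate() - daysBefore);
  return d;
}

const trial: Trial = {
  startedAt: new Date(2022, 6, 1), // July 1, 2022 (months are 0-indexed)
  lengthDays: 30,
  priceCents: 9900,
};

console.log(
  `You'll be charged $${(trial.priceCents / 100).toFixed(2)} on ` +
    `${billingDate(trial).toDateString()}. We'll email you on ` +
    `${reminderDate(trial).toDateString()} so you can cancel first.`,
);
```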

Jeremy Jung 00:53:41 Yeah, because I think a lot of people are familiar with, like you said, trying to cancel Comcast, or trying to cancel their New York Times subscription, and everybody gets so mad at the process. But I think they also maybe assume that it's a positive for the company, and what you're saying is that it's actually maybe not in the company's best interest.

Jonathan Shariat 00:54:03 Yeah. Oftentimes what we find with these dark patterns, or these unethical decisions, is that they look successful, because when you look at the most immediate metric you can track, it looks like they worked. Say, for those free trials: okay, we implemented all this trickery, and our subscriptions went up. But the end result is farther along in the process, and it's always a lot harder to track that impact. We all know it to be true, though: when we talk to each other about these examples, we all hate that, we all hate those companies, and we don't want to engage with them. Sometimes we don't use the products at all. So it has a very real impact, just a harder one to track. And oftentimes that's how these patterns become so pervasive: page views went up, this is high engagement. But it was page views because people were refreshing the page trying to figure out where the heck to go, right? So oftentimes these patterns are less effective, but they're easier to track.

Jeremy Jung 00:55:08 So I think that’s a good place to wrap things up. But if people want to check out the book or learn more about what you’re working on, your podcast, where should they head?

Jonathan Shariat 00:55:18 Just check out tragicdesign.com, and you can find our podcast on any podcasting software; just search for the Design Review podcast.

Jeremy Jung 00:55:27 Jonathan, thank you so much for joining me on Software Engineering Radio.

Jonathan Shariat 00:55:30 All right. Thanks Jeremy. Thanks everyone. Hope you had a good time. I did.

Jeremy Jung 00:55:34 This has been Jeremy Jung for Software Engineering Radio. Thanks for listening.

[End of Audio]


SE Radio theme: “Broken Reality” by Kevin MacLeod (incompetech.com — Licensed under Creative Commons: By Attribution 3.0)
