Episode 13 – Conversational Maxims

Many conversational design principles are based on the idea of the cooperative principle, which states that people work together cooperatively in conversation. The principle was defined by Paul Grice, and it is supported by four conversational maxims, with a fifth added by linguist Robin Lakoff. These simple maxims form the foundation for much of voice design. Erika Hall joins the show to give a detailed look at these maxims, the Maxims of Quality, Quantity, Relation, Manner, and Politeness, and you’ll learn how to apply these ideas to your conversational interfaces and experiences.

Guest – Erika Hall

Erika Hall is the co-founder and Director of Strategy at Mule. She has advocated for the importance of evidence-based design and strong language since the late 20th century. This represents decades of fisticuffs.

Even though writing is ever so much harder than talking, she was driven to produce Just Enough Research and Conversational Design, both from A Book Apart. Erika loves helping people overcome the often invisible organizational barriers to doing good work.

Transcript

Jeremy Wilken
Welcome to this episode of Design for Voice. I’m your host, Jeremy Wilken, and today we’re going to be talking about conversational maxims. These are some rules that we can apply to how we build out our conversational experiences, so that they feel more complete, not out of place, and more in touch with reality. I’m joined today by Erika Hall, who is co-founder and director of strategy at Mule Design. Welcome to the show, Erika.

Erika Hall
Hi, it’s great to be here talking with you.

Jeremy Wilken
Why don’t you give us a little bit of your background and your experience here in the conversation space?

Erika Hall
Wow. So I’ve been doing design consulting, I’d say, for most of the 21st century at this point. And this has involved just about all aspects of, you know, research, strategy, and front-end design. And I got particularly interested in language because it seemed like designers were really neglecting that aspect of it. So it’s not like I came out of content strategy specifically, or writing specifically; it’s that my approach in design and in working with clients has always been to try to find the areas that I feel other designers or technologists are neglecting and really work on that area.

Jeremy Wilken
Awesome. And I think, as I read your book and tried to figure out more about the conversational principles, it applies to voice, it applies to UI and web design, it can apply to print design. It cuts across all of design, trying to be more conversational, not just verbally and vocally but also in the way that things are presented on the screen and the specific terms and words that are used, even on a web page.

Erika Hall
Yeah, absolutely. And I first started talking about this stuff way back in 2007. I did my first conference talk, it was called Copy as Interface. And it was about the fact that the language, even just the words, on web pages, because at that point the web was a whole different place, you know, far less interactive and more constrained, that the language choices that you made were key design choices, for brand, for interaction design. And a lot of designers were kind of neglecting this very powerful mode, because they thought, oh, design is the part with the pictures and behaviors, and the words, we’ll just fill those in somehow. And so that’s when I really first started trying to bring this to the attention of people working in, you know, design and software: that language is not the last thing you fill in before you launch some software service or something.

Jeremy Wilken
And to that point, I think today we want to talk about the four maxims from Grice, and Lakoff has added one extra. So there’s five total that give us some sense and place to start from when we want to start building out these conversations, these terms and words and things that we put on the screen or verbalize or bring to the user in whatever format that is. So before we start talking about all of the maxims, let’s step back and just ask: what is a maxim? What do we learn from it?

Erika Hall
So there is this British guy, Paul Grice, and he was a philosopher of language. He worked in the area of linguistics known as pragmatics, which is about the things that carry meaning outside of just, you know, the words themselves, like what aspects of the content of your communication convey meaning. And way back in 1967, he wrote an essay in which he articulated this principle of cooperation. And then he articulated four maxims, which are essentially the principles that you follow in order to be cooperative, because his fundamental idea was that conversation between or among people requires cooperation. Like, everybody in the conversation kind of has to agree that, okay, we’re in a conversation, we’re proceeding towards this goal. And then he said, well, in order to cooperate, you have to follow these principles of quantity, quality, relation, and manner.

Jeremy Wilken
So with these four ideas, quantity, quality, relation, and manner, we’ll dive into each of those a little more deeply. But are these applied just to human-to-human conversation, or are they also something we can apply to both humans and computers?

Erika Hall
I think you can definitely apply them to both humans and computers, because the idea was that, you know, we all have a sense of when a conversation is going well versus when it’s going badly. The example I always use is asking for directions, because that’s, I’d say, one of the most common cases in which people talk to random strangers on the street. You know, if you’re in a different neighborhood, or you’re a tourist, a lot of times you have to ask a stranger for directions, and you don’t have to establish some sort of treaty of communication, right? There are these underlying principles. And that was Grice’s idea, that there are these things that we all follow to a greater or lesser degree in order to make the conversation work and not get very, very frustrated and have miscommunication. And if you’re interacting with a system, that sort of interaction, to a large extent, and to an extent that I think some people ignore, is based in verbal meaning. And so to the extent that it’s an exchange of verbal meaning between you and a system, which is really what a conversation is, right? You can’t know the mind of another person, just in the same way that you don’t totally intuit what’s behind the system that an interface is helping you interact with. It’s a lot of the same principles. The application isn’t exactly the same, because there are a lot of ways that interacting with a piece of software, a digital system, an intelligent agent, whatever, differs from a person. So it’s not exactly the same, but the same principles, I would say, apply to the extent that the interaction is, you know, verbal and conversational.

Jeremy Wilken
Alright, so let’s take it from the example perspective. Quantity, let’s take a look at that one first; maybe that will help flesh out a bit more detail around what these mean. So quantity, how would you describe that as one of the maxims?

Erika Hall
I’d say that’s the principle of supplying just enough information, and not too much. So it’s enough, but not too much information. Because, you know, if you’re holding back necessary information, we all have this experience: you’ve asked somebody a question during a conversation, and you feel like you really have to dig for the information. Or you’re talking to somebody and they’re going on and on and giving you way too much information. Those are both kind of broken conversations. But you see a conversation flow really well when people are, you know, going back and forth and giving each other just the right amount at that time.

Jeremy Wilken
I see this a lot in the personal assistant space, where you need to give people enough information, but you also don’t want to overwhelm them, especially with the somewhat artificial voices; people pick that up, and it can grate a bit when it goes on and on too long. But it’s also bad when it just says something and people don’t understand it. So this is a challenging place. And I think what’s also important is that just enough information might be different in this moment than maybe in another moment. For example, if it’s your first experience with a voice app or with a website, you might need a bit more information than you do on a repeat visit.
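To make the maxim of quantity concrete, one simple pattern is to branch the prompt on whether this is a first or a repeat visit, so newcomers get the fuller explanation and returning users get the short version. Here is a minimal TypeScript sketch; the names and prompts are hypothetical and not tied to any particular voice platform SDK.

```typescript
// Maxim of quantity: say enough for a newcomer, less for a returning user.
// Hypothetical names; not tied to any particular voice platform SDK.
interface SessionInfo {
  visitCount: number; // how many times this user has opened the experience
}

function welcomePrompt(session: SessionInfo): string {
  if (session.visitCount <= 1) {
    // First visit: explain what the app can do and how to ask for it.
    return (
      "Welcome to Daily Brew. I can find nearby coffee shops, " +
      "check their hours, and start a pickup order. " +
      "Try saying: find coffee near me."
    );
  }
  // Returning visit: skip the tutorial and get to the point.
  return "Welcome back. What can I get started for you?";
}

// Example usage
console.log(welcomePrompt({ visitCount: 1 })); // verbose onboarding
console.log(welcomePrompt({ visitCount: 5 })); // brief re-greeting
```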

Erika Hall
Yeah, I think the thing that’s easy to forget about conversation, because people, developers especially, really focus on the technology, is that the most important part of having a conversation is that you’re present in a particular context. And that context could be, you know, what level of skill you have in a language, how easy it is to hear, how technical the subject matter is, you know, our memories as humans. Computers have amazing memories, right? So I think that as much as possible, you need the machine to be doing all the remembering. And the problem is, if you give the human too much information and require them to remember too much, that’s a huge failure point.

Jeremy Wilken
Yeah, people’s short-term memories. I mean, I have kids, so I don’t remember anything. But if I’m told to work through a problem, what’s the number of digits you can remember? It’s like seven, plus or minus two. There are these little guidelines, and a lot of people have heard these kinds of things, but they don’t necessarily carry them through. And also tailor, as you said, to the context. I think that’s a really important aspect of it. The system only understands what’s said on every turn, and unless somebody stores context of what came before it, it’s just an isolated statement. Imagine you just get a sound bite randomly brought to you: what news story is it about, how does it relate to anything else? So carrying that information through is really, really important, especially as the scope of the system increases beyond a certain set of tasks.

Erika Hall
Yeah, and I think one of the issues that people are really having is that the home assistants especially, the voice assistants, are just losing context. They can’t retain anything. You might tell Alexa something and then say, oh, another thing about that, but you always have to go back to square one. And it’s also true on the web, when you navigate between pages, maybe there are no breadcrumbs, or there’s no context about where this piece of information is coming from when I thought I was clicking on this link. There’s a disconnect sometimes, right? Or you could just have 10 tabs open. That’s always a problem, but might not always be the website’s fault.
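One way to keep a follow-up like “another thing about that” from landing back at square one is to carry a small slice of dialog state from turn to turn. The sketch below is only illustrative, with made-up state fields and a toy intent matcher; real assistants handle this through their own session or context mechanisms.

```typescript
// Carry context between turns so follow-ups don't start from scratch.
// Illustrative only; real platforms store this in session attributes or contexts.
interface DialogState {
  lastTopic?: string;  // e.g. "grocery list"
  lastItems: string[]; // items mentioned so far
}

function handleTurn(
  utterance: string,
  state: DialogState
): { reply: string; state: DialogState } {
  // Toy pattern: "add X to my grocery list" starts or extends a topic.
  const addMatch = utterance.match(/add (.+) to my grocery list/i);
  if (addMatch) {
    const next = { lastTopic: "grocery list", lastItems: [...state.lastItems, addMatch[1]] };
    return { reply: `Okay, ${addMatch[1]} is on your grocery list.`, state: next };
  }

  // A follow-up like "also add bananas" only works because we remembered the topic.
  const alsoMatch = utterance.match(/also add (.+)/i);
  if (alsoMatch && state.lastTopic === "grocery list") {
    const next = { ...state, lastItems: [...state.lastItems, alsoMatch[1]] };
    return { reply: `Got it, ${alsoMatch[1]} too. That's ${next.lastItems.length} items.`, state: next };
  }

  return { reply: "Sorry, I lost track. What would you like to do?", state };
}

// The second turn succeeds only because state from the first turn is passed back in.
let state: DialogState = { lastItems: [] };
({ state } = handleTurn("add milk to my grocery list", state));
console.log(handleTurn("also add bananas", state).reply);
```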

Jeremy Wilken
All right. Well, let’s move on to the next one, which is quality. So how would you summarize quality?

Erika Hall
It’s about being truthful, right? It’s the quality of the information. So it’s not just that you give the right amount, but you have to be, you know, representing your authentic agenda, you have to be transparent. You have to be building trust. And I think it’s about being really, really clear about your motivations.

Jeremy Wilken
Interesting. So the motivations are really important. You know, if I’m trying to sell something, how do I balance the truth about, right, engaging with you properly? You know, do I get a commission, or whatever? It’s like if you go into a car dealership, sometimes they’ll tell you, we’re not working on commission, we’re working on a flat-rate sale, or whatever the case may be. And maybe that is useful in providing that trust and saying, okay, they have my best interest in mind as well.

Erika Hall
Yeah, and I think a really common example of this is, say, in search. A search interaction is a pretty conversational interaction, whether you’re typing into a text box or, you know, talking to Google aloud. If you are looking for a really, really good match, and some of those matches are sponsored, that has to be represented up front. If you ask something like, oh, where’s the nearest coffee place? You want to know if you’re really getting the nearest coffee shop, or if you’re getting the top sponsored result. And so if those sponsored results, no matter how they’re communicated in response to a question or prompt, are labeled as such, or it’s like, oh, hey, here’s a sponsored recommendation, then you’re adhering to the maxim of quality.
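Applying the maxim of quality here can be as simple as carrying an explicit sponsored flag on each result and disclosing it in the spoken or rendered response. A small TypeScript sketch follows; the data shape is hypothetical, and real search or ad systems mark sponsorship differently.

```typescript
// Maxim of quality: be transparent about why a result is being shown.
// Hypothetical data shape; real ad/search systems differ.
interface PlaceResult {
  name: string;
  distanceMeters: number;
  sponsored: boolean;
}

function speakResults(results: PlaceResult[]): string {
  return results
    .map((r) => {
      const distance = `${Math.round(r.distanceMeters)} meters away`;
      // Disclose sponsorship up front instead of burying or omitting it.
      return r.sponsored
        ? `A sponsored suggestion: ${r.name}, ${distance}.`
        : `${r.name}, ${distance}.`;
    })
    .join(" ");
}

console.log(
  speakResults([
    { name: "Bean Scene", distanceMeters: 120, sponsored: false },
    { name: "MegaCoffee", distanceMeters: 400, sponsored: true },
  ])
);
```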

Jeremy Wilken
Excellent, that makes a lot of sense. It’s not just about not lying and saying something that’s inaccurate. It’s also about proactively providing some additional information to make that experience more clear: okay, this is a sponsored listing, or we recommend these three, but this one has the most reviews or works with us, whatever the case may be. So we have quantity, which is just enough information; quality, which is being truthful. And then the next one was relation.

Erika Hall
Yeah. And the maxim of relation is about being relevant, and not digressing and going off and talking about other things. And this is really about understanding what is most relevant to a particular conversation. That is the place where I think humans are often much, much better than computers. That sort of associative learning and unstructured problem solving, those are things that are really easy for people. You can say, oh, because I understand your context, the context behind a request or a question or behind your contribution, I can use my social intelligence and powers of associative thinking to provide a good response to that. Whereas that is very, very difficult for any sort of, you know, computer system.

Jeremy Wilken
Yeah, I think what I’d call the core platforms, like Google Assistant or Alexa, do the best at this, as far as trying to maintain relevancy, because they’re just such a broad thing. But when it comes down to a small set of things, a lot of the voice actions or websites have sort of a one-track purpose. And so it’s hard to step out of that and say, hey, we’re not the thing for you, or to redirect them to something else. I think if I go to a CPA or some specialist, or a lawyer, and they look at whatever paperwork and they say, actually, I’m not the right person for you, that’s being very relevant. And I guess also truthful about the relationship that we have, and making sure that we move in the right direction, even if it’s not necessarily what they do.

Erika Hall
Yeah, and I think this is the place where things are really going to start breaking down once more and more interactions go to voice, because you see what’s happened on the web, where websites are violating the maxim of relation with all of the ads and calls to action. The one everybody’s talking about now is the newsletter signups that just pop up immediately, right? Because you didn’t ask for that; you made a request. If you’re using the web, you’re just requesting pages, and instead of giving you what’s most relevant, the system is handing you the thing that helps them meet their business goals, where you’re like, oh, I was looking up movie times, but instead I got a newsletter pop-up, and I got an ad pop-up, and I got something totally unrelated. And so you can see how frustrating that is. And I think, once we’re doing more using voice, the sponsorship and things are going to creep more into voice. And I’m really afraid that there’s going to be a lot of irrelevant information presented, because it’s relevant to the sponsors to give you that information.

Jeremy Wilken
Yeah, the monetization strategy in voice is still ethereal in lots of cases. There are transactions, there’s some basic stuff, but it doesn’t scale for a lot of things. And that’s why, essentially, it’s free to develop these apps, because there’s not a lot of money in them at the moment. So it’ll be interesting to see, and I’m curious to see what will happen with the voice space, but I know people need to be thinking about this: how do I relate properly to my users, and avoid over-extending the context that was given to me and trying to abuse that power?

Erika Hall
Yeah, because you can imagine what this is like. If you ask somebody for directions, and all of a sudden they try to start selling you something, you’d be incredibly annoyed. And even though most people aren’t, as lay people, familiar with these underlying principles, you’d have a sense of like, wait a second, that’s not cool. It’s not cool that I asked somebody for directions, and all of a sudden they tried to sell me something. And yet, when we’re using the web, that’s just sort of our condition of interaction. It’s like, oh, I’m going to try to read a newspaper, but the newspaper is going to try to sell me something and also interfere with my quest for information.

Jeremy Wilken
All right, let’s move to the last one, which is manner. So how would you summarize manner?

Erika Hall
Well, manner, manner is kind of a big one. I think Grice packed a lot into this one; I don’t know if he was just going for four. Because it’s really, as I said, to be brief, orderly, and unambiguous, which is, you know, just get to the point and communicate in an organized sort of way. Like, don’t meander. Make sure that you lead with the right piece of information, because you might give somebody the right amount of information but give it to them in the wrong order, and that’s not helpful. It’s like when somebody is a really bad storyteller, and they’re telling you a story, and they’re like, oh no, I meant to say this other thing before that, and it really messes up your processing and understanding. And so it means, if you’re having a conversation, you know, with a human or a system, after you say something you want the response to lead with the first, most important thing you need in order to understand the rest of what follows. And you really need it to be brief, so that you can then take your turn, because, you know, conversation is a matter of each party taking their turn. And if you say something and the other person just goes on and on and on, it’s not really a conversation anymore.
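One way to keep a response orderly and brief is to assemble it from parts in a fixed priority: the direct answer first, one optional supporting detail, then a hand-back of the turn, trimming from the end if it runs long. The sketch below is a generic illustration of that idea, not any platform’s API.

```typescript
// Maxim of manner: lead with the answer, keep it brief, then yield the turn.
// Generic illustration, not tied to a specific assistant framework.
interface ResponseParts {
  answer: string;    // the one thing the user asked for
  detail?: string;   // a single supporting fact, optional
  followUp?: string; // hand the turn back to the user
}

function composeUtterance(parts: ResponseParts, maxWords = 30): string {
  const ordered = [parts.answer, parts.detail, parts.followUp].filter(Boolean) as string[];
  const utterance = ordered.join(" ");
  const words = utterance.split(/\s+/);
  // If we're running long, drop the detail rather than the answer itself.
  return words.length <= maxWords
    ? utterance
    : [parts.answer, parts.followUp].filter(Boolean).join(" ");
}

console.log(
  composeUtterance({
    answer: "The next 38 bus leaves at 5:42 pm.",
    detail: "After that, buses run every 15 minutes until midnight.",
    followUp: "Want me to set a reminder?",
  })
);
```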

Jeremy Wilken
Right. So we want to be brief, to the point, and very clear about what we’re sharing, because if we don’t, we risk losing their attention, or offending them, or frustrating them and preventing them from getting to their turn, whatever their goal might be. And these are all ways of, you know, being a good conversationalist, right? And then the last one was about politeness, which wasn’t part of Grice’s original four, but you’ve added it in your book, and I think it’s a really important piece to add in. So what do we have to say about politeness in this conversation?

Erika Hall
Yeah, so a few years later, in America, in 1973, a linguist at UC Berkeley, Robin Lakoff, wrote a paper called The Logic of Politeness. And she said that the principle of politeness is that you need to not impose yourself in a conversation, you need to give options, and you also need to make the listener feel good. And so I think this is about moving beyond what makes a conversation purely functional, to actually feeling respectful.

Jeremy Wilken
This seems like good citizenship, if you’re talking about how do I converse with somebody without, you know, trying to hurt their feelings in the process. Unless your whole persona is to be rude, or, you know, there are a few rare cases like that, which I think are not very common, and I’m not a big fan of them myself. But typically, if you offend your user, you’ve lost a potential customer or user, whatever the case.

Erika Hall
Yeah. And I think one of the most common examples of seeing politeness embodied, or not embodied, in interactive systems is in error messages, right? There are so many, and it’s gotten better. But a lot of times, because by the time it got to, you know, system-level error messages or problems with interaction, those cases used to be designed less well, you’d get an error message that made you, the user, feel stupid, right? It was like, oh my God, I’m giving you the information, but I’m giving it to you in the wrong way; you were supposed to fill out this form differently, right? And so instead of the system helping you do the right thing, there are so many cases where the system is actually chastising you for doing the wrong thing, and then you feel terrible. And in many cases, if you’re dealing with somebody who isn’t as familiar with that particular system, they just feel like, oh, I should have done something right, as opposed to being angry at the designers.
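A concrete place to apply this politeness principle is form validation copy: name what the system needs, show the expected format, and give an example, instead of blaming the user. The messages in this sketch are illustrative only.

```typescript
// Politeness in error handling: explain what's needed and how to fix it,
// rather than scolding the user. Copy here is illustrative only.
function expiryDateError(input: string): string | null {
  const valid = /^(0[1-9]|1[0-2])\/\d{2}$/.test(input); // expects MM/YY
  if (valid) return null;

  // Impolite version (what to avoid): "Invalid input! Expiration date is REQUIRED."
  // Polite version: name the field, show the expected format, offer an example.
  return "We couldn't read that expiration date. Please enter it as MM/YY, for example 08/27.";
}

console.log(expiryDateError("13/29")); // helpful correction
console.log(expiryDateError("08/27")); // null: nothing to report
```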

Jeremy Wilken
Yeah, and I’ve seen research where people blame themselves more often, or at least often, I don’t remember the percentage, but often they blame themselves for system errors that were presented to them. And you know, I’ve seen error messages that put things with an exclamation mark at the end, like “This is required!”, and you feel like, oh, it’s yelling at me. It may be subtle, and people don’t even rationalize in their head what’s happening, but there’s some underlying gut feeling that can come from that, especially if it’s while you’re checking out. It’s like, I typed my credit card in the right way. You know, it’s negative feelings, for sure.

Erika Hall
It’s a bad feeling. You should always try to make somebody feel better for interacting with your system, not stupid, or sad.

Jeremy Wilken
So before we wrap it up, I kind of want to reframe the question a little bit. Are there cases where we don’t want to follow one of these maxims, or places where they break down, where we can basically flip it over and say this doesn’t apply in this case? Or is it always universal?

Erika Hall
Well, I’d say in general you want to follow these maxims, right? You always want to give people the right amount of information, and be truthful, and be relevant, all these things. The only time not to do that is, you know, for humor purposes, but you have to be really intentional. If you’re creating any sort of system where you’re really trying to help somebody perform some sort of task, even if it’s a pretty trivial task, you want to be on their side. You want them to feel like, oh, this person I’m interacting with, like if you’re in a store and you’re trying to get some help, or if you’re talking to an assistant, you know, a speaker or something in your house, you don’t want to feel like it’s making fun of you. So there are times, especially if you look at humor, a lot of Monty Python’s humor was based on violating the maxims, you know, people who take things too literally, or who go on at absurd length, or, you know, the Soup Nazi from Seinfeld, or whatever, where you go in a restaurant and you’re like, oh, this waitstaff is being charmingly rude to me, because that’s how they authentically behave, and that’s their style. But these things take so much skill to do well, and so much shared relationship and shared context around it, that I would say follow these. There’s not a reason to be rude, or to lie, or to not get to the point. If you’re doing it for effect, you have to be really, really sure that what you’re doing is context appropriate.

Jeremy Wilken
And I suppose there are also cultural issues, language barriers. Does it translate to the culture, to the language? You know, I’ve worked with people from different languages and backgrounds, and things are not always as clear between one another, even if it’s a little bit of sarcasm, and you have to be very clear about that or explain it, which then sort of destroys the purpose of it. So you kind of give up on trying to use those things, which basically means don’t violate that maxim.

Erika Hall
Right, because I think a big part of it is that the sensors in these systems can’t pick up all the nonverbal cues, right? If you see a friend walking towards you, and they look upset, or they look rushed, or they look very anxious, you’re not going to make a joke at their expense. But if you have a speaker that can’t, well, you know, more and more of these, like my Nest, have a camera or something now, but computers can’t read all those nonverbal cues to understand somebody’s emotional state, and there might not be this pre-existing relationship. So you can’t really play with those things, because you’re not sensing enough of the whole picture of that other person’s mental state to be really context aware.

Jeremy Wilken
Awesome. Alright. So this is the final wrap-up here. I like to call this the endpoint detection part of the show. So with that, can we recap: what is a top takeaway from the show?

Erika Hall
Well, I’d say it’s the fact that all these maxims are just a breakdown of the cooperative principle. And if you take one thing away from this in the design of these systems, it’s the question: does what I’m designing really feel like it’s cooperating with the user? Or does it feel like the user has to put in a lot of work to hold up the conversation and keep it going? The conversational maxims themselves are details, but you really have to ask, did I design this in such a way that people are going to feel like, oh, this system is on my side? And I think if you look at the places where Siri or Alexa or Google Home break down, it’s because people feel like, wait a second, it’s just making things harder on me, or it’s just making me feel less capable than I actually am. And that’s a real breakdown. And in many of those cases, you know, using a website instead of talking to a system might actually feel more cooperative to people. And in that case, that’s a more conversational interaction, even though you’re not talking.

Jeremy Wilken
I can attest to that. I know people who have had trouble with their voice or accent or something. Simple things, maybe as simple as the weather, don’t tend to work well, and so they just pull out their phone instead. And at that moment, you can see the first attempt failed and the second attempt worked. You know, it wasn’t for them, like you said. The more…

Erika Hall
Collaborative it is, the more conversational it is. And hopefully the happier people are using your experience. Yes, yeah.

Jeremy Wilken
All right. I like to ask three questions of all of my guests, kind of wrapping things up. So what is one interesting voice app or experience you’ve had recently?

Erika Hall
I would say the thing I’m doing the most in terms of voice interface is dictating my texts constantly. Like, if I’m texting people, I’m doing voice to text.

Jeremy Wilken
Yes, voice dictation is becoming quite popular. It’s been quite popular for some time for, I think, a limited set of people. But as more phones are in people’s hands, and, you know, even as phones get bigger, the keyboards are still hard to type on, so a lot of people use it. I think it’s a really good use case, and that can also be an interface to your apps or experiences; we just haven’t necessarily wired up all the pieces yet.

Erika Hall
Yeah, and it’s really good. The speech-to-text is getting really impressive. And I think it’s also really fantastic for accessibility. So I’m really excited, not necessarily about any service that depends on that, but I’m just excited that it’s an option, that given any text field, somebody can speak and that input can be interpreted into characters. And I’m excited for anything that goes the other way too. Because I think the key is, with all of these systems, it doesn’t matter what the interface is; interfaces should be multimodal for people. And what matters is, what is that interaction, what value is that interaction helping you get to through the interface? Because it sort of doesn’t matter what the interface is. It’s about what you get out of it.
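In the browser, the “speak into any text field” input Erika describes can be prototyped with the Web Speech API’s SpeechRecognition interface. Support varies by browser, and it is often only exposed as webkitSpeechRecognition, so treat this TypeScript sketch (with a hypothetical #message element) as an experiment rather than a production pattern.

```typescript
// Dictation into a text field using the browser Web Speech API.
// Support varies by browser; often only available as webkitSpeechRecognition.
const SpeechRecognitionImpl =
  (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;

function attachDictation(field: HTMLInputElement | HTMLTextAreaElement): void {
  if (!SpeechRecognitionImpl) {
    console.warn("Speech recognition is not available in this browser.");
    return;
  }
  const recognition = new SpeechRecognitionImpl();
  recognition.lang = "en-US";
  recognition.interimResults = false; // only take finalized phrases

  recognition.onresult = (event: any) => {
    // Append the recognized transcript to whatever is already in the field.
    const transcript = event.results[event.results.length - 1][0].transcript;
    field.value = (field.value + " " + transcript).trim();
  };

  recognition.start();
}

// Example: start dictation into a text area with id "message" (hypothetical element).
const messageField = document.querySelector<HTMLTextAreaElement>("#message");
if (messageField) attachDictation(messageField);
```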

Jeremy Wilken
For people who want to learn more about design, what resources do you recommend?

Erika Hall
Yeah, I mean, I wrote a book called Conversational Design that is meant to be about the first principles, not about the technology, because I think a lot of designers go to what’s possible with the technology before they really think about, oh, what’s the context, what’s really, really desirable. And, you know, I think at this point it’s such a new field that I’m just always looking to see what people are writing about it, you know, on Medium or on LinkedIn, because it hasn’t been around long enough for there to be a lot of books or a lot of really stable, static resources that people will turn to. There’s no canon of, you know, conversational design at this point, unlike, you know, programming languages or something of that nature, which often have dozens upon dozens of books each; that body of work is quite large. Like programming languages, visual design, there’s so much stuff out there. But right now it’s about finding people literally talking about it and talking to them about it, because people are figuring out new things literally every day.

Jeremy Wilken
All right, and last question, how can people learn more about you and your work?

Erika Hall
All right, well, I’m on Twitter too much. I’m mulegirl on Twitter. I’m on Medium. Our website is muledesign.com. So it’s all things Mule.

Jeremy Wilken
Excellent. I’ll make sure all those are linked on the website for folks, if you want to check it out, at designforvoice.com. And that rounds this out for today. Thank you again for joining me today, Erika. This was, I think, a really good overview of all of the conversational maxims and how they can apply to people. Whether you’re designing or developing, you can learn a lot. And I don’t want to say it’s common sense, because we don’t often stop, step back, and think critically about the conversations we have with people around us. But if we do that, we see these maxims at work, and we probably adhere to them most of the time; we just aren’t always aware of it. And it’s great to be a little bit more critical about that and bring that same thinking to our actual systems and interactive experiences. So thank you again for joining me, and hopefully we’ll have you on again sometime to talk more.

Erika Hall
Yeah, thank you, Jeremy. This was a good conversation.

Jeremy Wilken
Thank you for listening to today’s episode. And if you liked the show, please rate us on your favorite podcast player. The show notes are available on designforvoice.com.