The Product Experience

Conducting user research at scale - Kenton Hansen

September 08, 2021 Episode 133

When we sat down with Roll20.net Product Director Kenton Hansen for this week's podcast, we were hoping to hear tales of how he and his product team (or band of adventurers!) went on quests to conduct user research. We wanted to hear about dodging orcs and dragons (stakeholders) while collecting treasure (insights).

Instead, we learned that it's only Roll20's marketing team that casts spells. But we also learned a lot about conducting research for such a large, passionate community, qualifying candidates, sharing learnings, and much more.

Featured Links: Follow Kenton on LinkedIn and Twitter | Kenton's Website | Roll20 | Patrick Lencioni's teamwork system The 5 Dysfunctions of a Team | Erika Hall's book Just Enough Research 

Give Sprig a try for free by visiting Sprig.com to build better products.

Our Hosts
Lily Smith
enjoys working as a consultant product manager with early-stage and growing startups and as a mentor to other product managers. She’s currently Chief Product Officer at BBC Maestro, and has spent 13 years in the tech industry working with startups in the SaaS and mobile space. She’s worked on a diverse range of products – leading the product teams through discovery, prototyping, testing and delivery. Lily also founded ProductTank Bristol and runs ProductCamp in Bristol and Bath.

Randy Silver is a Leadership & Product Coach and Consultant. He gets teams unstuck, helping you to supercharge your results. Randy's held interim CPO and Leadership roles at scale-ups and SMEs, advised start-ups, and been Head of Product at HSBC and Sainsbury's. He participated in Silicon Valley Product Group's Coaching the Coaches forum, and speaks frequently at conferences and events. You can join one of the communities he runs for CPOs (CPO Circles), Product Managers (Product In the {A}ether) and Product Coaches. He's the author of What Do We Do Now? A Product Manager's Guide to Strategy in the Time of COVID-19. A recovering music journalist and editor, Randy also launched Amazon's music stores in the US & UK.

Randy Silver:

Hey Lily, when we talk about strategy, you know, we always talk about initiatives and mission statements and, you know, we just use a lot of words that are kind of boring. Have you ever wanted to go on a product quest?

Lily Smith:

Oh, that does sound a lot more exciting, but, you know, I'd probably be a bit worried that my team would be raided by orcs or snaffled by dragons. Oh, you're gonna go somewhere else with this?

Randy Silver:

No, you caught me straightaway. But I was not expecting snaffled. You're chatting to Kenton Hansen this week from Roll20.net, and, you know, I got a bit carried away. Can you blame me?

Lily Smith:

Ah, yes, I love playing games. But I also love doing user research and Kenton spills the beans to us on how his team keeps user research in focus and some of the top tips from their work.

Randy Silver:

Okay, well, let's roll for initiative and see what kind of treasure we get from our chat with Kenton.

Lily Smith:

Are you done with the cheese yet, Randy?

Randy Silver:

I'm never done with the cheese, the cheese just keeps coming. I'm really good with cheese and dad jokes. And cheese. That's pretty much all you get from me.

Lily Smith:

The Product Experience is brought to you by Mind the Product.

Randy Silver:

Every week, we talk to the best product people from around the globe about how we can improve our practice and build products that people love.

Lily Smith:

Visit mindtheproduct.com to catch up on past episodes, and to discover an extensive library of great content

Randy Silver:

and videos. Browse for free, or become a Mind the Product member to unlock premium articles, unseen videos, AMAs, roundtables, discounts to our conferences around the world, training opportunities, and more.

Lily Smith:

Mind the Product also offers free ProductTank meetups in more than 200 cities. There's probably one near you. Welcome to The Product Experience, Kenton. It's so great to be talking to you today.

Kenton Hansen:

Thank you so much. I'm really happy to be here.

Lily Smith:

So we're gonna chat today all about user research and how to do it well. But before we get stuck into the topic, could you give us a real quick intro into your origin story as a product person, kind of how you got into it, and also what you're up to today?

Kenton Hansen:

Sure. So it all started for me when I was going through an abandoned factory and I fell into a vat of toxic waste... that may not be the origin story you're looking for. No, I actually got into product inadvertently, which I think is not that uncommon, especially in the stories I've heard. I was working, basically building, you know, people's websites for marketing purposes and getting into the marketing aspects of that. But as programming and scripting, and Ruby on Rails, really helped with this, I got into the technical side, and I'm not a very good developer. So focusing in on the product and trying to define the problem and help people find the solution was much more exciting to me. My first real big product experience was in 2007, when I really started diving into an internal tool for a company that sold admissions, basically, to different events around the United States. From there, I taught myself and learned a lot, and applied it back to business school education. And today, yeah, so today I'm the Director of Product at Roll20, which is a software-as-a-service, completely web-based way to play tabletop role-playing games, and, you know, other tabletop games, in a virtual environment. You can play with your friends around the globe just like they were sitting at your kitchen table. And yeah, so I do a lot with Dungeons & Dragons, Call of Cthulhu, or Pathfinder, those kinds of games.

Lily Smith:

Awesome. So I bet you've had a good year with, shall we say, everything that's been going on.

Kenton Hansen:

Yeah, well, the way I like to say it is that we were incredibly lucky to provide a way to distract from the realities of the world and still connect with your friends in a socially distanced way. Yeah, absolutely, it was a privilege to serve that purpose for a lot of people.

Lily Smith:

So let's start talking about user research. Let's kind of start with the main challenges. You know, what stops people from doing good user research?

Kenton Hansen:

I think one of the biggest challenges is just the first, like, the blank page syndrome of how do we get started? What do we do? There are so many different ways you can slice the data and segment the audiences that it becomes overwhelming even to get started. So the first thing that I like to tell, especially, you know, people on my team or the people that I coach, is just to get started and ask a question. Any question that you ask is one step further along the process. That's the first hurdle: get a question out of the way, and it'll help you to define what questions you need to ask later on. The next one, I think, is focusing on what kind of results you're looking for. And that is one reason why I say just ask a first question: the kinds of answers you get back help define the questions that you want to ask in the future and help you move along the user research path. You're kind of doing user research on your user research, for your product and for the way that it works for your team, even. And then finally, overcoming the myths. And this is a fun one, because all organisations work off some decision-making criteria. For organisations who are making decisions without user research, there is a myth that gets told within the organisation: this is how people do a thing, or when our users come to us, this is the path that they take. And that gets told over and over again until it's believed internally, and you have to fight the myth with the data and prove it over and over again. So, getting traction with your data: even after you've done the user research and you can point to the results you expect, you have to continually hit those and make that a pattern of behaviour rather than a one-off event.

Randy Silver:

It sounds like you're doing a mix of qualitative and quantitative when you do this. How do you put those two together? What are you looking for from each?

Kenton Hansen:

For me, it really depends on the state of that particular product, or that particular feature even, sometimes the initiative that the product team is focused on. I think that qualitative and quantitative have to go together, like they're siblings; you can't approach one without the other. You can make a quantitative decision... I like to, and it is a joke, I want to make sure that everyone understands that, but I always say I can reduce cancellations in our product by removing the cancel button. If users don't have a way to do that, quantitatively we would reduce our cancellations down to almost zero. However, I think qualitatively that would have more negative impact than you could possibly imagine. So they are married. I kind of feel like that's a non-answer, but it's a very situational and situationally aware decision, and, as I said, it depends on the product lifecycle and where you are in that stage. Early in the product lifecycle, you're looking for acceptance and you're looking for usage, and qualitative results are much more important early in that stage, whereas quantitative comes later, when, especially at scale, you're looking at actual business metrics.

Lily Smith:

And tell us a bit about how you do user research at Roll20. Like, who owns that part of things? And how do you, you know, schedule it and decide where you're focusing?

Kenton Hansen:

Yeah, so user research is a shared ownership, for sure; there's not any one person who it belongs to. Each product manager is responsible for the research on the changes that they're scheduled to be making, that they're researching, and for setting a baseline for how they expect users to interact with those changes. The methods are, you know, very similar to what probably any high-functioning product organisation is doing. We talk to users, we reach out to them and schedule conversations with them. We put up surveys in the user experience through the process, and kind of inject that in a place where people can opt into giving information. We send out emails with links to surveys and incentivise those in various ways in order to get information back. Most of the time, those help us stratify the users, and then we'll take those subsets, the cohorts that we find in there, and ask the questions that we're asking of those. NPS score is a very important tool that we use, and we measure it regularly in order to determine, you know, how well are we doing, first of all, but then second of all, how each of those users might respond to a change or a problem that we're trying to solve for them.

Randy Silver:

Do you do these things at kind of the gross level, you know, the mass level? Or do you do them in terms of segments or cohorts? And if you do them in segments and cohorts, how do you qualify them?

Kenton Hansen:

So, yes, both. We do them at mass scale in order to understand where the segmentation lines come in in our total audience, first of all. Then once we stratify our users, we'll approach each cohort in a certain way. For a particular change, we may be trying to help the users who are very involved, who've been subscribers for a long time, and we want to make a change to make it easier for them in some way, so we'll focus in on those people. There are a lot of changes that we're making to make the onboarding experience easier and more convenient, so we don't need to talk to our veterans, the people who've already done the onboarding; we need to talk to the people who maybe have never even used the software before. And that's how we break those down. Those are the two that I'm thinking of in my head, but we also look at, you know, subscribership, other purchases, different interactions that they've had. Critical user journeys is a term that's being thrown around a lot right now; we look at those, and who's accomplished them or who hasn't, and then we may end up asking, why haven't you, or how was that experience, those kinds of things.

Lily Smith:

So, you know, when you have such a large audience, as you do at Roll20, and you're deciding who you want to talk to, when you're doing something more generic, you know, we want to speak to people who have played Dungeons & Dragons, how do you go about qualifying the user before you perform the research? What kinds of questions are you asking at that point?

Kenton Hansen:

It's wide and varied, for sure, so I can't point to one particular set of questions at any given time. But one of the steps that we take to help with that is, before we do any user research, before we do even something as simple as a survey, we take a couple of steps. First of all, we identify our hypothesis, right? Like, we're going to be asking a question, and we think that we're going to find X. You know, so we're going to ask how many people have played Dungeons & Dragons in the past 30 days, and we think we're going to find that 50% of the people we ask will say this is something that they've done on our platform. By identifying those, we call out our own biases. And the second step, once we've set that hypothesis, is we then identify other biases that may be in our question. That helps identify what cohorts we need to find and what steps we need to take to make sure that our large audience, you know, the large ocean that we can pull from, gets sorted down into a representative sample, and breaks apart the unintentional biases that we may be coming up against. So we don't over-sample, you know, one particular day of the week on which we have found a lot of people play one particular game, or we don't over-sample in another area that would taint the data and give us research that wouldn't be very actionable.
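
A rough sketch of the kind of bias-aware, representative sampling Kenton describes, nothing Roll20 has published, just an illustration in Python with hypothetical user fields, might look like this:

```python
import random
from collections import defaultdict

def representative_sample(users, strata_key, per_stratum, seed=42):
    """Draw an equal-sized random sample from each stratum so that one
    over-represented group (say, players most active on one weekday)
    can't dominate the survey pool. `users` is a list of dicts and
    `strata_key` names the (hypothetical) field to stratify on."""
    random.seed(seed)
    buckets = defaultdict(list)
    for user in users:
        buckets[user[strata_key]].append(user)
    sample = []
    for members in buckets.values():
        sample.extend(random.sample(members, min(per_stratum, len(members))))
    return sample

# Example: don't over-sample the weekday on which one game happens to spike.
users = [
    {"id": 1, "most_active_weekday": "Saturday"},
    {"id": 2, "most_active_weekday": "Saturday"},
    {"id": 3, "most_active_weekday": "Tuesday"},
    {"id": 4, "most_active_weekday": "Wednesday"},
]
print(representative_sample(users, "most_active_weekday", per_stratum=1))
```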

Lily Smith:

One of the things I always worry about when inviting people for user research interviews is that, by the very method you're using to do the interview, you end up with people who are happy to talk about, you know, what they're doing and stuff like that. Do you find that that's a problem? Or, like, do you have the same concerns?

Kenton Hansen:

It, for sure, is a concern. And I think it is good to connect any person that you're talking with, that you're interviewing, that you're researching, with some of the other metrics that you may have collected earlier; that's one way. Now, that takes a lot of work and a lot of planning, right? And so I don't think it's the most important thing, because at the same time, you're gonna get the people who are very willing to talk to you. But some people are very willing to talk to you when they're extremely happy. Other people are very willing to talk to you when they're extremely not happy. I think it's that middle section that's actually the hardest to get to. And the middle section is a lot of times where the most juice is ready for the squeeze. Getting, you know, the person who's like, oh, this is fine, but I'd be happier, you know, it would be amazing if it did this, right? We don't find those insights nearly as readily. So sometimes the incentive actually taints the data in a different way. But yeah, I am very worried about that, too.

Randy Silver:

So it's like Olympic diving, where you take all the scores that come in, all the interviews that come in, you throw away the top-rated one, you throw away the bottom-rated one, and you work on the ones in the middle?

Kenton Hansen:

Almost always, yeah. The people who are leaving glowing reviews have something else baked into it. The people who are leaving the most negative reviews typically have something else going on in their life that is making things more difficult for them. And somewhere in the middle, there's that section. I mean, from an NPS score perspective, I am much more interested in the sevens and eights than I am in the zeros and ones. Absolutely.
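
For anyone following along, the standard NPS buckets Kenton is referring to, and the sevens and eights he cares most about, can be computed in a few lines; this is a generic sketch, not anything from Roll20's stack:

```python
def nps_segments(scores):
    """Split 0-10 survey responses into the standard NPS buckets and
    compute the overall score. The 7s and 8s (passives) are the cohort
    Kenton flags as the most useful for follow-up research."""
    promoters = [s for s in scores if s >= 9]
    passives = [s for s in scores if 7 <= s <= 8]
    detractors = [s for s in scores if s <= 6]
    nps = 100 * (len(promoters) - len(detractors)) / len(scores)
    return {"nps": round(nps, 1), "promoters": len(promoters),
            "passives": len(passives), "detractors": len(detractors)}

print(nps_segments([10, 9, 8, 8, 7, 7, 6, 3, 0]))
```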

Randy Silver:

That makes sense. Those are the people that are reachable and convertible. And the zeros and ones, I mean, if they keep using the product and they're a zero or a one... why?

Kenton Hansen:

Yeah, exactly. You don't need to do this. Like, I mean, I appreciate the support, but maybe you'd be happier somewhere else.

Lily Smith:

Maybe they're just having a really bad day anyway. And you kind of hinted at this earlier, but one of the things we were going to talk about tonight was the different kinds of research that you do throughout the lifecycle of a company, and how qual is probably a little bit more important early on than quant. But how do you see the different phases that a company goes through, and the different focuses that you need to have throughout those different phases?

Kenton Hansen:

Sure. So when I break down the product lifecycle into the different stages, I look at problem/solution fit as, like, the first stage, where you're really trying to define: is there a problem that's worth solving, and is the solution viable to solve it? And in that stage, I'm looking for adoption and usage, right? Like, this is where we have completely free services or a beta service, and I'm just trying to get people to adopt the thing that I'm making, put it in their lives, and accept it as one of their own. As you get the adoption that you're looking for, and you set those goals, I move into product-market fit, right? And at that point, asking for money becomes an issue, or driving sales. And I'm thinking about this from a direct-to-customer type of sales; in B2B it's much more sales driven, and so you end up doing these as a mix. At the point of product-market fit, I'm looking for happiness and acceptance, beyond just: will they adopt it, and will they put it into their use case? At any given point in the day, are they happy about doing it? Are they putting it into their user experience? Are they adopting it into their lives, or is it more reluctant, is this just what I have to use, or am I excited about using it? Social networks fall into that for me right now: I want to know what's up with my friends and family, but goodness, I don't want to be on a social network anymore; it really doesn't add much to my life. Is there a market for the product? As you move into business model fit, you start to be able to hone your pricing strategies and the value transfer that actually happens there. And then as you get into scale, of, like, dumping money into marketing to get the largest audience to see the solution that you've tested and know works and everything else, you're looking at typical business metrics, you know, your balance-sheet-type metrics, profitability, sales numbers. That's really where it shifts from happiness and acceptance to cold hard cash.

Lily Smith:

Sprig, formerly UserLeap, is an all-in-one product research platform that lets you ask bite-sized questions to your customers within your product or existing user journeys.

Randy Silver:

Companies like Dropbox, Square, Opendoor, Loom, and Shift all use Sprig's video interviews, concept tests, and micro surveys to understand their customers at the pace of modern software development, so they can build customer-centric products that deliver a sustainable competitive advantage.

Lily Smith:

So if you're part of an agile product team that wants to obtain rich insights from users at the pace you ship products, then give Sprig a try for free by visiting Sprig.com. Again, that's Sprig, S-P-R-I-G, dot com. So, one of the things I've found, because I've worked with quite a lot of early-stage startups, is that in doing that very early-stage research, it can be much easier to get people to talk to you when you say that you're building this brand new thing that hasn't been launched yet, and you want to have a chat with them and show them some stuff. And then as you begin to grow, you obviously get your early adopters, but then you get your people who are just like, no, I just want to come in, do my thing, and then leave. And so that bit, and you kind of hinted at this again earlier, of those people who are, you know, not like super-engaged, early-adopter types, but they're also not desperately unhappy. Do you have any sort of strategies for trying to get to these people and get to do research with them?

Kenton Hansen:

So what I have found, and this is actually, like, a career-long finding, is that when you're trying to focus in on that middle, that middle mass, really, most users are going to be there, whether we're talking about a giant user base or we're talking about even an internal product, right? Where nobody has an option, everyone has to use this thing, but "are you happy about it?" becomes the metric. For that middle section, you can't use the automated methods, right? Like you can't, or not nearly as easily, use an automated method to just throw up a survey or send an email and expect a response. You're gonna have to reach out and make a human connection with those people. And once you can make a human connection with those people, usually there's a way to convince them, even if it's just a few quick questions. And then it really becomes part of a sales technique, in a lot of ways, the same techniques that one might use in a sales scenario, I guess is what I mean, where you have a relationship with them, you can check in with them regularly, and you essentially move them up the scale if things go well: I'm in this middle mass, but now I'm a promoter, they listen to me, they take my feedback. And then you have to go start all over again and find the people in the middle again. As long as you have usage metrics, and that's one of the reasons why at an early stage usage is the most important part to measure, then as you move through the lifecycle you can segment by the amount of use that someone has. So someone might be using it a whole lot, but they don't interact with automated surveys; that's probably your middle mass that you need to really hone in on and focus on.
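
Segmenting for that "middle mass" (heavy usage, but no response to automated surveys) is easy to express in code; a minimal sketch, with made-up field names, might be:

```python
def middle_mass(users, min_sessions=10):
    """Flag users worth a personal outreach: lots of product usage but
    zero responses to automated surveys. Field names are hypothetical."""
    return [u for u in users
            if u["sessions_last_30d"] >= min_sessions
            and u["survey_responses"] == 0]

users = [
    {"id": "a", "sessions_last_30d": 25, "survey_responses": 0},
    {"id": "b", "sessions_last_30d": 30, "survey_responses": 3},
    {"id": "c", "sessions_last_30d": 2, "survey_responses": 0},
]
print(middle_mass(users))  # only user "a" qualifies
```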

Randy Silver:

Okay, so one of the challenges for people who do lots of research is that, well, you do lots of research, and there's just a lot of data that you have. How do you store it? How do you share it? How do you ensure that you actually realise value from it, and that people are able to, you know, make use of the corpus of information?

Kenton Hansen:

So I'm going to rely on the collective intelligence of the internet to chase me for not writing down where I found this, but somewhere on the internet, I believe there's a tweet, probably from an influencer that we're all reading if we're reading product stuff, that talked about a change in the way that they were storing their findings. This came up naturally in the product and user experience design teams at Roll20, and, oh my goodness, it's been amazing, it's been life changing. So I want the collective intelligence to scold me for not remembering who it is, and tell us who it was who came up with this. But what we've done is we've changed how we're storing things in our wiki, or Confluence, or wherever we're putting these insights: we're storing the insight that we found. So it may be one user interview that says, boy, I'd like it if that button were purple instead of pink, for some reason. We store it and title it "this button should be purple, not pink", and then link to the user research, the artefact that was created for that. So it may be the recording of the interview, or maybe the write-up for that interview, or maybe a survey. But once we have enough of that information, it starts to surface the most popular and most important findings that we have. That's been a change for us, and our Confluence has been amazing since then.
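
The storage convention Kenton describes (one finding per entry, titled as the finding itself and linked to its artefacts) can be modelled very simply; this is only an illustrative sketch, not Roll20's actual Confluence setup:

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class Insight:
    """One research finding, titled as the finding and linked to the
    artefacts that back it (interview recording, write-up, survey)."""
    title: str                     # e.g. "Button should be purple, not pink"
    artefact_links: list = field(default_factory=list)

def most_cited(insights):
    """Surface the findings that keep turning up across artefacts."""
    counts = Counter()
    for insight in insights:
        counts[insight.title] += len(insight.artefact_links)
    return counts.most_common()

insights = [
    Insight("Button should be purple, not pink",
            ["wiki/interview-2021-06-03", "wiki/survey-2021-q2"]),
    Insight("Onboarding step 3 is confusing",
            ["wiki/interview-2021-06-10"]),
]
print(most_cited(insights))
```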

Randy Silver:

And is that only based on the user research that you commission? Or is it also coming from other places? Like, do support teams put things in from their tickets? And, you know, you're not B2B sales, but in that case it would be salespeople. Would that be considered just as valid?

Kenton Hansen:

Yeah, we have to be a little bit more careful about that, because we're interacting with kind of a different system, you know; our trouble ticketing system needs to be filtered into that. But it would be just as valid to link to "this problem created, you know, 100 user-created tickets over the course of the last week", and so we put that finding up there. And I think that's a really good point, Randy: the insights that we commission are worth probably a third of the insights that come in through a natural state. But, you know, those are incredibly valuable. The general work that happens is just as good as the user research that we commission, and I think that answering trouble tickets, for a product manager, is worth its weight in gold. Time doesn't have weight, but if it did...

Randy Silver:

And then you've got just a steady stream of digestible, what should we call them, tweets or nuggets, right? And how do you surface those? Is it like a Slack channel for that? Do you broadcast them out? Or what do you do with them?

Kenton Hansen:

We actually have a Confluence space that has this information associated with it, and with the right labels it gets put into this place. The time to read is what I use to look at it, so as more is added to it, the time to read goes up. So, you know, it takes three minutes to read this, it takes seven minutes to read this... oh wow, this is really important, let's take a look at this. The other way is through, like, the problem/solution discovery. Sometimes these are problems, but sometimes they're solutions to problems we are only now just discovering, so connecting those discovered items back to a solution that we're looking at, because we found this other problem, helps reduce the need to do additional user research, because we've already found these through the natural course of either commissioned or accidental user research.

Lily Smith:

I like that method. That sounds good. Okay, so we were talking about myth busting earlier. So the general consensus, I believe, on the internet is that you should talk to your customers every week. And I just want to ask, as a business that's doing user research well, do you actually talk to your customers every week?

Kenton Hansen:

There's consensus on the internet? Maybe no. I think this is a myth that may need to be busted.

Lily Smith:

Okay, yes. I'd like you to, please.

Kenton Hansen:

Well, I mean, so, yeah. I think frameworks serve a purpose, absolutely, and this is a framework, right? As I mentioned before, the first and biggest hurdle is just getting started, and the overhead of talking to your customers is different for every business. For us, we listen more than we talk. We listen to our users every day, absolutely. The product managers on my team are looking, you know, we're looking at social media. And I think this is another benefit that direct-to-consumer businesses have, right? If you're talked about on social media, you can hear that, as long as you've got a thick enough skin for it. We also have, you know, the community aspect: we have forums, and there are, you know, Discord servers, or Slack channels, or Reddit, which has a wealth of information for us from the TTRPG standpoint, tabletop role-playing games, sorry. All of those come together into kind of this stream that you can dip your toe into and learn from, right? And then regularly, I think regularly is the key: you need to actively reach out and have conversations with your users. Weekly? Yeah, I mean, that's regular. But I feel like you could do the same thing monthly, right? And if talking to your users takes an hour and a half per week, you could do the same thing in a day per month, and pull that out. You just need to solve for your own biases again, right? Like writing down hypotheses and solving for the biases that come up.

Randy Silver:

So when you're getting ready to do these interviews in the first place, how do you decide on what topics or questions you want to ask? How do you know what's the right thing to conduct research on that week, or that month?

Kenton Hansen:

Right, yeah. That time that we spend in that interview... From a specific standpoint, we have some strategic initiatives at any given time that we're working towards, both quarterly and longer term, as well as shorter term. There are different pieces of strategy that we're trying to accomplish, different tactics that we're implementing at any given time, and those pieces are really going to define the questions that we ask. Tying it back for anyone putting this into any organisation, any product organisation: start with product strategy, right? What is the product strategy, and how are we trying to affect that? Connect that directly into your key performance indicators or your OKRs, right? Like, how are we applying product strategy to the day to day, to the things that we're releasing next week and the week after, or hopefully next month and the month after? And let's ask the questions that help us move that needle. What worked? What didn't? And what can we ask our users about in order to work better next time?

Randy Silver:

Just please tell me you don't use that terminology. I mean, you're Roll20, you've got to have strategic quests or missions and other initiatives, right?

Kenton Hansen:

So we, the organisation as a whole, subscribe to some concepts by Patrick Lencioni, and so they're not nearly as interesting as you might think. We do have the marketing team, though, and I absolutely love this: they have spells that they cast, so different specifics that they'll do to help promote the product and content that we're selling at any given time. I'm sorry to disappoint. Sometimes clarity is better than branding, I guess, is really what it comes down to.

Lily Smith:

Form over function? No, no, no, hang on, right, shoot me if I ever fall into one of those two. Okay, and then, so just thinking about user research interviews particularly, how do you prepare for those and then go about conducting them? Like, what are your top tips for doing the actual interviews themselves?

Kenton Hansen:

Yeah, so I mentioned defining the hypothesis, and, you know, that necessitates defining the question that we're trying to ask and then what we hypothesise the answer we're going to get back will be, and diving into the biases that we have internally, which uncovers some of those myths, again, that we need to be aware of. Those are all the first steps, absolutely. I highly recommend Erika Hall's book, Just Enough Research. I think it's one of the best textbook examples of just, like, getting your feet wet and understanding the lightest possible framework in order to do user research. And then from her book, one of the pieces that I think is really good is just writing the couple of open-ended questions that you definitely want to ask, and letting the interviewee talk, right? Every once in a while you'll need to do some prompting, and since you've got the one or two questions, you can use those as leverage again. But you're going to learn a whole lot more from the question you didn't know you needed to ask than you are from 50 questions that you really want to get in there.

Lily Smith:

Awesome. That's great. Kenton, it's been so amazing talking to you tonight about user research. I love it. Thank you so much. And yeah, it's been fantastic; I've learned a lot.

Kenton Hansen:

I've had an excellent time. Thank you both.

Randy Silver:

I am so disappointed that their product teams use the same old boring language as the rest of us.

Lily Smith:

But the marketing team get to cast spells they are living the dream.

Randy Silver:

And so are we because we're back next week with another amazing guest. See you then.

Lily Smith:

Hey, me, Lily Smith

Randy Silver:

and me, Randy Silver.

Lily Smith:

Emily Tate is our producer. And Luke Smith is our editor.

Randy Silver:

Our theme music is from Hamburg-based band Pau. That's P-A-U. Thanks to Arne Kittler, who runs ProductTank and MTP Engage in Hamburg and plays bass in the band, for letting us use their music. Connect with your local product community via ProductTank, our regular free meetups in over 200 cities worldwide.

Lily Smith:

If there's not one near you, you can consider starting one yourself. To find out more, go to mindtheproduct.com forward slash producttank.

Randy Silver:

ProductTank is a global community of meetups driven by and for product people. We offer expert talks, group discussions, and a safe environment for product people to come together and share learnings and tips.