The Product Experience

How to estimate responsibly in product - Neil Vass (Engineering Manager, The Co-op, BBC)

Mind the Product

Listen to our conversation with Neil Vass, Engineering Manager at Co-op. We unravel the intricacies of making estimates in product and discuss the fine balance between setting attainable goals and avoiding undue stress on your team.

In our discussion, we address the common pitfalls in estimation, from the pressure to deliver overly optimistic figures to the misuse of estimates as bargaining chips rather than planning tools.

Featured Links: Follow Neil on LinkedIn | Neil's website | Neil's 'Minimum Viable Estimation - Part 4' feature | Neil's 'Minimum Viable Estimation - Part 5' feature | 'Estimating Complexity' - Liz Keogh episode on The Product Experience | Buy 'How to Measure Anything: Finding the Value of Intangibles in Business' book by Douglas W. Hubbard

Our Hosts
Lily Smith
enjoys working as a consultant product manager with early-stage and growing startups and as a mentor to other product managers. She’s currently Chief Product Officer at BBC Maestro, and has spent 13 years in the tech industry working with startups in the SaaS and mobile space. She’s worked on a diverse range of products – leading the product teams through discovery, prototyping, testing and delivery. Lily also founded ProductTank Bristol and runs ProductCamp in Bristol and Bath.

Randy Silver is a Leadership & Product Coach and Consultant. He gets teams unstuck, helping you to supercharge your results. Randy's held interim CPO and Leadership roles at scale-ups and SMEs, advised start-ups, and been Head of Product at HSBC and Sainsbury’s. He participated in Silicon Valley Product Group’s Coaching the Coaches forum, and speaks frequently at conferences and events. You can join one of the communities he runs for CPOs (CPO Circles), Product Managers (Product In the {A}ether) and Product Coaches. He’s the author of What Do We Do Now? A Product Manager’s Guide to Strategy in the Time of COVID-19. A recovering music journalist and editor, Randy also launched Amazon’s music stores in the US & UK.

Speaker 1:

Hey everyone, Randy Silver here. It's one of those weeks. Lily and the rest of the fabulous pod production team are away at the Mind the Product and Pendo roadshow in Amsterdam, having a great time and recording some of the speakers there for future episodes. I estimate they got, oh, let's say, four, maybe five fantastic guests. So today it's just me and Neil Vass, Engineering Manager at the UK's fabulous Co-op, someone I've been wanting to chat to for a long time about how to get better at estimating. I reckon you'll love it.

Speaker 2:

The Product Experience podcast is brought to you by Mind the Product, part of the Pendo family. Every week we talk to inspiring product people from around the globe.

Speaker 1:

Visit mindtheproduct.com to catch up on past episodes and discover free resources to help you with your product practice. Learn about Mind the Product's conferences and their great training opportunities.

Speaker 2:

Create a free account to get product inspiration delivered weekly to your inbox. Mind the Product supports over 200 ProductTank meetups from New York to Barcelona. There's probably one near you.

Speaker 1:

Neil, thank you so much for joining us on the podcast this week. How are you doing?

Speaker 3:

I'm doing great. Thanks for having me.

Speaker 1:

Excellent. So for people who don't know you, I've known you through communities and talks and things for years, but for anyone who doesn't know you, let's do a quick intro. What are you doing these days? And you're not actually a product manager at this point, but how did you get into this whole product-y world in the first place?

Speaker 3:

Good question. So I'm an engineering manager at the Co-op in Manchester now. The Co-op is 180 years old and employs 65,000 people, those are a couple of headline facts I like to give, and it's across stores and online sales and funeral homes and insurance. Really varied in there.

Speaker 3:

I've had many different jobs at the Co-op in just six years; I was reeling them off the other day: delivery management and coaching and other things like that, and in the past I've varied between software and delivery. So I've never actually had 'product' in my job title. But I was thinking earlier about how I got near this whole product game. When I was a software developer, I came across all this Agile stuff too many years ago to count, and I tried out Scrum. It was a long time before I met anyone whose job title had 'product' in it, but I was explaining to people: you're in the business, you're paying for this team, you know how people use this. Your role here is product owner, and I'll help you navigate that. Here's what it says, here's where you get to make decisions, here's why it's better for you if we don't make all the decisions up front. It worked; it went much better than what we tried before. So I've been quite a fan of producty things ever since.

Speaker 1:

Fantastic. And the thing we're going to talk about today: you ran an exercise that I attended, oh God, quite a while back now, about estimation. That was an awful lot of fun, and estimation has always seemed like a bit of a dark art. But let's start at the very basics of this, and this may seem really obvious: why do we even bother with estimation? We spend all this time with our teams estimating things, it always takes longer than we say, and all we're really doing is creating this noose for our stakeholders to hang us with when we say 'oh yeah, it's going to take about this long'. So why do we do it?

Speaker 3:

Oh, there's some good parts in that question. I'd start with this: estimation takes a lot of energy, it takes a lot of time, and often you get less good results out of it than you hope. If you're in a position where you can get away with not estimating, then absolutely don't; that's my advice. I've been there before. I've worked on teams where you've got loads of challenges and things that worry you, but estimation doesn't have to be one of them. So the #NoEstimates idea, the Woody Zuill style: we'll take a small step, we'll back off if it doesn't work.

Speaker 3:

That can exist, but I've never managed to just say 'we're not estimating now' and have that be fine, if you haven't got the conditions for success already.

Speaker 3:

So, depending on where you are right now, you'll have to do some other things first. It might be a useful goal to aim towards, but you have to move people along that journey; just saying 'I'm not going to' doesn't work. And that's the major part of the headache. If we're all honestly saying the future is uncertain, but it would be useful to know something about when this might go live or what we're getting into, can we try our best guess? If we could do that, and I honestly believed we were all trying and giving our best info, it'd be super valuable, even if it's vague. Often it does feel like a battle: you've got to cover yourself. I dread to think of the amount of time and mindshare lost to 'how can we not get in trouble with this?' or 'how can I make sure it doesn't come back to bite me later?' What a waste.

Speaker 1:

Yeah, obviously I'm being a bit facetious. I mean, I'm the first to ask 'about how long do you think this is going to be?', because I need it in my role to help forecast and say, okay, people are asking me for lots of things: how many things can I afford to say no to? What can I afford to say yes, or maybe, to? Can I set some expectations? Is our team big enough? Do we have what we need to do all this? So I totally see the need for it, but it always seems to backfire on the other side as well. It's this double-edged sword.

Speaker 3:

Absolutely. It's: what are you going to do with this info after I give it to you? A good phrase is 'we believe in estimating responsibly': whatever you say now is not going to be held against you. But that needs to be true; people need to believe it. An interesting thing I've discovered is that, if you can get away with it, the more you can make it not matter: we're working in small steps, so you'll get something tomorrow and you can do something different the day after, and any time you stop and walk away, you've got that banked. That's a good position to be in, but lots of work isn't like that. When we start this, how long will it be before you've actually got something you can move on from? What am I getting into right here? Quite often people get exasperated with that, and they'll reply 'how long is a piece of string?'

Speaker 1:

It's a sort of classic question. Twice as long as half a piece of string.

Speaker 3:

It all gets very philosophical. I think people often do know something about it; it may be quite wide and vague, and I think what people are really saying is 'I don't think I can give you an answer within a precision that's going to be useful for you'. And if you can talk about that, then maybe a super-wide range is all the info I need, because if it's not within some bound we're not going to do it. That can be useful.

Speaker 1:

So it's about what the question is going to lead to, rather than just repercussions, or us spending time on an estimate you write down and never look at again. Yeah. And you also suggested, you've done a bunch of blog posts on this as well, and one of the things I really liked is you suggested asking a different question, not 'how long will this take?'. So what should we actually ask, or how can we turn it around?

Speaker 3:

Ah, I like that. So, in the situations where we have to sit down and say what we think: if you don't need to estimate, that's great. Or if you're doing things often enough and you're keeping track of how long things have taken in the past, you can just look it up in the chart and get all the way away from asking anybody what they think. Those are good, but sometimes you've not got that info, you're not in that situation, it's a different kind of work, and you just need to say: what do we reckon? So something that's really helped me is the idea that, rather than 'how long would this take?', including all the various things this might be, all the options, all the ways we might add more people if that helps, or reshape teams, or stop something else, it can feel like endless possibilities; I could be here forever messing about with different scenarios. A good question can be: what date would make you make a different decision?

Speaker 3:

So: what answer would make us not go ahead with this? And that's useful in two ways. Sometimes people realize, okay, the information value of this is zero: there's no decision getting made based on it, which is interesting. The other one is, sometimes when you put a line in the sand, 'if it's less than this we're doing it, more than that we're not', it's still super uncertain, I don't know where the answer is, but you'd be surprised how often that turns a difficult, thorny question into a very, very clear-cut one. Like: I need this by June, or else we've missed our window and it's just not worth it.

Speaker 1:

Absolutely. And then you can ask: what is it that you actually need? Is it this full-blown system, or does it just need to do one basic thing?

Speaker 3:

Absolutely. And that means: if the rough thing, the wide range, is roughly about June, we're still in trouble. But a surprising amount of the time it's 'June? You'll be lucky if it's 2027. I've got no idea exactly, but it's way past your date', or the equivalent, 'no, no, no, it's a couple of months'. It's still wide, it's still vague, but if that's the answer you need, you can go. So that is a nice shortcut. And the thing you're talking about, being well away from that line, is super useful. Staying away from that line is very important.

Speaker 1:

Okay, so we'll run the exercise that you did shortly, but a couple more questions about setting the basics. What is estimation actually for? Who is it for?

Speaker 3:

I like it. There's a lot of reasons people ask for estimates, and a lot of them are the thing we talked about. It's a caricature, but sometimes it feels true: give me a number so I can come back and tell you off later because you said it, including possibly 'I will force you to make it a smaller number'. Sometimes the people asking are professional negotiators and others just want a quiet life. Sometimes it's for that, not always, but I think there are shades of that in any discussion, and that's some of what makes it painful.

Speaker 3:

You fear you might fall into that, even if it's not true. And the other thing it can be for, helpfully: the most useful things I've seen estimation used for are when it's something important, when you're actually going to make a decision based on it. And ideally what you can do is get your estimate of 'if we only get this much done, we're okay: the thing's live, I wish I had more, but I've got a version of the future where it's done'. If you can get that point as far away as possible from the time when you absolutely have to stop working on it,

Speaker 3:

you're in that beautiful position: you've got time to iterate, we're getting it out early, we've got time to do things about it. It's fabulous. And equivalently, if you know you're making a decision that clears the whole roadmap for a very long time to come, if I choose to do this, that's useful to know, because that's very different from 'this is just one more thing amongst many bets I can try to see what's useful'. You have to be very, very confident it's a good idea if you're going to do that.

Speaker 1:

So that's in communication with stakeholders, or people who are making plans based on it. Is that really who it's for, or should it be used mostly within a team? Is it more of a health metric: we have an idea about how much we can do in a given time period, in a sprint, or if you're not using sprints, if you're doing Kanban, whatever, just to give a rough sense of how well the team is performing against its own estimates?

Speaker 3:

So I have seen a trend, and I couldn't believe it was true for a while: the say-do metric. We say it's this, it turns out it's that, people get ranked on it, and your team is better than others. Far from being useful, it's incredibly gameable and isn't actually related to any of the things you wanted. It'll drive you to do repeatable, safe and boring work: let's do something similar to what we've done before so I can predict it. You're missing out on lots of good opportunities.

Speaker 3:

Some people make a differentiation, calling it forecasting when you use data, but I think estimates just mean we're predicting the future. If you keep track of when things start and when they finish in your team, and especially if you noted what we thought was a big job and what was a small job, looking back at it can be super eye-opening. One of the things I've seen teams realize is that the sense of the size of a piece of work ('this is a short job, this is a bigger job, and I'm not much clearer than that') sometimes has little relation to how long any of it actually takes to finish. There's a lot else going on. So possibly your most important question when you're asked to estimate something is: what else do we have going on while we're working on this as a team? And also, what's our discipline? How many things do you put in the 'next up' column that then get leaked from, sometimes for months at a time?

Speaker 3:

So getting some insight into what actually goes on when we say this, and what we might do about it, can be super useful. It depends on the team; you'll find different things out about what's going on. It may be that every time we touch some part of the system, endless bugs and confusion come out: we should learn about that system, or do something about it so it's less scary to work with. And it can feel sometimes like everything we do might turn into a six-month odyssey, and just having a bit of data on that helps. Those odysseys loom large in the mind and stick with you, but actually 90% of the things we do get done fairly quickly; we're not actually in much danger of the multi-month odyssey coming up. It's rarer than it felt.
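The tracking Neil describes can be sketched in a few lines: record when work items start and finish, then compare actual cycle times against the size the team guessed up front. The work items and dates below are invented purely for illustration; the point is how little the guesses can correlate with reality.

```python
from datetime import date

# Invented sample data: (name, size guess, started, finished)
items = [
    ("login page",    "small", date(2024, 1, 3),  date(2024, 1, 10)),
    ("report export", "big",   date(2024, 1, 4),  date(2024, 1, 9)),
    ("search filter", "small", date(2024, 1, 8),  date(2024, 2, 20)),
    ("billing fix",   "big",   date(2024, 1, 15), date(2024, 1, 25)),
]

def cycle_days(started, finished):
    # Elapsed calendar days from start to finish
    return (finished - started).days

# Group actual cycle times by the size we guessed up front
by_guess = {}
for name, guess, started, finished in items:
    by_guess.setdefault(guess, []).append(cycle_days(started, finished))

for guess, days in sorted(by_guess.items()):
    avg = sum(days) / len(days)
    print(f"{guess}: {days} days each, average {avg:.1f}")
```

With this invented data the 'small' items took 7 and 43 days while the 'big' ones took 5 and 10, which is exactly the kind of eye-opener Neil mentions: the up-front size label told us almost nothing.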

Speaker 1:

Some of what you just said reminds me of an episode we did a while ago with Liz Keogh, where she has a way of estimating complexity, essentially a complexity multiplier. It's a one if everyone on the team has done this: we know exactly how long it's going to take. It's a two if someone on the team has done it, but we've got a new person who's going to try it this time, so it might take a bit longer, but we've got someone on the team who can help them. It's a three if someone else in the org has done it, so we know it's possible and we have access to them. I think a four is if some other company has done it, but we've never done it here, so it might be a different tech stack, maybe different capabilities, different priors essentially. And a five is no one's ever done this before.

Speaker 3:

I've seen that scale; it's in her 'Cynefin for Everyone' blog post. It's fabulous, because once you think about it and apply it to some work, it's like: of course this matters so much. But it so rarely comes up. People just ask 'how long will this be?', how many weeks, how many person-hours, how many whatever, when some things are a super different category of work. You're doing scientific research with some of those categories.

Speaker 1:

You said: so we know how long this took us last time, but we're in different circumstances, or we've got different people, or we're being asked to do three other things now. All of these things are really critical as part of your estimation. So when you go into a session with your team and you're saying, okay, we need to estimate a piece of work, what kind of information do we need to have at hand? What kind of research do we need to have done to create decent estimates?

Speaker 3:

The definition of 'decent' is probably the most useful thing you can go in with. What everyone wants and imagines is an estimate with as little error bar as possible, and as true as possible to what actually comes out in the end. Things can be vague and uncertain, and I think sometimes that matters and sometimes less so. One of the qualities that might be useful is: how much flex and discretion will the team have as we get into it?

Speaker 3:

I've had things before where somebody wanted a fairly fluffily defined feature, like a new capability in the app, that might touch some parts of the code I didn't know about and might involve doing some things we hadn't done before. But what I was confident in was that some version of this is possible, and if there are reasons why we can't do it all as we get into it, well, they really wanted this, and if we have to do a super simple version of it, I've worked with these people before and I know it will be fine. So that made me able to give a super confident estimate: if we spend a few months on this, you will get some version of that, and I hope it'll be a really nice version as well, because we've got the flex. Whereas in other things it's specified to the nth degree and we don't actually know until we get into doing the work. Or people will assume: when you've said 'I'll build a website, how long will a website take?', that can turn into 'everything I imagined, that's in scope, that's on it'. So that kind of danger can hurt as well. The short version: the more info you need up front, the more danger you're in.

Speaker 3:

If the first version that's done, done and out is also the last version, and that's extremely close to when we're going to put it live to the public, it's almost impossible to get confidence on that. So the kind of information we'd want there is: are there ways to get decision points? Are there versions of this we can back off from? Are there things we can get value out of sooner? We talk about that a lot, but it's hard to actually keep to that discipline.

Speaker 3:

So if you can get into the position where we could do a very small amount of this and you'd still get a lot of the value out of it, again you can make a super decent estimate, because I'd say: I don't know when the MVP will be, but it'll be a long time before we say we'll stop iterating on it, so we'll get a good number of rounds improving this once the first version's out. I love estimating things like that, rather than 'what's the absolute first version? Could you make it less? Can we cut it to the bone before you've even started?'

Speaker 1:

This episode is brought to you by Pendo, the only all-in-one product experience platform.

Speaker 2:

Do you find yourself bouncing around multiple tools to uncover what's happening inside your product?

Speaker 1:

In one simple platform, Pendo makes it easy to both answer critical questions about how users engage with your product and take action.

Speaker 2:

First, Pendo is built around product analytics, enabling you to deeply understand user behavior so you can make strategic optimizations.

Speaker 1:

Next, Pendo lets you deploy in-app guides that lead users through the actions that matter most.

Speaker 2:

Then Pendo integrates user feedback so you can capture and analyze how people feel and what people want.

Speaker 1:

And a new thing in Pendo: session replays, a very cool way to experience your users' actual experiences.

Speaker 2:

There's a good reason over 10,000 companies use it today.

Speaker 1:

Visit pendo.io/podcast to create your free Pendo account today and try it yourself.

Speaker 2:

Want to take your product-led know-how a step further? Check out Pendo and Mind the Product's lineup of free certification courses, led by product experts and designed to help you grow and advance in your career.

Speaker 1:

Learn more today at pendo.io/podcast. So there's a cognitive bias that comes up in this stuff, Neil. We all think we're pretty good at this; we all think we're in the top 50%, the top quartile, on things like this. In general, though, how accurate are we by default at estimation?

Speaker 3:

Oh, that's a good question. It's something that really shocked me when I looked into it. You'd think the problem is the super-precise 'to the hour, when will this be ready?', and obviously that's super hard. But even if somebody is enlightened and willing to say 'just give me a wide range, give me a 90% confidence range on this', there have been studies, and I've tried things myself.

Speaker 3:

Most people, if they took a bunch of random range questions and answered ten of them, giving a 90% range on all of them, should get nine out of ten right, just the way the luck and the statistics work out. So: how big is this, how long is that? Give your range so you're 90% confident the right answer's in there; make it as wide as you need to. On average, most of us get three out of ten or less. And you might blame the professional pressure: the boss wants it to be this, I'm meant to be a competent professional who can get things done, all that drive to make the numbers smaller that would bias it. But even with trivia questions that don't matter, just testing your calibration, 30% feels like 90% confidence to us. So no wonder we're in such trouble.
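The calibration scoring Neil describes is simple to sketch. The true values below are the ones that come up in this episode's quiz (plus Everest's height); the ranges are invented stand-ins for a typical over-narrow estimator, so this is an illustration, not anyone's real answers.

```python
# Score a set of 90%-confidence interval answers against the true values.
# Each entry is (question, (low, high) range given, true value).
answers = [
    ("islands in the Philippines", (1, 1000),     7641),
    ("weight of a £1 coin (g)",    (50, 500),     8.75),
    ("year Nina Simone was born",  (1890, 1930),  1933),
    ("height of Everest (m)",      (8000, 9000),  8849),
]

# Count how many true values actually fell inside the stated range
hits = sum(low <= truth <= high for _, (low, high), truth in answers)
hit_rate = hits / len(answers)

print(f"{hits}/{len(answers)} inside the range ({hit_rate:.0%})")
```

A calibrated estimator would land near 90% over many questions; these illustrative ranges score one out of four, much like the three-out-of-ten result Neil cites.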

Speaker 1:

Okay, so we've defined the scope of the problem. How do we actually get better at this? What are the first steps? Should we run the exercise, or are there things that we should do as a warm-up first?

Speaker 3:

I think jumping into the exercise might be the best way in. I'd heard about the problem for some time, how bad we are, and I'd seen it play out in real life: 'give me a wide range, what do you reckon?', and you're still wrong. So there's a book called How to Measure Anything by Douglas Hubbard that tackles this exact problem. What I like so much about the book is that it covers this exercise in real detail, with lots of stories of how he's used it and detail on how to run it. It's fabulous.

Speaker 3:

That alone would make a whole book for me, but it's one chapter out of lots and lots of similarly useful things that aren't even about estimation. What he talks about is becoming a calibrated estimator. The goal is that when I say 'I'm 90% confident the right answer's in here', and I might have to be very wide, I'm right 90% of the time. That's where we're trying to get to with the game. So you write out interesting trivia questions that don't matter, like 'how many litres of water are in the Pacific Ocean?', and give a range for each. The first time you do about ten of them, I think everyone gets a terrible score. It's interesting: even when somebody's told you how bad you are, and you feel like you're making your ranges wider, you still do it. And then there's a series of 'we'll try it again', with little thought exercises to run through, a few different ones. I found running through these is a bit like a pub quiz: you feel funny, you feel satisfied when you get something right. Two things I've found really useful about it. One, it really, really does help: we get better at it later.

Speaker 3:

But even if it doesn't, I've seen a lot of value in it, because often it's the same people in a team who get asked to estimate again and again, and there's a bit of a feeling that you're just being truculent: how hard can it be? This puts everyone on the same footing. If you get stakeholders to do it as well, everyone realizes that, whoa, saying what you think is really, really hard. The last bit that's good:

Speaker 3:

This isn't just about estimation. All kinds of value questions about the worth of doing anything are in a similar category: how many people will press this button we're talking about, how long it will take to go live, how much money might that bring in? It's a similar class of 'I'll give a guess', and business cases are my favorite genre of fiction. So we all get to realize we've all got hard problems, and getting asked to estimate is similar to all the other predicting-the-future problems. If we're all in this together, and we've all played a fun pub-quiz game together, hopefully we'll have a bit more empathy for each other later.

Speaker 1:

Okay, are you up for trying it? How many questions should we do? Five questions, just to get a feel for it?

Speaker 3:

Yeah, that'd be absolutely great. Okay, and I've pre-prepared some questions to quiz you. So the first one: how many islands make up the Philippines?

Speaker 1:

Ooh, definitely more than one, and I have no idea. I'm guessing dozens to hundreds, I'll say between one and a thousand.

Speaker 3:

Two and a thousand, actually.

Speaker 1:

I know it's more than one, so we can reduce the range slightly.

Speaker 3:

Cool. And the aim is 90% confidence. If we were asking for 100% confidence, you could say 'between 0 and 10 billion'; I know it's in there. With 90%, you should be very surprised if the answer comes in below your lower bound, and very surprised if it comes in above your upper bound. So, two to a thousand on the Philippines.

Speaker 1:

I mean, I think it's quite a bit lower, but it kind of depends: is a little tiny rock in the ocean considered an island in this case? So I'm just going to be really, really generous, give a number that I find ridiculous at the top end, and try to get to that 90%. But yeah, who knows?

Speaker 3:

Cool. The correct answer for how many islands make up the Philippines is 7,641.

Speaker 1:

I am very surprised.

Speaker 3:

That, right there, might be the equivalent of the platform migration somebody's talking about: you think you've given a comically large upper bound, and you are actually seven times too small. That's exactly what teams get into.

Speaker 1:

Okay, give me another.

Speaker 3:

So the first thing that helps with this is just called repetition and feedback. So doing another is exactly what we want, and the feedback of 'wow, I need it to be a lot bigger' has already helped you. A lot of us get no feedback, ever, on any of our estimates; nobody tells you exactly where you went wrong. So we're already ahead of the game for the next one. What's the weight of a one-pound coin? Now, you live in the UK, so you've seen some. That's good.

Speaker 1:

Okay, so yes, you're saying a one-pound-in-value coin, not a one-pound-in-weight coin. So the obvious answer would be one pound, but that's the dad joke, isn't it? I have taken lots of coins to the bank in the past, back when coins were still commonly used, and I could have sworn that a pound was a pound. But the problem is I'm also really bad at metric, and this is going to be a metric answer, I'm sure.

Speaker 1:

No, let's go metric; I'd be even worse at imperial now I've lived here too long. I'm still flabbergasted by the fact that the British use things like stone. Because I'm a big guy, I measure myself in boulders rather than stones. Anyway, proper answer: what is the actual weight? Somewhere between 50 grams and 500 grams.

Speaker 3:

Cool. And for this, before we accept that as your final answer, you can do the equivalent bet test. If I had it set up, I'd show you: you can actually make one with a protractor and a spinny arrow, I've done this before, a sort of pie chart, and there are also various online versions you can use, where you want 90% of it to be green and just a little 10% wedge to be red.

Speaker 3:

And what you can imagine doing is choosing one of two games to play. If the right answer is inside the 90% range you just gave, you win £1,000. Or, instead, we could spin this wheel, and remember 90% of it is green: spin the wheel, and if it comes up in the green part, you win £1,000. Unfortunately it's just a thought experiment, I'm not giving you money either way, but it works just as well without the real money, which saves me a fortune. If you were to choose between those two games, would you like to play the roulette-style spinner, or would you like to bet that the right answer is in your range?
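The equivalent bet test boils down to an expected-value comparison. A tiny sketch, using the £1,000 prize from the thought experiment and the 30% hit rate Neil cited earlier as the typical real calibration behind a stated "90%" range:

```python
PRIZE = 1000  # thought-experiment payout, in pounds

def expected_value(p_win, prize=PRIZE):
    # Expected winnings of a bet that pays `prize` with probability `p_win`
    return prize * p_win

wheel = expected_value(0.90)      # the spinner: 90% of it is green
my_range = expected_value(0.30)   # a "90%" range that really hits 30% of the time

# Preferring the wheel reveals that, deep down, you rate your range's
# chances below 90%. Widen the range until both games feel equally good.
print(f"wheel: £{wheel:.0f} vs my range: £{my_range:.0f}")
```

This is why the test works as a calibration tool: the wheel anchors what a genuine 90% feels like, and any preference between the two games exposes the gap.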

Speaker 1:

Oh, um. I'd much rather do the roulette spinny.

Speaker 3:

I don't feel that confident. And that's where most of us are, because we make our ranges too narrow. What you need to do is widen your range as much as you can, until you'd be equally happy either way: roulette wheel or your range, you're just as happy. Because that's what 90% means, and lots of us just don't have a sense of it.
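One way to see why the equivalent bet test works is a quick simulation. This is our sketch, not something from the episode: the `true_hit_rate` parameter is a made-up stand-in for how often answers actually land inside ranges like yours (for most people's overconfident "90%" ranges it is well below 0.9).

```python
import random

def equivalent_bet(true_hit_rate, trials=100_000):
    """Simulate the equivalent bet test.
    Bet A: win £1,000 if the true answer lands in your stated '90%' range
    (it actually lands with probability `true_hit_rate`).
    Bet B: spin a wheel that is 90% green; win £1,000 on green.
    Returns advice based on which bet wins more often."""
    range_wins = sum(random.random() < true_hit_rate for _ in range(trials))
    wheel_wins = sum(random.random() < 0.90 for _ in range(trials))
    if wheel_wins > range_wins:
        return "prefer the wheel: your ranges are too narrow, widen them"
    return "indifferent or better: your ranges really are 90% intervals"
```

If you would genuinely rather spin the wheel, the simulation makes the reason concrete: your range must be hitting less than 90% of the time, so widen it until the two bets feel the same.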

Speaker 1:

Okay, so a pound coin weighs how much? The suspense is killing me. 8.75 grams! And I said from 50, which means I have no idea what I'm talking about with grams.

Speaker 3:

Okay, so in a real session we'd have several more questions, but I'll just give you a couple. Let's say the third one: what year was Nina Simone born?

Speaker 1:

Whoa, good question. I saw Nina Simone sing once, in the late 90s. I would say she was in her 70s or 80s then. So, again, I'll go extra generous in case I'm wrong: between 1890 and 1930. Although if I had to narrow it, I'd say probably around 1910-ish.

Speaker 3:

So the extra trick you can try here is pros and cons: if you want, you can pick two reasons why the right number is absolutely, surely inside your range, and then two reasons why it really might not be.

Speaker 1:

I mean, I could say 1850 to 1950, but that just seems absurdly easy, and it's not very useful information at that point.

Speaker 3:

So how useful the information is depends on how much you know about a topic, and I think this is quite like real estimation as well. There'll be some things I ask where you'll know, to the foot, how tall Everest is, and other things you just don't know about. That's true of the work we're asked to estimate, too. Some things, you've done something just like it before and you're pretty sure it's between five and six days. Other things, you have to say: honestly, that's the most I know. And Nina Simone, because I'm really making you wait for it, was born in 1933, so your adjusted range just missed it. But again and again, it's the ones that feel like you've made it far too late; that's what you have to do.

Speaker 1:

Because she's deceased now, I feel like I can get away with saying it: she was in rough shape for her 60s. At the time I thought she was in her 80s, but she still sounded amazing. But yeah, wow, she had a life.

Speaker 3:

Yeah.

Speaker 1:

What a fantastic show. Okay, let's just do one more for giggles.

Speaker 3:

Okay, I'll give you one more, and there's a couple of different tools you can use, but here's a new one. The question you have to think about is: how old is Nintendo, in years?

Speaker 1:

Nintendo, yeah, in years. Good question. I believe, if I remember correctly, that they were not originally a video game company; they did something else. They were definitely around when I was a kid, and I think they were around quite a while before that. I need to give a range, so let's say between 40 and 150 years old, which is very large, but I have a high degree of confidence about that one.

Speaker 3:

Cool. And if you want to adjust it, here's one more tool; there's a few we could choose, but this one's called reverse anchoring. When we start with "oh, it's probably about 80" and try to go a bit either way, that's natural, we do it in lots of things, and the first number you think of pulls you towards it. This happens even if I just suggested a number: you would start from that number. And especially if your boss or an important stakeholder had said "about six months", you would not stray far from that.

Speaker 1:

So when you're in that meeting with other people and someone says no one's really sure, and then someone else throws out a number, that's not a really great way to start, is it?

Speaker 3:

Yeah, that's an anchor that ties us closely to it, and all your thinking will try to move away from it but struggle. You can try reversing that: say, what is a ridiculously big number? There's no way they were around in the Stone Age; it's definitely not millions of years ago. Then you come in from there, and at some point the numbers stop being ridiculous. They weren't around 2,000 years ago, we'd have heard about it if Jesus had Nintendo cards. At some point you hit a number where you say, that seems unlikely, but it's not impossible, and that's the edge you're looking for.

Speaker 1:

Okay. So in that case I'd say definitely not more than 250 years. That seems impossible to me. So 40 to 250.

Speaker 3:

Cool, and for this one the right answer is 133 years.

Speaker 1:

So I was actually pretty close in my original one, but that's not bad. I was right in that they weren't originally a video game company. Did they do playing cards or something like that originally?

Speaker 3:

Yeah, so it was, I think, 1890-ish when they first got started.

Speaker 1:

Yeah, that's a fun piece of trivia that I didn't need to know anything about.

Speaker 3:

What's in your brain now?

Speaker 1:

Fantastic. Okay, so we talked about a few things. We talked about anchoring and reverse anchoring. We talked about repetition and feedback and pros and cons. There was one other trick that I really liked, I think mostly for the name. What are the lines of hope and despair? Sounds like something from the Princess Bride.

Speaker 3:

Oh, I love it. So I learned about that from someone called Dr Sal Freudenberg, I like to cite my sources, and I asked her; she's not sure where she heard it from first. Somebody just came up with it, and I've been using it for years.

Speaker 3:

So say you've got a stack of stuff: you're wanting to get all these features or iterations in by some deadline, or you want to know how the team's going to do against an OKR, or for some other reason you've got a time-bound amount and you're trying to fit work into it. That's a really hard problem. What you do with this is make a force-ranked list. You might have a big team where people work on two or three things at a time, it might be a whole department, but ask: if it was an absolute disaster of a quarter, or whatever period you're looking at, with sickness and other problems, and we got exactly one thing out of it, what would you want that one thing to be? Work all the way down the list like that, so you can say: given the choice, you'd rather have this than that one. Then the lines come in. You use whatever you've got, your best data or an educated hunch, to say how far down the list you think you're going to get, and somewhere in there you put your line of hope. Above that line, it would have to be an astonishingly bad time if you don't get that far: in this amount of time, based on everything you know so far, you're getting to there, and that might be a certain number of releases down a story map or anything like that. Beyond that line, you've only got hope. So up at the top it feels pretty certain, you'd be stunned if you didn't get it, and past the line of hope you're merely hoping to get things, less and less as you go lower down.

Speaker 3:

Sometimes when people see that, they say "I don't want that to be just a hope, that one matters", and it makes them want to reorder the list. A bit lower down, through the hope section, things get progressively less likely, because this is the order of priorities, until you reach the line of despair: once you go past that line, you can just despair of getting any of these in this time. Again, that's super useful, because people say "no, I want at least a chance of getting that", and it makes it really, really clear what's going on. It separates the priorities from who's going to work on what, the big Tetris game. And it's also really useful because, if those lines are far apart, it communicates really clearly that we're not sure how fast we're going to go or how much work things are going to be.

Speaker 3:

We're uncertain. Ideally, over time, as you come towards whatever the deadline was, those lines move closer together and you're getting more certainty; if they don't, that's a signal something strange is going on. And it lets you do all kinds of things, like reorder and reprioritise. When a new thing comes in, is it more important than this, or more important than that? It's a single visual list of what's most important if we worked on it in order, even though you don't have to work in order: you can swarm on things, or divide them up however you like. That's interesting, and it goes into the idea that you don't want to do too much work too far ahead.

Speaker 1:

So all those things above the line of hope, the things you're pretty confident in or desperately need to do, you're making a real commitment to those: you need to do the work on them, have the discovery in place, have everything prepped and ready to go. Beyond that, you need less fidelity, at least until you cross a few things off. And the stuff below the line of despair, there's no point even worrying about it. If we're talking about a now-next-later roadmap, that's the stuff that's firmly in later, and you're definitely not doing it anytime soon.

Speaker 3:

Absolutely. It's super useful for pointing that out to people.
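The lines of hope and despair idea can be sketched in a few lines of code. This is our illustration, not from the episode; the item names and line positions are invented.

```python
# Hypothetical sketch of a 'lines of hope and despair' view over a
# force-ranked backlog: items above the line of hope are near-certain,
# items between the lines are hoped for, items past the line of despair
# are not happening in this time box.
def hope_and_despair(ranked_items, line_of_hope, line_of_despair):
    """Split a force-ranked list into three bands."""
    assert 0 <= line_of_hope <= line_of_despair <= len(ranked_items)
    return {
        "confident": ranked_items[:line_of_hope],        # stunned not to finish
        "hoping": ranked_items[line_of_hope:line_of_despair],
        "despair": ranked_items[line_of_despair:],       # firmly in 'later'
    }

bands = hope_and_despair(
    ["checkout fix", "search v2", "new onboarding", "dark mode", "AI assistant"],
    line_of_hope=2,
    line_of_despair=4,
)
```

The useful signal is the gap between the two line positions: a wide gap says, very visibly, "we're not sure how fast we'll go", and as the deadline approaches you'd expect the two indices to converge.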

Speaker 1:

I love that. Okay, neil, this has been fantastic. We've got time for one more question For anyone who wants to get better at estimation, starting tomorrow. Obviously there's the exercise that's in your blog post, that we'll have in the show notes. There's some other stuff, but what's the one thing you would suggest people start adding to their practice tomorrow? What's one thing they can do?

Speaker 3:

Oh, that's a nice question. Looking back, sometimes I've used a technique and it was less work, and sometimes the same technique seemed more effort, and I wondered: why can't I use this every time? The pattern I've spotted more and more is that where people just don't trust that you're doing a good job, where they want to use the estimate against you, where you're thinking "what do I have to look out for here?", nothing works. And the corollary is that where people do trust you, I've seen almost anything work: if it's used with good intentions, and we understand it's our best guess, we can do that. If you're in a high-trust environment, pretty much anything works and you're free to experiment; if you don't have that, almost nothing will. So, perhaps counterintuitively, if you want to get better at estimating, I might not focus on that at all. It's that sort of trust issue, and believing we're all on the same team.

Speaker 1:

That's what I'd go looking into first, if you feel it's not in place. Okay, I do have one last question. It's a quick one. Of all the people who are going to listen to this episode, Neil, what percentage do you think are going to get better at estimation and use these techniques?

Speaker 3:

Oh, I would guess roughly 30% of people. I feel like it's 90%, and I know intuitively that I'm going to be wrong about that, so let's say 30%. And if listeners could tell us, maybe we've got a scientific study.

Speaker 1:

Do let us know, people. We're really curious. I think 30% is actually a fantastic result. I'm of the opinion that anyone who uses this to get better is a great thing, and a wonderful thing that we've done for the world. I would hope it's closer to 90%, that would be even better, but my level of confidence is not quite that high. Neil, thank you so much. This has been fantastic.

Speaker 3:

Thank you, it's been lots of fun Cheers.

Speaker 2:

The Product Experience hosts are me, Lily Smith, host by night and chief product officer by day.

Speaker 1:

And me Randy Silver also host by night, and I spend my days working with product and leadership teams, helping their teams to do amazing work. Luran Pratt is our producer and Luke Smith is our editor, and our theme music is from product community legend Arnie Kittler's band Pow. Thanks to them for letting us use their track.