Quality Bits

Modern Testing with Alan Page

March 05, 2024 Lina Zubyte Season 2 Episode 14

More and more software delivery teams embrace the whole-team quality approach. Where does testing fit into that? Modern testing principles are there to help us understand what the industry is moving towards.

In this episode, Lina talks to Alan Page - one of the creators of modern testing principles. He's an avid blogger who has quite a few opinions on better software delivery. Tune into this episode to learn more about how modern testing principles can help to embrace the whole team quality approach, and more.

Find Alan on:

Mentions and resources:

Links to books are to Amazon, and as an Amazon Associate I earn from qualifying purchases.

Follow Quality Bits host Lina Zubyte on:


Follow Quality Bits on your favorite listening platform and Twitter:
https://twitter.com/qualitybitstech to stay updated with future content.

If you like this podcast and would like to support its making, feel free to buy me a coffee:
https://www.buymeacoffee.com/linazubyte

Thank you for listening! ✨

00:00:05 Lina Zubyte 

Hi everyone, welcome to Quality Bits - a podcast about building high-quality products and teams. I'm your host, Lina Zubyte. Alan Page is quite a well-known name in the testing world and beyond - especially for sometimes questioning the traditional testing definitions. One of the things he has been a part of creating is the modern testing principles. And this is exactly what we're talking about in this episode. You're going to learn more about modern testing principles, how to change your role if you're more of a traditional tester wanting to approach the more modern ways of testing (which is not only about testing, actually), and why psychological safety matters. Enjoy this conversation. 

00:01:07 

Hi Alan, welcome to Quality Bits. 

00:01:10 Alan Page 

Hello, it's good to be here. Thanks for having me. 

00:01:12 Lina Zubyte 

Could you briefly introduce yourself to those who do not know you? 

00:01:17 Alan Page 

Sure. My name is Alan Page. I live in Seattle, WA, in the upper left-hand corner of the US. I'm a big soccer fan. I love hiking - one of the great reasons to live in Seattle: lots of outdoor things to do, so I end up doing a lot of long hikes during summer. And then, I've been doing software for about 30 years and am currently leading a team called Engineering Experience, which is a platform engineering team at NBCUniversal. Before that I worked at Unity Technologies, and before that at Microsoft for a long time. 

00:01:49 Lina Zubyte 

Your nickname is Angry Weasel. Why? Are you very angry? 

00:01:54 Alan Page 

I'm not angry - and people ask, like: why are you so angry? I say: you don't know me, I'm not angry. It's a long story, but I'll try and make it short. My background is in music. I studied music in college, and actually my first job out of school was teaching high school band for four years. But I played in a lot of groups, bands - usually horn bands - like, I did all the arranging and played saxophone in the band. And we were doing a gig somewhere - we were a cover band, not anything big - and we were talking to some people (you could call them fans or not) after a show, and we were drinking, and they were telling us this story of some German roadway, and someone said: “Be aware of the tooth of the weasel!” I thought that was the funniest thing. And we started talking about “angry weasel”, and we wanted Angry Weasel to be the name of our band, but our guitar player vetoed it. But I had already obtained angryweasel.com, and I just kind of kept it in the background. And then my nickname in the band became Angry Weasel. 

00:02:52 

Actually, we all had weasel nicknames for a while - it's really funny. So fast-forward a couple of years later: I wanted to move my blog to a new location off of the Microsoft sites. I thought, you know what, I'm just going to make it on angryweasel.com. And I've used angry weasel for usernames and, of course, my blog, my Substack - angryweasel.substack.com - so I'm now angry weasel everywhere. But I'm not angry; it's just a nickname that I gained through happenstance. 

00:03:21 Lina Zubyte 

Well, we'll see in this conversation if you are angry or not. How angry you will get. 

00:03:26 Alan Page 

You'd have to try hard to get me mad. 

00:03:29 Lina Zubyte 

Challenge accepted! So first of all, the big controversial statement... You said that you used to work in QA, and now you're working more on engineering experience and platform, leading the platform team, and you said that you are making more impact on quality in this role. 

00:03:46 Alan Page 

Yeah, I am. 

00:03:46 Lina Zubyte 

Why?  

00:03:48 Alan Page 

Well, I didn't mention in my intro... I spent a big chunk of my career doing dedicated testing, and in that time, I wrote a lot of cool tests. I did a lot of cool testing. I wrote a lot of great frameworks. Towards the end I ended up coaching developers on how to test. But then I moved into more of a role of leading teams and building the infrastructure - and it was actually a really close transition. I built test tools for the team. And then I got devs to use those test tools. And then I looked at things that were blocking quality. I started working on CI systems before we really had CI. I started adding gates to our build system so that we could detect more bugs during the build, and that sort of developed into me running teams that built our CI systems - teams that focused on how we got, as I used to say, from developer desktop to customer desktop. 

00:04:47 

And I realized, in controlling - not even controlling, but being a part of defining - which tests were run and how we evaluated risk during the CI process... that I had more control over end-user quality with that overview than I ever did as a tester. I still believe that. And then I was talking to Bryan Finster a month or so ago, and he mentioned something that's absolutely true - I never thought of it this way before. He said: the best way to highlight where the quality issues are in your system is to try and do continuous delivery. And that's what I was trying to do back then. I never made the connection, but yeah, when you try and deploy as often as you want and keep high quality, it highlights where all of the issues are in your system. I focus a lot on velocity, and especially adaptability, and when you try and deliver quickly with quality.... 

00:05:48 

Any problems in the system really highlight themselves. So by looking from that angle, yeah, a lot more influence over quality - quality being the customer’s evaluation, the customer’s feeling of using the product. I think a lot of times when I was a tester, I focused on testing: I'm going to do really good testing. And customers don't care if you do really good testing. I gave a talk at a STAR conference 20 years ago. 

00:06:15 

I mocked up this box - back in the days we used to buy software in boxes - and I put bullet points on it like “85% code coverage” and “over 90% of the tests passed”. You know, these stupid things we measure as testers as if they influenced quality. They're not even really a proxy for quality. Quality is really only evaluated by the customer. So I really focus on understanding what the customer is getting out of the product, how we could learn from them, adapt, and improve quality, versus just focusing on the testing part. 

00:06:47 Lina Zubyte 

I really like the idea of trying to get continuous delivery in. I've heard of this before as well: ask the team why they're not doing it, and likely you will find a bunch of quality problems. I think the most common one is: “Ohh, we don't have quick enough feedback. We're still doing this huge manual regression testing cycle before each release.” And that is already information on how we could build a better-quality product. 

00:07:16 Alan Page 

Well, I think it's important to ask, is there a better way? Fast feedback loops are so important, whether they're from the developer doing their check in or getting feedback from the customer. If there is something blocking our ability to get a fast feedback loop, we have to stop and ask ourselves, is there a better, more efficient way to do this? 

00:07:36 Lina Zubyte 

I love feedback loops. I started drifting off, thinking that maybe my title should be related to feedback loops. 

00:07:43 Alan Page 

Chief Feedback loop optimizer! 

00:07:46 Lina Zubyte 

Wow, I like that. 

00:07:48 Alan Page 

I just made that up, yeah. 

00:07:49 Lina Zubyte 

Yeah, but most people wouldn't really understand what we mean when we talk about feedback loops, because it's so many areas. It's not just testing, as you say. And one initiative, actually, I would say that you have is modern testing. I really like that in the definition of modern testing you say: “It's not that modern, and it's not that much about testing.” 

00:08:10 

And I love the kind of mixed feelings of this, and how I found myself in this state as well. I started in a very traditional kind of testing role. And then I went forward realizing: hey, there's much more that I can do. And sometimes there are reasons why we find this bug, and it's not what I think of - it's not just something that is my first guess. So tell me more about modern testing. Where did it come from? How did this idea come to your head? 

00:08:41 Alan Page 

Yeah, it came to... And I have to give Brent Jensen, my podcast partner, you know, full half credit. Brent and I started our podcast 8 years ago, I think. And one of the things we wanted to do in that podcast was talk about what we were seeing happening in the industry. And of course, Agile has been around a lot longer than that, but... We were both at Microsoft then. Oh, it's been longer than eight years - I've been gone from Microsoft for eight years. 10 years? I don't know, maybe it's been 10 years... And we wanted to talk about what we were seeing. Brent had just left a testing role, and I was still in kind of a testing role. 

00:09:16 

But we were seeing things change. We were seeing software delivery change. Brent was working on Bing, which was delivering hundreds of times a day. I was on Xbox, or Xbox Live - it was delivering, you know, several times a day. And what we were seeing there was: boy, the role of test is changing. We weren't trying to define anything. Our goal at the time was to describe what we were seeing, so that other testers would know what that path may look like for them. 

00:09:47 

So we talked about modern testing. It was really not about testing - or rather, we were talking about traditional testing, and we started with that. Traditional testing - maybe test-last, maybe not, but it was siloed testing: doing a bunch of testing, blocking the release, being the gatekeeper, trying to test quality into the product. That was our sort of straw man of what traditional testing was. 

00:10:15 

So we wanted to talk about: what does that look like in this world we were seeing of fast feedback loops and quick delivery and adaptability? And so we talked about that as modern testing. They're really modern delivery principles. Interestingly - not intentionally - they line up pretty well with the Three Ways of DevOps, which were coming out around the same time. But it was: how do we look at what testers are doing today? How should they think about things moving forward? 

00:10:42 

So we came up with these modern testing principles - and really, we didn't just start with them. We talked about modern testing for a year at least before I read Ray Dalio's Principles and thought: you know, we should have principles that sort of align around what we talk about - a very agile approach. I sketched some things out. There may have been eight at first, or six; it wasn't seven. And we shared them with our community, and we iterated and iterated, and we came up with these seven principles around modern testing - which are really around modern delivery - which really line up a lot with fast feedback loops and focusing on improving the business versus focusing on improving testing. I think a lot of testers still focus on, like I said before, doing better testing instead of helping improve the quality of the product. 

00:11:32 Lina Zubyte 

You know, those principles have so many key words that I like - for example: bottlenecks, continuous improvement - and they really resonated with me. It's something like a manifesto of sorts, right? A list of principles to refer to - and, I mean, for the role that I'm in as well, to reflect on - so it's extremely helpful to me. We're not going to go over all those principles, but just to mention a few: the first one is “Our priority is improving the business”. And one more I wanted to actually quote is: “We are a force for continuous improvement, helping the team adapt and optimize in order to succeed, rather than providing a safety net to catch failures.” This one hits, because at the start of my career I was the safety net, and this idea of making yourself obsolete, of not being the safety net, is still so relevant nowadays as well. I still work with QA teams who say: yeah, but if we're not involved there, if I'm not involved there, how will we know what features are there? Or how can we control this? And it's this almost psychological thing of letting go of control, and it's really hard. 

00:12:53 Alan Page 

Yeah, I see that today even managing platform engineering. I had a team come to me last week and say: you know, we're really dependent on part of your team to get this feature out - you do some platform work, you know, some Kubernetes or platform work. And I told them the same thing I would tell people 10 years ago about testing: we exist to accelerate your team. I never, ever want you to depend on us. You have to be able to deliver on your own. It's faster with us, but you can't depend on us. And that's really just an evolution of where I came from in testing, which was: I don't want you to depend on the test team or the tester. And I believe testers today shouldn't put themselves in that position of being the bottleneck or the handoff. They should help accelerate the team's ability to do great testing. 

00:13:36 Lina Zubyte 

When you say in the principles “we” who is “we”? 

00:13:42 Alan Page 

When we first wrote them, we were thinking “we” were testers who were moving into more of a different role. Recently, on moderntesting.org, where I have these written down, if you scroll a little bit past the modern testing principles, I have another version I've written where “we” is the team. Because I think it's really more relevant now. It started off with “we” being, OK, testers moving into a new world, but “we” is the team now. 

00:14:10 Lina Zubyte 

Makes sense, yeah - because of the whole-team quality approach, right? You could say. 

00:14:13 Alan Page 

Yeah, yeah. Remember where we started? It was: how do testers navigate this new world that's happening? It's happening whether you want it to or not. And the thing Brent and I always bring up is that we didn't invent these principles hoping people would adopt them. It's really just documentation of what we were already seeing. This stuff was already happening. It isn't some weird stuff we came up with - we're just documenting. 

00:14:38 Lina Zubyte 

I like those principles. The first time I read them was years ago. And what is a little bit challenging there is the how. How do you actually do this? You know, because it sounds great - and there is this moment in your career when you did it the sort of traditional way, you could say, and then you start realizing more and more that this is the way I would want to go. And I remember finding myself in this place where I would be like: where do I start? How do I do this? When it comes to the response to these principles, what is the most common reaction that you hear from people? 

00:15:19 Alan Page 

It's kind of all over the place. Testers - people who are focused on testing almost as the means to the end - really hate principle #5, where we say that we believe the customer is the only one capable to judge and evaluate the quality of our product. And to me, that's so obvious, because quality is nothing without the customer. But people really get up in arms about that one. We had no idea when we documented it, 'cause we're big fans of Eric Ries and The Lean Startup and how the feedback loop involves the customer. But that one really sets some people off. 

00:15:54 

As far as reactions go, maybe that's one. But going back to your “how do you start” question... To me, it all starts with getting rid of handoffs. Like, instead of testing something for a developer, go test with the developer - and to me, that is the gateway into the rest of these. Like, I don't know how you can say it's whole-team quality if you're not testing together. To me, that's what testers do today. And principle #7 - I'm not going to ignore it - says that we expand testing abilities and know-how across the team, understanding that this may reduce or eliminate the need for a dedicated testing specialist. And that scares people: where does my job go? 

00:16:37 

And I have made a career out of working myself out of a job, because in doing that you always make a new job for yourself. A lot of testers today will continue testing, but rather than test in isolation and prepare a report that they give to the stakeholders, they're testing along with developers, helping them do better testing. And 100% of the time when I have paired with developers, they have done great testing and I've learned more about the product, which has made me a better tester. So that's the gateway to all of this: just start pairing with developers, test together. 

00:17:13 Lina Zubyte 

I really like that. I think it's a little bit scary sometimes, because we like to put people into boxes: what is very technical, what is not technical, my skill set is different than yours. And then... but that's exactly the charm of all this pairing: that you can learn so much that you don't know, actually. 

00:17:33 Alan Page 

Yeah, there is so much of software development that’s better done collaboratively. You know, when I worked on Windows years and years ago, we had developers who would lock themselves in their office for 48 hours straight and come out with a new caching system for the kernel. Sure, that still exists, but most software today is built collaboratively. We talk about things, we try and make sure we're building the right thing, we try and make sure we're solving the right problem - and if you keep on doing all the little pieces of software development in isolation, it's slow and it's ineffective. We talked about the divide, you know, testers versus developers - I think we're mostly past that - but I view testing as part of development. Development is the whole big thing we're making; testing is a part of that, not a separate thing. 

00:18:21 Lina Zubyte 

I also recently started getting triggered by the story life cycle, which is a line. Because it should be a cycle - you learn from it and you come back somehow. Even though, of course, it's one item; but overall, the product... We should learn from it once we release it. And the testing stage is another thing - yeah, I also always try to remind people that, hey, I wish this was actually part of development. But it's a difficult fight sometimes. Is that one of the conversations that, somehow, you keep repeating with people? Or is there something else that you wish you had to have less when it comes to product quality? 

00:19:04 Alan Page 

To me, it really comes down to keeping the conversation going. I'm a big fan of retrospectives. You know, I've told my team, I've told hundreds of people: after your one-on-one with your manager, your team retrospective is the most important meeting on your calendar. Because that's where you talk about where things can get better, how you can get faster feedback loops, how you can work better together, how you can improve quality together. So I forgot your question already, but it reminded me to talk about that. I think just reflecting with the team on how we improve quality, how well we adapt to feedback, and our fast feedback loops... Those are the most important conversations you can have, and the most effective at improving quality on the team. 

00:19:45 Lina Zubyte 

I love retrospectives. I always think of this quote - I don't know who said it. Mark Twain, let's say, because all the quotes go to Mark Twain: “If the team does not want to have their retrospective once a month, they should have it every week” - basically, because they have a problem. 

00:20:06 Alan Page 

Yeah, I think that the Mark Twain quote is, “I was going to write you a short letter, but I didn't have time. So I wrote you a long one instead.” 

00:20:13 Lina Zubyte 

Yeah, it's not Mark Twain, though. I love this quote. 

00:20:17 Alan Page 

I've always heard it as Mark Twain, but that you're right, maybe it's not. 

00:20:20 Lina Zubyte 

Yeah, I actually went down the rabbit hole for this one, because I was like: it cannot be Mark Twain. But I love the quote. Yeah: “I would have written you a shorter letter, but I did not have time.” 

00:20:29 Alan Page 

Yeah - we were going to do a retrospective once a month, but we didn't have time, so we do it once a week. 

00:20:35 Lina Zubyte 

We have to basically. 

00:20:37 Alan Page 

Yeah, but it's just so important. 

00:20:39 Lina Zubyte 

Yeah, it's extremely important. I love retrospectives, and I'm glad you raised this - and it is about the conversations, right? So, to come back to one question I wanted to ask: what is the conversation you wish you had to have less when it comes to software quality? The one you keep repeating, where you're like: oh, come on, again this? 

00:20:58 Alan Page 

You know, it's different, but the conversation I seem to have on the Internet the most is this... And it's going away; it's getting better, I think. I don't know if my bubble is moving or if the industry is changing, but the conversation I wish I had to have less is whether or not it's possible to ship high-quality software without dedicated testers. Or the flavor of that: people flat-out telling me you can't ship high quality without dedicated testers. This whole idea that quality is only something that exists with testers, or all these flavors of this. 

00:21:41 

I've talked about it a lot - the idea that you can do quality without testers, again, as it’s evaluated by the customer. And some long-time testers get really mad, and they tell me I'm harming the craft of testing by making such a statement, and things like this. And that developers can't test because they don't have the right mindset - which is all a bunch of crap. Some of the best testers I've known have been developers. They just need a little bit of a push, and some don't even need that - some are great without it. 

00:22:16 Alan Page 

And I just don't believe in any of that. I believe that there is a need for testing expertise, and just ripping testers out of a team is a horrible approach. But I think having a dedicated test team that does all of the testing is also a horrible approach. So it's just those conversations around ownership - a little bit of the flavor of the conversation I just told you about around platform engineering: no, we're not going to do your company's files for you; you can do them yourself, or come here when you get really stuck. 

00:22:48 

Same thing with testers. I want a dev to come to me and say: hey, I'm having a really hard problem figuring out how to test the way these two components work together. Great - my expertise. Let's go talk about that; we can ask some reflective questions, we can kind of get to the bottom of it, we'll figure it out. I want to have more of those conversations. Now that I'm thinking about it, there are so many conversations I hate. One other thing: Accelerate, the book by Nicole Forsgren and others - people take the DORA metrics away from that a lot, which are fine, they're very good. But one thing in that book is research that shows that automated tests owned by the dev team have a high correlation with product quality, whereas no such correlation exists when a separate test team writes those automated tests. And I look at that, and I look at what's in my LinkedIn feed, and so many teams are writing test automation in isolation from the developers. It's just... 

00:23:51 

That conversation bugs me as well, because I bring up the fact that this would be better served if the developers wrote those tests. And then people tell me: oh no, the developers don't have time to write those tests, they're too busy developing software; or: it doesn't make sense to have the developers write those tests when they're paid so much more than the testers. And... my jaw drops, I shake my head, and I'm like: look at the data. Try and speed things up. But we're a ways away from that. We still have tens of thousands of testers writing Selenium tests all day when the data show it's not a good idea to do so - yet they continue. I'm tired of that conversation too, because that's one where I have managed to... I won't say win - it's not winning or losing. I feel like I've had some influence in every other area, and this last one's weird, because there's data to back me up, but it's the hardest battle to win. They dig in their heels: nope, we need to have a dedicated team just for writing automation for all this work done by the developers. 

00:24:49 

And it just absolutely blows my mind. 'Cause what I found in coaching developers to write thorough automation - and not just unit tests... A lot of times when I tell people the developers need to write the vast majority of the automation, they say: you mean just the unit tests? No, I mean every single automated thing. And when they do that, they write more testable code - and testability is highly correlated with good design. So you end up with better-quality code just because it's more testable, because they wrote it that way in order to test it. And people are just lost on that, and it blows my mind. That's the conversation I'm getting tired of having the most, 'cause for some reason I can't get any headway there. 

00:25:31 Lina Zubyte 

Recently I read a book about Edwards Deming, and one of the things that he would always do is talk to the management about quality, not with the workers. I feel like I do this so much now, working in more of a, like, broader leadership role, whatever that is, because a lot of this is systemic. What you're talking about - this fact that, you know, the devs are too busy and they are somehow put on a pedestal... This is coming from the whole team, and likely even from management’s idea of them, which then reflects on them. So this tester person likely cannot even challenge it - it's sort of not in their power to challenge this - and they internalize it as well. And they say: hey, yeah, it's my job, and it's not me. 

00:26:20 Alan Page 

Another thing that came out of Deming's work and Lean was the idea that any worker on the assembly line, if something simply didn't look right, could push a button and it would stop. And I don't believe most testers today live in a psychologically safe environment where they can do that. They feel like all they can do is test and send the test report. And if you're stuck in that world - again, going back to collaboration - if you have a very collaborative development process, chances are you have higher psychological safety and people can feel free to raise their voices. I think what we could learn from Deming is that everybody needs an equal voice in how the software is delivered. I think that's also lost on a lot of teams. I feel like they don't have a way out, because they feel like they don't have a voice to change what maybe their gut knows is wrong. 

00:27:10 Lina Zubyte 

Yeah, because we're so siloed and somehow isolated that maybe we're more in survival mode, and we do not feel safe to get out of it. 

00:27:20 Alan Page 

Yeah, it's funny you bring that up. Silos in software development are a huge thing. And another one of the reasons I like working in engineering experience - developer experience, platform engineering, whatever you want to call it - is that it's a team of silo breakers. We make teams work together; we make them learn from each other. And I really love that. Even though I'm not part of the delivery pipeline for these teams, I make sure they have platforms and tools to get their development done. By making them learn from each other and providing a core set of tools, I'm still having a larger impact on quality than I ever did as a dedicated tester. 

00:27:57 Lina Zubyte 

I really like this idea of collaborating and working together. There was an article that you wrote exactly about this topic, called “Don't blame me”, and you say that we do not need a dedicated testing specialist. And there's one quote that I wrote down here, which said: “Functional correctness is the responsibility of software developers and it doesn't help in the long run if you are cleaning up after lazy developers.” Well, first of all, the lazy-developer stereotype is there. And sometimes we allow them to be lazy, as we were talking about as well, because they're too busy, they have to write new features. As a QA, very often in my career, I find myself more and more in these clean-up roles. The most recent one was: I joined a project and their JIRA is a mess. 

00:28:50 

They have so much noise that they do not understand anymore what is important or not. So I go on and clean up. I go on and I poke and I say: what's the state here? Is this actually in review for two years, or is it forgotten? And people do appreciate this, but sometimes I feel like a kindergarten teacher, you know - going and, like, checking: why are our feedback loops so slow? So, one side of the coin: am I a modern tester, because I'm looking for, you know, bottlenecks and trying to sort it out and trying to understand the process and maybe improve the process? Or am I a part of the problem, because I'm cleaning up after people? 

00:29:34 Alan Page 

I don't think there's a dichotomy of whether you're a modern tester or not - it's a spectrum, right? You are working on improving the business, so you're following at least one of the principles, and also there's a limit to how much you can do at once. So one thing I don't do with the modern testing principles is deem someone: you are a modern tester, and you are not. And I know - and I won't say any names - there are people in the industry who deem others: you are a tester and you are not, because you don't follow things the way I do. I don't believe in that. I believe in trying to deliver better software, more efficiently, to customers so they can get stuff done. 

00:30:15

So there's no yes or no. I think you do what you can; the principles are there. There are a couple of things you said that I wanted to build on a little bit, and I'll see if I can remember them all. I'll come back to the quote last, 'cause that's easiest to remember. But you talked about JIRA being a mess, and this is something I've brought up with a few people recently, but I want to bring up today as well... 20 years ago at Microsoft, another developer, Bill Hanlon, and I found a pretty high correlation between one interesting metric and the sort of engineering maturity of a team - like, how much they really cared about delivering quality software. I'll go back to Zen and the Art of Motorcycle Maintenance: care and quality are two sides of the same coin. There's the whole book in a nutshell. 

00:31:02 

What we found was that for teams that really delivered predictably and with high quality, there was a super high correlation with average bug age. And we would actually track that, and it was just amazing. If you look at all the bugs in the system and just track average bug age, there's a correlation: we would find teams that had thousands of bugs that had been open for two or three years, and those teams were delivering crappy software. And a lot of agile teams today don't even use a bug database (that's debatable, whether it's good or not, for history and things), but they would just either decide to fix things or decide they weren't worth fixing in the moment, and they wouldn't waste all that time re-triaging the same bugs over and over. And they had very predictable, high quality as perceived by the customer. So it's really interesting. So JIRA being a mess, like... is this thing really in review for two years? Probably not, and probably their software sucks. 
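That "average bug age" signal is easy to compute yourself. Here is a minimal sketch in Python; the function name and the bug dates are invented for illustration, not taken from any real tracker:

```python
from datetime import date

def average_bug_age_days(open_dates, today):
    """Average age, in days, of all currently open bugs."""
    if not open_dates:
        return 0.0
    return sum((today - opened).days for opened in open_dates) / len(open_dates)

# Two recent bugs plus one that has lingered for two years
# drag the average way up - the smell Alan describes.
bugs = [date(2024, 2, 20), date(2024, 3, 1), date(2022, 3, 1)]
print(round(average_bug_age_days(bugs, today=date(2024, 3, 5))))  # → 251
```

Tracked over time, a rising number suggests bugs are being re-triaged rather than fixed or closed.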

00:31:59 Lina Zubyte 

This is what I did, you know: I created dashboards and visualized certain aspects of it. For example, how many bugs haven't been updated for more than six months? How many are still in progress? And there were high numbers for those. And bug age is an amazing one, and the team actually likes it, because I'm trying to show them that if we react quickly enough, the age goes down, so we're no longer in this huge spike. We're actually improving. As well as this clean-up kind of chart that motivates the team too. So far it has been extremely successful, actually: the age is decreasing and they finally can react to things. 
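A rough sketch of that "not updated for more than six months" dashboard filter; the issue keys, the dates, and the 182-day threshold are all assumptions for illustration, not from the project described:

```python
from datetime import date, timedelta

def stale_issues(last_updated_by_key, today, max_idle=timedelta(days=182)):
    """Return keys of issues whose last update is older than the threshold."""
    return [key for key, last_updated in last_updated_by_key.items()
            if today - last_updated > max_idle]

issues = {
    "BUG-101": date(2023, 1, 10),  # untouched for over a year
    "BUG-102": date(2024, 2, 28),  # updated last week
}
print(stale_issues(issues, today=date(2024, 3, 5)))  # → ['BUG-101']
```

Counting the result over time gives the kind of clean-up chart the team can watch go down.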

00:32:44 Alan Page 

Yeah, it's a good aspect. If part of your house is messy, the rest is probably messy too. If it's very clean, likely the rest of the house is clean as well. Of course, you could game that and just close all your bugs, so average bug age is zero while everything else is still messy, but it would quickly fall back. It's like when you're a kid and you clean your room so it's super clean, and two days later it's messy again: you don't have a system in place to keep things under control. That's OK. We're kids. 

00:33:09 Lina Zubyte 

Exactly. Yeah, the system, that's again the continuous improvement kind of thing that I think relates to this lazy developers idea. The current process should have these continuous improvement habits. It should have quality habits in order not to become a mess again. Be it pairing, coding together, writing tests. It's not that you just write it now and that's it, and then you just park it somewhere. 

00:33:39 Alan Page 

Yeah. And going back to the original quote and the lazy developer: it's a co-dependency. The lazy developer says, here's my code, please find my bugs, and the tester says, ooh, here, I found your bugs, and they feel validated. And the developer says, thank you for finding my bugs, and they go back and forth, and it's just dumb. 

00:33:57 Lina Zubyte 

Oh my goodness, I think the next degree I need to get is in psychology. We learn so much in this role about ourselves, our lives, and all kinds of things. 

00:34:07 Alan Page 

Yeah, you know, at this stage of my career, it's all just... programming's easy, people are the hard part, handling the people. But we've got to get rid of that dependency between developers and testers. Again: accelerate the developers, help them do better, don't do their work for them. And what I'm finding more and more now is developers who know that testing, all of it, is part of their job, but they're not saying get rid of testers. They say the tester's job is to help the developers do better testing: to provide that expertise, that consultancy, that person to bounce things off of. Like, here's a really hard testing problem, help me figure it out. That's where they do really well. 

00:34:43 Lina Zubyte 

I agree that there are actually developers that could be even better testers than most testers. I think sometimes we get stuck in certain roles, but thinking of the people I've met in my career, there were plenty who had the role of developer but were much better testers than a lot of the testers in that company. And here I think there's a fear, naturally, for those testers in the company, if they go to a developer who knows how to test and say, hey, I will share this knowledge with you, let's pair together. Sometimes I'm thinking that they would be eaten alive if they're not ready to pair with a developer who is actually strong in testing and to contribute; then it would be a disaster. Is there something everyone could do to upskill themselves, so that they feel more confident pairing with developers and with anyone, actually, who is really good at their job and maybe even better, you know, at certain people's jobs? 

00:35:55 Alan Page 

And I think it comes back to that people aspect: learning facilitation skills, learning active listening, learning critical thinking. I have talked to plenty of people, and it doesn't matter if you're with someone super smart: you can always ask questions. Don't be afraid to ask questions, and good questions, coaching-type questions. I had an employee last week who just kept posting little brain dumps to Slack, and I just asked clarifying questions. And afterwards he said, I really like these questions you ask me, they make me think... I'm just coaching. So, in one of his books, Richard Feynman tells the story of how he was working on the bomb. 

00:36:35  

And he was sent to go review some blueprints at some other site somewhere else in the country. I forget, it's been years since I read this, but he went there, and he was at that awkward place in the conversation where he didn't know anything about blueprints. He couldn't tell if the little X's were vents or windows. And it's that space we've all been in, where the conversation has gone on too long and it's awkward to ask the question. But he was unafraid, so he asked the question anyway. He wanted to figure out what it was, so he stuck his finger on the blueprint and said: what about this vent right here? And he was ready for them to tell him, Mr. Feynman, that's not a vent, that's a window. But all of a sudden the folks started flipping through the pages of the blueprint, up and down and up and down, and they said: Mr. Feynman, you're absolutely right. We're going to look at this right away. Whoa. And I didn't know I was trying this, but I was on a job at Microsoft once where I looked at all the code coming in. Actually, one of the best ways I've coached testing is to just do a bunch of code reviews and ask people: where are the tests for this? What kind of tests are you going to write? Maybe your test shouldn't do ten things at once? Those kinds of things. 

00:37:36 Alan Page 

I did look at the PRs coming in, and there was one adding functionality for a feature I was kind of waiting for, one that would be interesting. And I looked at the code review and I didn't understand how it solved the feature, but it was also kind of complicated code, and I thought maybe I was missing something with the imported library. So I just asked the developer. I sent them a message and said: hey, when you have a chance, can you walk me through this code? Because I'm not quite sure how it solves the problem, and I'd love to learn more. Again, approach with curiosity. A lot of folks would go into that and say: this looks wrong, come explain how it works. But I assume that people are smarter than me all the time. Anyway, he wrote back a half hour later and said... 

00:38:17 

Thanks for pointing it out. I missed a big chunk of code in this check-in. I'll get it fixed and get it back to you right away. What? So that's the long story. The short story is: even if you're going to meet with that senior architect who understands exactly how every single change is going to affect the system and thinks about testing it all the time, there are always questions you can ask. How will we know if this is broken? If they have a good answer, then I've learned something. 

00:38:46 

If they go, oh, well, we wouldn't, but we could add a monitor here and that would tell us: OK, great, we should probably do that. Again, I don't have to know how the thing works, but I want to understand... Here's what I'd love to ask: how will we know if customers are being successful with this feature? And you know, I used to joke at a previous job about forum-driven development: people would ship things, then wait for people to go to the forums about the software product and complain about it. A very slow feedback loop. So just ask questions like: how will I know if this is helping them? How will I know if this is working? What would happen if this stopped working? And they may say, oh, it can't stop working. Oh, cool, tell me more about that. So approach those conversations as learning. And my experience tells me that even though I may get the most learning when I'm approaching that really senior person, more often than not we will find something they can improve, just because having a conversation about it forces them to think about it in a different way. So, rewinding way back to the question: focus on facilitating, asking questions, emotional quotient. People skills. 

00:39:55 Lina Zubyte 

I love asking questions. That was on my business card at some point: Question Asker, QA. 

00:40:01 Alan Page 

QA is Question Asker!

00:40:04 Lina Zubyte 

Yeah. Asking a good question is a very powerful skill, not always easy, and I think more often than not we're afraid to ask questions. 

00:40:15 Alan Page 

It comes back to psychological safety because when you ask a question, it implies you don't know something and people are afraid to admit they don't know things. And if you are in that world, you're stuck. 

00:40:27 Lina Zubyte 

And we may also not question someone's work when they really want some feedback; they want to understand whether it's working or not. A recent example I'm thinking of: in one of the projects I worked in, there was first a code review and then a testing column. There are two places where someone could ask: where is the test? Where is the automated check? And there's no habit of asking about test code in the pull request, and there's no habit of asking about it in the testing stage, because the tester may just do a manual check. They're not checking the code, and then there is this culture where nobody's asking about it. 

00:41:15 

It's not that they wouldn't want it. They would want it. But they don't have the habit, and they're a little bit afraid. They may think: oh, maybe I don't understand something. Maybe it's difficult to add this test; maybe this is why it wasn't added. So we're sort of holding ourselves back from asking. But I think that is a very powerful question I've learned to ask. Maybe I don't understand a line of this code, but I can ask: is there a test here that I could see? 

00:41:44 Alan Page 

Yeah, 1,000,000% yes. And I mentioned that story earlier about code review; that was a question I asked. I just looked at code reviews coming in. I sat in my office, didn't even walk around. I sat there and said: where are the tests for this? Because we had a team early on; it was a science project, not a done product, but I would ask people: what are the tests for this? And I asked enough that they started thinking, ohh, Alan's going to ask where the tests are, I'll write the tests. And the next time I'd look at a code review, I'd say: what are ways this test could give a false positive? Or whatever. I'd look at the test and, like I said, they'd make all the mistakes beginning test automators make. I'd just ask questions, and after a while, after seeing me ask all those, they would anticipate those questions and ask them themselves. Eventually they wrote really good tests. 

00:42:24 Lina Zubyte 

I had the same. Yeah, they would be like: Lina will ask about this, so let's add it. And I was like: yeah, I'm not even saying anything. 

00:42:33 Alan Page 

Yes. It works! And I think a lot of times people are just afraid. Or, if testers think that writing tests is their job and their job alone, they're never going to ask that question, because it's job security. 

00:42:44 Lina Zubyte 

Yeah, I think psychological safety is a very, very big point, which now leads me to this question... What is the one piece of advice you would give for building high quality products and teams? 

00:43:00 Alan Page 

I love this question because it has nothing to do with software testing. It's about building a culture and an environment, and you know, my principles for leadership are: you focus on building that environment. And I focus on three values. Transparency: don't hoard information, tell people what's going on. Accountability: talk about what you're working on, what you're going to deliver, and talk about your mistakes. And then psychological safety. Those are three points of a triangle: transparency, accountability, psychological safety. If you focus on those and build a team that works that way, the team cannot help but deliver high quality software. But you've got to focus on the team first. A lot of organizations take this feature factory approach where they focus on delivery, delivery, delivery and hope the team falls into place. My viewpoint is you build the culture and the team (a little bit of Ted Lasso in there, maybe), but you try to build a good culture and a good team, and quality will happen. 

00:44:02 Lina Zubyte 

Wonderful. Thank you so much for your time. I really enjoyed our conversation. 

00:44:07 Alan Page 

Yeah, it was great hanging out with you today. Thanks for having me. 

00:44:10 Lina Zubyte 

That's it for today's episode. Thank you so much for listening. Check out the episode notes, subscribe, and until next time, do not forget to continue caring about and building those high quality products and teams. Bye.