
Code with Jason
265 - Software Design with Paul Hammond
In this episode, I chat with Paul Hammond about effective testing strategies, the joy of working with well-designed TDD systems, and how synchronous collaboration improves code quality. We examine what true agility means and how technical excellence enables fearless releases and sustainable development.
Hey, it's Jason, host of the Code with Jason podcast. You're a developer. You like to listen to podcasts. You're listening to one right now. Maybe you like to read blogs and subscribe to email newsletters and stuff like that. Keep in touch.
Speaker 1:Email newsletters are a really nice way to keep on top of what's going on in the programming world, except they're actually not. I don't know about you, but the last thing that I want to do after a long day of staring at the screen is sit there and stare at the screen some more. That's why I started a different kind of newsletter. It's a snail mail programming newsletter. That's right. I send an actual envelope in the mail containing a paper newsletter that you can hold in your hands. You can read it on your living room couch, at your kitchen table, in your bed or in someone else's bed, and when they say, what are you doing in my bed, you can say, I'm reading Jason's newsletter. You might wonder what you might find in this snail mail programming newsletter. You can read about all kinds of programming topics, like object-oriented programming, testing, DevOps, AI. Most of it's pretty technology agnostic. You can also read about other non-programming topics like philosophy, evolutionary theory, business, marketing, economics, psychology, music, cooking, history, geology, language, culture, robotics and farming.
Speaker 1:The name of the newsletter is Nonsense Monthly. Here's what some of my readers are saying about it. Helmut Kobler, from Los Angeles, says thanks much for sending the newsletter. I got it about a week ago and read it on my sofa. It was a totally different experience than reading it on my computer or iPad. It felt more relaxed, more meaningful, something special and out of the ordinary. I'm sure that's what you were going for, so just wanted to let you know that you succeeded, looking forward to more. Drew Bragg, from Philadelphia, says Nonsense Monthly is the only newsletter I deliberately set aside time to read. I read a lot of great newsletters, but there's just something about receiving a piece of mail, physically opening it and sitting down to read it on paper.
Speaker 1:That is just so awesome Feels like a lost luxury. Chris Sonnier from Dickinson, texas, says just finished reading my first nonsense monthly snail mail newsletter and truly enjoyed it. Something about holding a physical piece of paper that just feels good. Thank you for this. Can't wait for the next one. Dear listener, if you would like to get letters in the mail from yours truly every month, you can go sign up at NonsenseMonthlycom. That's NonsenseMonthlycom. I'll say it one more time NonsenseMonthlycom.
Speaker 1:And now, without further ado, here is today's episode. Hey, today I'm here with Paul Hammond. Paul, welcome to the show. Hi, thanks for having me. Good to have you here. I found you on LinkedIn. You had some, what the kids call, spicy takes, and I liked what I saw, and we can just dive straight into talking about some of this stuff. As we get into the episode, maybe we can learn more about you and your background and stuff like that, but for now I want to dive straight in. I went to your LinkedIn profile just now and I saw a post that I really enjoyed. It's a short one: "If you think your life is stressful, just take a moment and remember that there are people out there using Cucumber to write their tests. It's all about perspective. Somebody else somewhere probably has it worse than you."
Speaker 2:Yeah, yeah, thanks for that. That's quite an appropriate way to start, I guess.
Speaker 1:I love it. I've been a Cucumber shit talker for years, so that warms my heart to see that post.
Speaker 2:Yeah, I'm definitely not a fan of Cucumber itself, but one thing I would say is that fairly recently, only in the last few days, I saw a post. Somebody on LinkedIn commented on something, and I saw somebody who originally worked on Cucumber saying how he found it a bit upsetting, people talking about this stuff so much. And just to be clear, because I did see that and it did make me think a little bit: I think the original ideas behind Cucumber were good ideas. I think the core concepts behind it were good, and, to be really clear, the people who originally worked on it, fair play to them, they had a real good try at it. They were trying something new, and it was a really good concept. It's just that the problem, for me anyway, is that I've seen Cucumber in so many different environments now.
Speaker 2:We can talk a little bit about my own experience in a bit if you want, but I've seen it in so many different environments, and it doesn't live up to that promise, in my experience. I think there are significantly better ways of writing tests and capturing business requirements and all of this kind of stuff. So that post was written a little bit out of frustration as well, because on the recent stuff I've been doing, we managed to move away from Cucumber, and I was a big part of the reason we moved away. And, as seems to be a repeating pattern in my experience, everybody is much happier now, because it's much easier to maintain the tests. I don't know how much detail you want me to go into about Cucumber specifically. Obviously you kind of introduced it, and it's the kind of rant that I occasionally have on LinkedIn, so I'm happy to talk about that.
Speaker 1:Yeah, I have some thoughts myself on the topic, and I think it ties into some broader, deeper principles regarding testing and programming in general. I share your sentiment about Cucumber. The creators of Cucumber, I think, had a different idea than what is actually being done in practice. I'm not going to say that I think Cucumber is a bad idea, and I'm not going to say I think it's a good idea; I'm not going to make a comment about that. My criticism is of the way that it's used in practice, or maybe we should say abused in practice.
Speaker 1:And here's what I think people are maybe seeking when they reach for Cucumber, which I think is a good thing to seek, but Cucumber is maybe not the way to get there. I think what people want is a layer of abstraction over their tests so that the tests can be easier to understand. Abstraction is something we talk about in programming a lot.
Speaker 1:We don't often talk about what it means, and so I've talked about this on the show a few times lately and I've tweeted about it. You know, what exactly does abstraction mean? And I messed myself up because I had a nice tidy definition that I really liked, and then I read something and it made me change my mind and I decided I was wrong and I came up with a new definition.
Speaker 1:But now I don't know that new definition, so I'm going to look it up real quick because I have notes about it. But while I do that, paul, what to you? Not to put you on the spot, but what does abstraction and programming mean to you?
Speaker 2:It's an interesting question to be put on the spot about. To me it's quite a difficult one to answer, because it's about the level of detail, isn't it? The level of detail that you find yourself at. I mean, the abstraction for something like Cucumber: my understanding of what the tool was originally designed to do was that it wasn't supposed to be just a testing tool. It's supposed to help people capture the business requirements and create these executable specifications, written in a language both sides share, that actually run against the system, if that makes sense. So the abstraction, in that sense, is supposed to be the requirement itself, I guess.
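The idea of an executable specification, plain-language steps bound to code so that the requirement itself runs, can be sketched with a toy step registry. This is invented illustration code, not the real Cucumber or behave API, and all the step names are hypothetical:

```python
# Toy sketch of an "executable specification": each plain-language step
# is bound to a function, so the requirement itself runs as a test.
# This registry is invented for illustration -- NOT the real Cucumber/behave API.

steps = {}

def step(text):
    """Bind a plain-language step to the function that implements it."""
    def register(fn):
        steps[text] = fn
        return fn
    return register

@step("a user with an empty cart")
def given_empty_cart(ctx):
    ctx["cart"] = []

@step("they add a book costing 10")
def when_add_book(ctx):
    ctx["cart"].append(("book", 10))

@step("the cart total is 10")
def then_total_is_10(ctx):
    assert sum(price for _, price in ctx["cart"]) == 10

def run_scenario(lines):
    """Execute a scenario: look up each step and run it against shared context."""
    ctx = {}
    for line in lines:
        steps[line](ctx)
    return ctx

scenario = [
    "a user with an empty cart",
    "they add a book costing 10",
    "the cart total is 10",
]
run_scenario(scenario)  # raises AssertionError if the spec is violated
```

The point of the exercise is that the scenario text doubles as both the business-facing requirement and the thing the test runner executes.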
Speaker 1:Yeah, so basically the business people and the programmers speaking at the same level of abstraction.
Speaker 2:Yeah, and the thing is, I don't know if you're familiar with the original blog post by Dan North. I think it's Dan North who came up with the idea of BDD, behavior-driven development, and he does use that Given/When/Then language in that original blog post. What I always took from his post, and my understanding at the time was this, and I've seen him talking about it directly on LinkedIn since as well, so I'm taking some of this from what he's actually said: at the time, people who were doing test-driven development were often falling into this trap, and this still happens to this day, and it's definitely something we can maybe talk about in a bit, because I'm a big fan of TDD. They were falling into this trap of doing test-driven development in a way where they were focusing too much on what people think a unit is. People think a unit is a block of code: it's a function, or it's a class, or it's a method or whatever. And then, in order to test this thing in isolation, because I want to test my unit in isolation, I have to mock out all collaborators to this unit. So you end up with these tests that are tightly coupled to your implementation details, and they don't help you refactor code, and they don't help you create new functionality easily.
Speaker 2:My understanding of the original idea behind BDD was that it was an attempt to do two things. One was to bring the business and engineering closer together and to create these executable specifications based on business need, but also to try and force the hand of developers a bit more, to write tests that were actually based on behavior, if that makes sense. But what I've found in my career for about the last 12 or 13 years now, and I've been doing this for nearly 20 years, but for about the last 12 or 13 I've been doing test-driven development daily, is this. I was very lucky at the start of my TDD days, if you like, because I watched a video by a guy called Ian Cooper, a talk called TDD,
Speaker 2:Where Did It All Go Wrong? He has a more updated version of it now, but around 2012, I think it was, when I watched it, it was pretty similar to the content that's still there now. What he argued in that talk was pretty much that people often try to do TDD, they test internals because they think that's what a unit is, they find that they can't refactor, they don't get the promise of TDD, this doesn't work, and they scrap it. But he advised to actually consider a unit to be a unit of behavior and not a unit of code. And where that makes a real difference is that if a unit is a behavior, then you only need to mock out whatever you need to in order to test that behavior, not the code. Do you see what I mean?
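The distinction Ian Cooper draws, testing a unit of behavior rather than a unit of code, can be sketched in miniature. The class and data below are invented for illustration:

```python
# Invented example of the "unit of behavior" idea: the test exercises the
# public entry point and asserts on observable outcomes only, so internal
# restructuring cannot break it.

class UserRepository:
    def __init__(self, rows):
        self._rows = rows          # stand-in for a real data source

    def _fetch(self, user_id):     # internal detail, free to rename or inline
        return self._rows.get(user_id)

    def display_name(self, user_id):
        row = self._fetch(user_id)
        return f"{row['first']} {row['last']}" if row else "Unknown"

# Behavior-focused test: asserts only on what the caller can observe.
repo = UserRepository({1: {"first": "Ada", "last": "Lovelace"}})
assert repo.display_name(1) == "Ada Lovelace"
assert repo.display_name(2) == "Unknown"

# An implementation-coupled test would instead mock _fetch and assert that
# it was called -- that test breaks the moment _fetch is renamed or inlined,
# even though the behavior is unchanged.
```

Nothing here needs mocking, because the whole behavior fits in memory; a mock would only be needed at a genuine boundary, like a network or a database.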
Speaker 1:what I mean, I think I do and I've seen that that talk too by Ian Cooper, I thought it was. I thought there was a lot there to to like, um, and I want to focus on something you said a second ago because I think it's really important um I'll kind of paraphrase what you said.
Speaker 1:um, I've noticed that a lot of people have either a negative view of tdd or uh, uh, they, they, they have kind of a neutral view where they're like, okay, that's, that's cool if that's your thing, but like that's not really, I don't do that, I don't know about that, whatever. Um and it's a great tragedy, because my experience is that the vast majority of the time, when people say they don't like TDD, the thing they don't like isn't TDD the thing they don't like is this thing that they experienced that was labeled TDD but was not TDD.
Speaker 2:Yeah, that's exactly the experience that I've had as well. It's interesting, if you don't mind, I'd quite like to talk about the time when I started getting into it, because I was kind of neutral at the beginning, around 2012. So, a little bit about my background, I guess. I actually have a degree in history; I didn't study anything technical. Long story how I got into the industry: I basically made a few websites, practiced on my own, and somehow managed to get in around 2005 or 2006, something like that. I worked in a lot of different places, and then around 2012 I was working for Electronic Arts, you know, the people who make FIFA. I was working there and I managed to get into the BBC. It was BBC Sport, and you have your month period between jobs, your notice period. The BBC had a job requirement at the time that people needed to understand test-driven development, and even though I'd got in, it was the only thing that I didn't really know. So I had this month between jobs, and I was like, well, I know when I start there this is a big thing for them, so I'm going to practice it. So for that month I read up on it and I practiced it a lot, and that's when I came across that talk by Ian Cooper. You know: test behavior, not implementation.
Speaker 2:And I remember, quite early on in my career at the BBC, there was a developer. I don't know if I should name him. I mean, it's all positive, but maybe I won't name him. He was brilliant, though, and he's a bit older than me, I don't know exactly how old, I don't really want to guess, but a bit older, and me and him paired.
Speaker 2:I became a big fan of pair programming through my time working there as well. But we were pairing on a problem, and I don't really remember the specifics. It was a complicated problem, I remember that, and we tried going test-driven for the first time. It's like, we're going to do this, we're going to pair on it, we're going to take it quite literally, we're going to red-green-refactor, and we're going to test against behavior, so we're not going to mock out every collaborator and that kind of thing. It was a really good experience from the off, like straight away.
Speaker 2:It forced us to do a few things. So, for example, if you're testing against behavior that you care about from a business perspective, it forces you to be clear on what you're actually doing, what requirements you have, right at the beginning. Because I'm not saying, oh, this function should call this function. It's more: I should be able to display this user, or get this user from the database, for this reason.
Speaker 1:Right, you're interested in the ends, not the means to the ends.
Speaker 2:Yeah, exactly that. It's kind of black box, right: you don't care about how, you care that you can.
Speaker 2:And we had this experience, though, where the thing we were doing was fairly involved and complicated. We got it working all the way through to the end, and we were confident that it worked, because we'd seen the tests every step of the way. But even though we'd only just written it, both of us kind of agreed that the code was quite messy. We were so focused on solving the problem, we weren't really thinking about the readability of it. So we did a git commit, and then we had this moment where it's like, well, let's try this refactor thing. In reality you're supposed to refactor as you go, but we were quite new to it, so it was like, okay, let's just try it out, let's see. And it was so pleasurable, honestly. It was a proper light bulb moment for me. We took this code that was quite hard to read, quite messy, that we'd just written ourselves, and we had two monitors: on one monitor we had our tests running in a command line, and whenever you hit save the tests would just rerun, and they'd run in like a second. On the other monitor we had our code, and we'd make a change to the code, and in like a second, if you broke something, you'd see it. It's like, oh no, I broke that, okay, so you fix it again. But then we could change the code, and we changed the internal structure, we tidied it up. I remember we had this big long method and we really tidied it up, but not in the way where you're trying to be clever and do it in just a few lines. It was just so much more readable, and that was a light bulb moment.
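The loop described here, a behavior test that stays green while the internals get tidied, might look something like this in miniature. The function and data are invented for illustration:

```python
# Invented miniature of the refactor step: the behavior test below never
# changes, so the messy first pass can be tidied into a clearer version
# while the test keeps passing.

def summarize_order_v1(items):
    # First pass: works, but one tangled block.
    t = 0.0
    n = 0
    for i in items:
        if i["qty"] > 0:
            t = t + i["qty"] * i["price"]
            n = n + 1
    return {"lines": n, "total": t}

def summarize_order_v2(items):
    # After the refactor: same observable behavior, clearer structure.
    valid = [i for i in items if i["qty"] > 0]
    total = sum(i["qty"] * i["price"] for i in valid)
    return {"lines": len(valid), "total": total}

# The behavior test -- identical before and after the refactor.
order = [
    {"qty": 2, "price": 5.0},
    {"qty": 0, "price": 9.0},  # cancelled line, should be ignored
    {"qty": 1, "price": 3.0},
]
expected = {"lines": 2, "total": 13.0}
assert summarize_order_v1(order) == expected
assert summarize_order_v2(order) == expected
```

Because the test asserts on the outcome rather than which internal functions get called, swapping v1's loop for v2's comprehensions requires no test changes at all.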
Speaker 2:That was a bit of a long-winded story, but here's what's interesting.
Speaker 2:You said at first that people often attack the thing that they don't understand. Well, what was interesting to me was, when I was having lunch, I'd be talking to developers on other teams who had tried TDD, and I'd be saying, look, we had this experience. And they would keep telling me: but what you did was wrong, because that wasn't a unit, that's not a unit test. And I was like, yeah, but we have really high confidence, and we could see it working, and we could change the code and it worked, and we know it works, we've got proof that it works. Yeah, but that isn't a unit, and you should be testing units. And it's funny, because since then most of those developers have kind of changed their minds a little bit. I still speak to a lot of them, and they do it differently now.
Speaker 1:There's so much talk around testing about doing it right, and it really bugs the hell out of me, because it's like, okay, you want to do it the correct way, but according to whom? And so what? If it's right, then so what? If it's wrong, so what? The thing isn't correctness or incorrectness; it's about how advantageously you're practicing testing.
Speaker 2:Yeah. For me, good tests are tests that help you make changes with confidence over time. If you can confidently make changes, you can refactor, you can add new features, and you don't feel the need to constantly manually check stuff or even load up the app. It's funny: most of the time when I'm working on stuff, I don't even have the app booted up half the time, and I'm still working on features, because I trust my tests that much. And one thing to add is that a lot of the time that I've spent doing this has been on the front end, and there seem to be very few people out there who understand how to do TDD well on the front end. It's something that I'm planning to write about. I've got a splash page for a blog, but there's nothing on there yet. I've already started writing an article on it, but nothing published yet.
Speaker 1:We'll see yeah, there's a lot there that I want to dig into. Yeah, you know, adopting TDD can be hard. You know you shared your experience at the BBC doing TDD and it sounds like it was a positive experience right from the get-go. A lot of times, if you want to, I actually advise people to not try to learn TDD at work, because it's going to be really really really an uphill battle unless there are quite a number of different things that are all just so, and what I mean by that is, like a lot of times at your workplace, if people aren't already doing TDD, then almost by definition, there's not a strong culture of TDD, and so you're going to be fighting that If TDD is not being practiced. There's a reason why it's not being practiced. Maybe it doesn't have much managerial support, maybe the developers you work with don't buy it. There's some reason, and that reason is going to translate to resistance, and not just resistance, but the support that will need to be in place will not be there, and I mean both technical support and organizational support.
Speaker 1:You might go to write a test, but it's really hard to create the setup data that you need for that test, because all that infrastructure work is not there. Your test is the tip of the iceberg, but the rest of the iceberg is missing, and you can't really write your test until the rest of that iceberg is there. And then, organizationally, you might create a test and put up a PR or something like that, and it might not really fit the process. I've had this before: I kind of get my hand slapped, because why did this take so long? Well, the feature took five minutes and then the test took four hours. It's like, well, what the fuck? That's not reasonable, and so it gets squashed right there. So that's another reason why people come away with a bad experience of TDD: they try it at work and they expect to be able to do it, but they don't realize that there are all these forces acting against them.
Speaker 2:Yeah, that's a really good point. To be honest, from my own perspective, I don't know if lucky is the right term, but I've definitely been lucky, in parts, to have worked at places where these things were mostly quite well understood. What I would say is that when I interview these days, I tend to bring these things up in the interview, and I take the whole interview process very much as a two-way thing. It's not just them interviewing me; it's me seeing if it's suitable for me as well. People don't have to be doing test-driven development for me to go there, but for me to be able to sell the value, I need people who are willing to listen to certain things. What I try to do when I'm trying to sell it nowadays, and I used to be guilty of just going on about it too much, we kind of started straight away just talking about it here as well, is talk a lot more about the business value of these things. And it's not just TDD, to be clear. We've honed in on TDD, and it's definitely something that I'm very passionate about, but to me TDD comes with a bunch of practices around it, XP practices in general, things like pair programming, for example, and continuous integration and continuous delivery, which give the business the agility to test things out.
I actually genuinely believe, and I've seen this in practice a few times, not as many as I would like in my time in the industry, but I have seen it in practice on a few occasions, that this level of technical agility actually did make a big difference to the end product, in terms of the features we ended up building, because we could get feedback from customers earlier, and using that feedback we could pivot and change our direction.
Speaker 2:But in terms of, you know, if you're in that kind of situation where it's a double whammy, the business doesn't understand and other technical people don't, or even a triple whammy, where you've also maybe never really done it and you want to learn, then yeah, I can see why it would be quite difficult to get started, and maybe a personal project or something like that, to get going and start understanding it, might make sense. The one thing I would say is that there are usually ways that you can apply these techniques at a smaller level, at a kind of finer point, in an existing code base. There's a really good book, and, this is how exciting my life is, I actually have the book on my desk right now, so I can actually show you.
Speaker 1:I thought you were gonna show me. Really?
Speaker 2:Yeah, Working Effectively with Legacy Code by Michael Feathers, and it's a really good book.
Speaker 2:It's Java and stuff, but it's all about concepts really.
Speaker 2:And yeah, in this book he talks about how to introduce a seam into the system: a place where you can open up an area for testability, and test in that way. In this kind of situation, I can appreciate that you're probably not going to convince people overnight. You're not going to go in and instantly get these amazing results; it's almost certainly not going to happen that way. But if you take the time to learn it and to understand the value, and if you understand how you can do it at a smaller scale first, starting with a smaller feature that has a nice boundary around it, like an interface in code, or just an area of the system that's safe, where you can create a seam, then it's possible to start from there and maybe show the value on a smaller scale, and maybe you can convince people that way.
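A seam in Feathers' sense can be as small as lifting a hard-wired dependency into a parameter. Here is a minimal sketch with invented names; the book's own examples are in Java and C++, but the concept carries over directly:

```python
# Invented sketch of a "seam": a hard-wired dependency (the system clock)
# is lifted into a parameter whose default preserves the old behavior,
# so production callers are untouched while tests can substitute a fake.

import time

def make_receipt(total, clock=time.time):  # `clock` is the seam
    return {"total": total, "issued_at": int(clock())}

# Production code keeps calling make_receipt(9.99) as before.
# A test reaches through the seam to make the result deterministic:
receipt = make_receipt(9.99, clock=lambda: 1_700_000_000)
assert receipt == {"total": 9.99, "issued_at": 1_700_000_000}
```

The point of a seam is exactly what Paul describes: it opens one small area of an existing code base to testing without requiring any wider redesign up front.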
Speaker 1:There's this quote, maybe you know it, from Gerald Weinberg: no matter what the problem is, it's always a people problem. And that's another thing that I've found very true in my consulting work. So I have two completely unrelated comments. One is, I think a lot of people expect the TDD learning process to be much shorter than it is in reality, and people could benefit by setting their expectations for the level of payoff they'll see, and how soon they'll see it, and stretching those expectations out to a longer time scale. The other comment is that, in my experience, it's very unlikely to work.
Speaker 1:It rarely works to change somebody's mind. And we're not just talking about changing people's minds; we're talking about changing people's behaviors, and that's really hard. Think about how hard it is to change your own behavior when you want to. That's yourself, and you want to change. So to change somebody else's behavior, when it's not you and they don't even necessarily want to, that's well nigh impossible.
Speaker 1:What can be done is, if there are people who want to learn and improve and stuff like that, they can be taught. That definitely is possible. But there are people who don't buy TDD, and there are people who, as crazy as it sounds, are not really interested in learning and improving. I've found a surprising number of programmers in my consulting work who kind of have this blindness.
Speaker 1:For me, it's like having an acute sensitivity to pain: I have an acute sensitivity to stupidity, and when I come across anything that's dumb, I'm like, holy shit, this is fucking terrible. How can you guys live like this?
Speaker 1:this is awful um, yeah yeah, you know, metaphorically, it's like everybody's walking barefoot over glass and it's like, hey guys, like I don't want to walk barefoot over glass, like how are you guys okay with this? Yeah, but it's like somehow they have numb feet or something like that, and so they don't mind walking barefoot over glass those people yeah, I I'll say this this is painting with a broad brush, but those people can't be reached. There's nothing you can do with those people. You just have to go somewhere else to different people yeah, yeah, definitely.
Speaker 2:I mean, it's definitely something that I've experienced as well in my career. It's one of those things. One thing I've really enjoyed doing, and I haven't done as much of it recently, although I still do it even on the team I'm on now, but I used to do it a lot more, is mentoring, especially with junior people and graduates and stuff like that. And I'm a big fan of pair programming, right. I think one thing that can be effective is when somebody does want to learn and when they are willing to listen.
Speaker 2:I think a healthy skepticism is fine, by the way. Skepticism, to me, is a good thing. Some people think skepticism is the rejection of stuff, and I don't think that's what it is. To me, it's a position where you say: okay, at this point in time, I don't agree with what you're saying, or you haven't convinced me yet, but I will look at the evidence, and I will assess the evidence, and if you can show me the evidence, I will change my mind. That's what I think skepticism really is.
Speaker 1:I totally agree, and, if I can interrupt with an angry rant: nobody likes to think of themselves as being closed-minded, but the vast majority of people are really fucking closed-minded, and most programmers are very unscientific. We probably like to think that we are scientific, but we're fucking not. And I totally agree with this idea of skepticism.
Speaker 1:Carl Sagan talks about the marriage of skepticism and wonder, and, even though it seems contradictory, we should simultaneously be skeptical and open-minded. What a lot of people are is cynical and dismissive and closed-minded. They decide what they want to believe, and then, post hoc, they come up with rationalizations for believing it, and so they never change their mind about anything, which means they never get any more correct about anything than they started off being. Okay, that's my angry rant.
Speaker 2:No, no, it's good. Funny, it felt like reading one of my LinkedIn posts for a second there. But I completely agree with that, and it's the same with anything, though.
Speaker 2:It's like any walk of life: if you're going to take a dismissive approach, and you think you already know the answers before you've understood the subject, then there's only so much that can be done there, and, as you say, I think with some people it's just literally impossible. But I will say, I think I've had quite a lot of success in my career at showing people this stuff, and the way that I prefer to do it is by pair programming, because to me it's just so much more effective. I love pairing; I think it's a great technique.
Speaker 2:Sorry, I just remembered why you originally contacted me on LinkedIn, actually. I think the post that triggered it was one where I said something about pull requests: that I don't really like them, I'm not a big fan of them.
Speaker 2:The asynchronous review side of it, right. And my alternative to pull requests is to pair or to mob more. The reason I don't like things like pull requests, to change the subject slightly, is not the pull request itself, because with the actual act of making a PR, you can trigger your pipeline and get feedback there, and that's all fair enough. What I don't like, and it's the standard in the industry by far, in my experience, is the asynchronous review process, and the way that it encourages this siloed kind of thinking. This idea that, say me and you were developers on a team, and maybe we've got two other devs on the team, and maybe you've got a product owner and a UX person and so on.
Speaker 2:I don't like it, and this is typical in my experience of most teams. Usually what happens is the devs all work on separate things. You do some work for maybe a few days, put it in a PR and throw it over the wall. First time I look at it, I don't have any context on it. I'm trying to review it, but I'm doing my own thing at the same time, and maybe I'm asking somebody else for a review. It's such a cumbersome and convoluted way of working. And the worst thing, even worse than all of that, is that it's inspection after the event, right?
Speaker 2:I forget who said it. There's a famous quote, and I'll be paraphrasing it and probably butchering it, but somebody said something along the lines of: you can't inspect quality into a system. And I believe that quite strongly.
Speaker 1:You can't inspect quality into a system. Is that what it was?
Speaker 2:Yeah, I can't remember who exactly said it. But the thing for me is, I've found time and time again that... so, there's a really good talk by a guy, let me make sure I get his surname right, sorry, just one second, I'm looking on LinkedIn: Dragan Stepanovich. He gave a talk, you can find it on YouTube, and it's something about how asynchronous code reviews are killing your company's throughput, or something like that. And he has a really good diagram in it that I really like, where he's got two developers on one side and he's color-coded the kind of work that they're doing, and he shows how disruptive it is when one person asks for a review and then moves on to another thing, which is another color, and then they get drip-fed pieces of feedback and it goes back and forth for days and days.
Speaker 1:I have so many thoughts on all this stuff. For one, I always tell people that everything is connected to everything else, so your testing practices and CI/CD and your pull request review process and all that stuff, it's all connected, and everything influences everything else. Another thought that stirs up for me is... um, I lost it.
Speaker 1:But a different thought is: by the time it gets to the pull request review stage, it's basically always too late. It's kind of that "you can't inspect quality into a system" thing that you mentioned. You look at the PR and it's like, wow, what the fuck is this?
Speaker 1:And it's huge, and nobody wants to hear you say, hey, this whole thing is fucked and we need to just start over and change the whole thing. Unless it's some kind of superficial formatting change or something like that, nobody wants to hear about a different approach that you think they maybe should be taking. Oh yeah, I remember my other thing. It's a big one.
Speaker 1:People are so focused on the idea of being efficient, which, obviously, it's better to be efficient than wasteful in general, but they have this focus in a way that's penny wise and pound foolish. So it's like they want a 100% utilization rate with developers. Developers should always be working on something productive. So you end up putting out a pull request, and you can't just sit there and wait, so you start the next ticket, and while you're working on the next ticket, somebody comes back with a review, with feedback on your pull request. Now you have to stop your new ticket, go back to the old one and load all that back into your mind, which again is one of the big reasons why people aren't interested in anything but the most superficial feedback at that stage.
Speaker 1:You address that, then you go back to your normal work, blah, blah, blah, the idea being that this is efficiency, because you're being fully utilized. But by being efficient on that small timescale, you end up being grossly inefficient in the big picture. Whereas if you made yourself comfortable with what seems like waste at the fine grain, for example, you do a pairing session with somebody, and at the end, the pull request review process, if there is one, is trivial, because everybody's already familiar with the code. It can be done synchronously, you're just done with it, and as a pair you can move on to the next thing. There's "waste", in quotes, in that process, but in the big picture it's so much more efficient.
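To put rough numbers on the point being made here, a toy back-of-the-envelope model can compare elapsed time for one change under asynchronous review versus pairing. Every number and parameter below is invented purely for illustration; it is not data from the episode.

```python
# Toy model: elapsed (calendar) time for one change to reach "done",
# under async PR review vs. synchronous pairing. All numbers are invented.

def async_review_elapsed(work_hours, review_round_trips,
                         reviewer_delay_hours, rework_hours_per_trip):
    """Solo work; each review round-trip adds reviewer wait plus rework."""
    return work_hours + review_round_trips * (reviewer_delay_hours
                                              + rework_hours_per_trip)

def pairing_elapsed(work_hours, pairing_overhead_factor):
    """Two people work together; review happens inline, so no round-trips."""
    return work_hours * pairing_overhead_factor

# Two days of work, three review round-trips with a day's wait each:
solo = async_review_elapsed(work_hours=16, review_round_trips=3,
                            reviewer_delay_hours=8, rework_hours_per_trip=2)

# Same work done as a pair, assuming pairing costs 25% more person-effort:
paired = pairing_elapsed(work_hours=16, pairing_overhead_factor=1.25)

print(solo)    # 46 elapsed hours
print(paired)  # 20.0 elapsed hours
```

The pair "wastes" a second developer's time at the fine grain, yet the change flows to done in well under half the elapsed time, which is the penny-wise, pound-foolish trade described above.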
Speaker 2:Yeah, it's interesting. I mean, I agree with literally everything you just said there, especially about the efficiency thing. It's something that really frustrates me a lot, actually. One of the things I can't stand, and I see it all the time at the moment on LinkedIn, is this "how do you measure developer productivity" talk, and these tools to do it, and all of that. But the problem is, I don't think individual productivity even matters. I don't think it can really be measured anyway. What matters more is how the team is performing. And I'm pretty much 100% certain, if you were to look at... unfortunately, most of the teams I've worked on have used Jira, and I could rant for hours on Jira, but if you were to look at my personal Jira stats, I don't think I would look like a particularly great developer, depending on how you looked at it. The main reason for that is that I tend not to care about my own statistics very much. I'm lucky enough to work on teams that are quite mature, with product owners who just understand this stuff, and I feel quite fortunate in that regard. But I spend a lot of time trying to help the team overall. So I'll pair with people, I'll pair on my tickets, but I'll also see something somebody else is doing and go and pair on that. And it's interesting when you said about waste as well, because when you look at a board visually and you see loads of stuff in flight, if you're not technical and not used to this stuff, you might say: well, that looks productive, there's loads of people doing stuff, everything's in flight.
But generally, and I've seen this so many times in my career, when developers pair and they sync on the work together, they work on fewer things at the same time, but for the things they do work on, the flow is much faster. The rate at which a ticket moves from left to right on the board is a lot faster. So actually, overall, in my experience anyway, the speed increases. You're doing less work at any one time, but the work you do do goes all the way through to the end. Again, I'm used to the definition of done being "released to production", that's what I think done should mean. But yeah, it goes faster.
Speaker 2:One other thing I'd add, which I don't think we really touched on: the knowledge sharing increases dramatically when you pair. But also, and I've found this consistently, I found it just this week, on Friday, actually. I was pairing with somebody this week, and what we ended up with, I think, was a better solution than if either of us had worked on it in isolation. Because if you imagine the supposed value proposition of an asynchronous pull request, people will tell you it's all about quality and standards and all of this stuff, right? But when you're working in a synchronous way, if me and you were screen sharing and coding together, instead of discussing stuff for a podcast, and I start doing something and you see a flaw in what I'm doing, that interaction happens instantly.
Speaker 2:Let's say you actually did find a flaw, and your point is perfectly valid, and I haven't seen it yet. If you were to run two timelines: in one timeline I'm doing this on my own and you're working on something else. I don't spot this flaw, I continue, I do two or three more days' worth of work and it all seems good, I put it in a PR, and you spot it at that point. Really expensive, right? I have to do all this rework. It's horrible. In the other timeline, we're working in real time. You spot that instantly, we have a discussion about it, we fix it there and then, and it's better. You know what I mean? But this kind of working, it doesn't lead to nice, pretty-looking graphs in Jira where you can look at individual stats. Do you know what I mean?
Speaker 1:Yeah. And there's something else. I'm articulating parts of this in my head right now for the first time; some of the thoughts that go into it are very old. You know, people talk about productivity. It's like, what exactly is productivity, and why is it desirable? I think a lot of times people think of productivity as the amount of stuff you get done. If developer A gets 10 tickets done and developer B only gets six tickets done, then developer A is more productive than developer B, and that may well be true.
Speaker 1:But the significant thing is the amount of business value created.
Speaker 1:It's like, think about the iPhone and Google Android, for example. Apologies if I offend anybody, but I think Android phones are shit. And the iPhone, actually the iPhone now is kind of a piece of shit, but when Steve Jobs was still alive, the iPhones that came out when he was in charge were quite good. Imagine the Android developers working on the first Android phone, and at the same time the Apple developers working on the first iPhone. It could be that the Android developers are working faster, they're knocking tickets out at 120% the speed of the Apple developers, but the value they're creating is only like 5% of the value of the Apple product. And, you know, those numbers are whatever.
Speaker 1:And I'm talking about not just calculable monetary business value, but also value in the broader sense. Like, you know, Apple made valuable contributions to human society in ways that aren't tangible and monetary and all that stuff. So that stuff really matters and, all other things being equal, it's obviously way better to go fast than to go slow. So that's one comment about productivity.
Speaker 1:The other one is, again, people being penny wise, pound foolish. It's like, what did you get done today? That's not really the thing. If you look, for example, at my productivity in a day versus an average developer's, it might not be that different. In fact, it might even look like the average developer accomplished more than I did. But I might have done less stuff that's higher-value stuff, and done it in a smarter way. Then fast forward to a year from then. Maybe the average developer has tied himself into knots and can hardly make any changes at all, because everything is a pile of shit, whereas the work that I've done has compounded on top of itself, because every day has been an investment. Now every new thing I add is even more valuable than the thing before, because the value of the system compounds, and we're in a completely different place after a year, even though after just 24 hours you really couldn't tell.
Speaker 2:Yeah, it's funny, because we've talked about so many things and I just find myself strongly agreeing with everything you're saying. These are all things that, as you say, you've probably seen me talk about on LinkedIn. And business value, this is one of the things that winds me up the most. I find it really frustrating, because I very strongly agree with everything you just said. Why does it matter if we delivered 40 story points in the sprint? Why is that important? The team that might be producing the most value for your business might have the worst-looking graph in Jira, and who cares about that stuff? Can I give you a little anecdote? Because most companies I've worked at, I don't think they've quite got the business value side of it, it's not all bad or anything, but there was one place where I felt the entire team was firing on all cylinders, and it was such a brilliant experience. I don't mind giving you this example, if that's all right. I started contracting in 2018, so, God, it's 2025 now, I nearly said six years, it's just about seven years ago.
Speaker 2:One of the places I contracted at was an organisation called Equal Experts, and I'm only going to say positive things, so that's why I don't really mind talking about them, because I had such positive experiences with them. The way they work is they bring entire teams into an organisation, so you have an Equal Experts team. This was before COVID, so it was in person, right? I won't name the end client, I don't know whether I should or not, so I won't, but I will say they were a really big law firm. The product we were ultimately working on, I actually worked on a few while I was there, but one of the main ones was an internal product, built internally but with the idea of eventually, potentially, selling it as a service externally, and it was to help lawyers with their workflow. Now, I'm not a lawyer, most people are not lawyers, right? But on that team, two things were amazing. One is that they, and by they I mean Equal Experts, we had an amazing delivery lead, he was brilliant, and, I think it was him, but whichever way it was, we managed to get a lawyer embedded on our team. That was the first thing, and that was just amazing. And then, on top of that, we were based in an office in Manchester.
Speaker 2:The company had offices in London, and we were doing the whole continuous delivery thing, releasing frequently, multiple times a day when we really got into it. We had this fantastic UX person, and he was really good at understanding UX broadly, so obviously all the design stuff, but also actually understanding customer need and trying to get to the root of what the customer need was. And what he managed to do was get us into this situation where we had a team of lawyers in the London office who were using the software we were building as part of their daily workflow. Again, I wasn't involved in making those connections, so I don't know exactly how that was set up, but that's what started happening once we got into this flow.
Speaker 2:We got into this position where the UX guy was only ever a couple of weeks ahead of where the development team were, and he was always really open, he showed exactly what he was thinking about and all this stuff. We would build a feature and we'd deliver it, TDD, all that good stuff, so bugs were extremely rare. You got them occasionally, but we were releasing often and really delivering stuff. And what he would do is go down to London quite often, a couple of times a month or whatever, and sit with the people who were using the software, see how they were using it, and get feedback. So it'd be a feature we'd just released, and he's asking them: how's this working for you, what's good about it, what's not good about it? And he's bringing that feedback back to us, because we weren't following a big master plan.
Speaker 2:We had overall goals we were aiming towards, but we could adapt and pivot, and we'd say, okay, that feature we released last week, actually they're not using it the way we expected, but they said if we did it this way, or moved this thing onto this page, that would be better, because this is how their workflow works. And before too long it was getting rave reviews, people were loving this product. But that's obviously going to happen if you're speaking to your customers directly, right? And this is the thing I find frustrating: we work in this industry that claims to do this Agile stuff, but for most teams, Agile just means two-week sprints and Jira, a retro at the end of every two weeks, and having a stand-up. A lot of teams have these massive backlogs. It's just waterfall in two-week increments. And then, how well did we do? Oh well, the sprint ended on Tuesday, let's look at the graph. Who gives a shit? Why does it matter?
Speaker 1:Well, something I've come to realize over the last few years, and I hate that this is apparently true, is that so much of the economy is just performative, and nobody's even trying to do anything real. You know, there's this fucking thing about data-driven decisions, and people think they're so great because they're making supposedly data-driven decisions, but that is such a narrow view of the whole picture. Like, okay, data. What fucking data?
Speaker 1:And how are you thinking about it, and all that stuff? Because I think a lot of times, what that means is just telemetry. People are releasing features.
Speaker 1:And then they're looking at telemetry and being like, oh, 60% of the people clicked on this button and 40% of the people clicked on this other button, so the one button is better, or whatever.
Speaker 1:But what they're leaving out is like an actual understanding of anything that's going on.
Speaker 1:And so it's so great to hear about that guy who actually went and sat in person with the people who were using the software.
Speaker 1:And that's something that I always, well, not always, but when it applies, advise my consulting clients to do: actually go and fly to where one of your customers is, physically sit with them and watch them use your software, because I can almost guarantee you will be shocked and horrified by all the crazy fucked-up ways they abuse the software. You never could imagine the ways they're using it. And they'll tell you things when you get together in person that they would never tell you, even over a Zoom call, because in person is just different. And you can't go by just what the customer feedback is. They'll tell you stuff, but the picture of reality you'll get from them telling you stuff is very much a funhouse-mirror view of reality. It's incomplete and distorted. Your responsibility is to go and learn about them and their world, learn about the jobs they need to do, and then give them a product that will help them accomplish those jobs.
Speaker 1:Not just listen to what they tell you they need, because that may or may not have anything to do with what's actually needed. But again, most places aren't even close to there. They're just throwing shit out there and not even checking in any way to see if it's good. Somebody's dreaming something up in a vacuum, then throwing it out there and not even really knowing what's happening with it, and it's sad. And so the hope is just that you can find that small minority of people who are doing it the way you described just now, where they actually care.
Speaker 2:Yeah, this is the thing. It's a very frustrating thing, I think, in the industry at large. Actually, it's a funny one. Have you heard the term "semantic diffusion"? It was a relatively new term to me. It's a term that, I think it was Martin Fowler who wrote an article about it some years ago, and it's this idea that certain terms get coined and they have a clear meaning, and then they get used in common language, and over time that clear meaning gets diffused. Continuous integration is a good one, I think, because a lot of people think continuous integration is GitHub Actions or Jenkins, and it's not.
Speaker 2:It's the act of the entire team merging their code into mainline at least once per day, ideally more often than that. And the value proposition of continuous integration is that you spot issues much faster. By doing that, you avoid big merge conflicts, the whole team stays in sync, and you don't end up with these horrible situations where you have long-lived branches or you're doing deployments based on branches and stuff. But for me, the term Agile is another big victim of that, the semantic diffusion thing.
Speaker 2:I'm a really strong believer in what I think Agile development actually is, but that, to me, is very similar to what I described before with the process at that law firm, where you're quickly iterating and you're expecting to learn stuff. And a key to it, for me, is how quickly you can change direction if you find out you're doing the wrong thing, if you get some concrete customer feedback or a clear indication one way or another that you're actually doing the wrong thing. But so many companies claim to be doing Agile and they'll just carry on. Oh yeah, that new feature's not made us any money, it's cost us a million dollars to build and it's not making any profit, but anyway, we've got six months' worth more of tickets ready for you. Do you know what I mean? It's insane.
Speaker 2:Yeah, it's so wasteful.
Speaker 1:A couple comments. You know, the actual Agile Manifesto, dear listener, is really short. I'm going to look it up right now. Agile Manifesto... my memory is that it's kind of poem-length. Yeah, it's so short. It's like a short poem. And nowhere in there does it mention Jira or sprints or anything like that. So yeah, I love this term semantic diffusion, because so many terms have gotten a distorted meaning. By the way, one of my favorites is "literal Nazi".
Speaker 1:That one is one where the definition has expanded so much. Yeah, anyway, what was I going to say?
Speaker 2:Semantic diffusion, Agile... I had a thought and I lost it. Do you know, I'm just Googling this myself, actually, because one thing I wasn't completely aware of until fairly recently, somebody pointed me to it and it surprised me, because I knew about the Agile Manifesto, but did you know about the 12 principles behind it? I want to make sure I get this right, because I don't want to get it wrong. If you Google the 12 principles of Agile... I want to make sure I'm not telling you the wrong thing here, because it was a relatively recent thing to me.
Speaker 2:Let me make sure this was actually part of the original. I don't want to tell you the wrong thing, but I'm pretty sure it was. Look at what these 12 principles actually say explicitly. Number nine, I'm just looking at it now: continuous attention to technical excellence and good design enhances agility. That, to me, is the top principle. I'm obviously not going to read through them all, but there are things like number four, that business people and developers must work together daily throughout the project. And: working software is the primary measure of progress. That's an interesting one, I think.
Speaker 1:Yeah, and I love that, because I've always thought that if you have to have a status meeting about something, you've deeply fucked up. Because there should always just be working software in production that is a very tight reflection of the most recent state of the work, and there shouldn't be any question about where we're at with such and such, because it's being currently used.
Speaker 2:Yeah, and it's a funny thing, because for a long time I was aware of this thing called XP, extreme programming, but I didn't really know too much about it. I'd learned test-driven development, I'd got into pair programming quite heavily, and I'd developed my own understanding and theories around a lot of this stuff. Technical excellence: to me, when you talk about technical excellence, the primary proof that you have an excellent foundational quality is that you can continue to release with no fear over time. If you can do that, and it's sustainable, then that is proof that you have a solid foundation. Sorry, I've just lost my train of thought.
Speaker 1:As you lose your train of thought, I remembered what I was going to say earlier. I don't remember what triggered the thought, but I at least remember the thought. I've been reading this book called The Beginning of Infinity that I've been telling everybody about. Actually, Dave Farley, in his book Modern Software Engineering, mentions The Beginning of Infinity.
Speaker 1:That was one of the several places I had heard the book mentioned, and I was finally like, okay, I've just got to get this book, because I've heard about it in so many places. The book talks a lot about knowledge creation. It's not a programming book. It talks about how all knowledge comes from conjecture and criticism. And the question isn't how can we avoid being wrong, but how can we detect and remove errors when they inevitably occur. He also talks about governance in the book: the question isn't how can we avoid ever electing bad leaders; the question is how can we detect and remove bad leaders when they inevitably are elected from time to time. And in software, I think the question is not how can we make sure to always get things right. The question is how can we make it easy to detect and remove errors and defects, human error, that kind of thing, when those things inevitably work their way into the software?
Speaker 2:Yeah, I like that as a concept. You know, I'm a big fan of Dave Farley. I've read that book as well, the Modern Software Engineering book, and there's something I really like about it.
Speaker 2:I think early on he talked about the scientific method, and it's interesting because you mentioned data-driven, right? I think a lot of people get mixed up with this stuff, because they think science is all about data, and obviously data is a big part of science, but they think: we are making data-driven decisions, therefore it's somehow scientific. Whereas the scientific method is more about constant discovery. It's about proving things with evidence and testing your assumptions, building on top of positive results and discarding things when you get negative results, and having that constant kind of mindset. And to me, that's where things like evolutionary design come in, which he talks about, and I think it gets us back a little bit to test-driven development, because one of the big advantages of a technique like test-driven development is that it's evidence-based. Rather than designing everything up front and having a big plan, this is how I'm going to design everything, you build the smallest thing you need now, but in a way you can progressively build on top of, and every test you write forces you to deal with reality in some kind of way. Do you know what I mean? And there's an example, I won't go into the detail of it too much, but in the place where I'm at now, there's been a bit of a...
Speaker 2:I don't want to go too much into it, but there have been two mindsets, let's say, for a piece of work we've been doing recently. One is more: let's design it all in Confluence, with boxes and that kind of stuff, first. And the other, you can probably guess where it's going, is more: let's build it in an incremental way, because it's such a big thing that it's too complicated to consider everything up front. And by working in that way, which is what we've actually done, we've come up with a solution, again, that I think is much cleaner and nicer, and it's based on what I think is a more scientific kind of approach. Does it make sense, what I'm saying?
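As a minimal sketch of the test-first loop being described here (the domain, names, and numbers are invented for illustration, not code from the team in question), the rhythm is: write a small failing test, make it pass with the simplest thing that works, then let the next requirement arrive as another test before the design grows:

```python
# Red: a test for the smallest behaviour needed now is written first,
# and fails until the function below exists.
def total(order):
    """Simplest implementation that passes the tests written so far."""
    return sum(item["price"] for item in order)

# Green: the minimal implementation makes these pass.
assert total([]) == 0
assert total([{"price": 10}]) == 10
assert total([{"price": 10}, {"price": 5}]) == 15

# Next iteration: a new requirement (say, discounts) arrives as a new
# failing test, and only then does the design evolve to meet it.
def total_with_discount(order, discount):
    """Behaviour added in a later red-green cycle, driven by a test."""
    return total(order) * (1 - discount)

assert total_with_discount([{"price": 100}], 0.25) == 75.0
```

Each assertion is a small piece of evidence the code has to confront, which is the "dealing with reality" quality being contrasted with the big up-front design in Confluence.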
Speaker 1:It makes total sense, and this opens up another total can of worms in my mind, which is this idea of building a project incrementally. It's kind of like that idea of a wicked problem: a problem that can't be solved until you've solved part of the problem. Man, I want to dig into this, but I'm going to get in trouble with my wife if I don't end this podcast soon.
Speaker 1:Yeah, if I didn't have a time constraint, we could talk for a number of additional hours, I'm sure. But hopefully you'd be up for coming on the show again. I've really enjoyed this, and I think there's a lot more we could dig into.
Speaker 2:Yeah, sure, by all means, I'd be happy to. There's actually one thing, and this is kind of a separate thing, but a friend of mine in London that I was meeting recently, a really interesting guy who built a startup, and he did really well, and he built it according to a lot of these principles. You know, I told you I've been working on a thing in the background, and I was giving him an early look, saying, what do you think of this? And I went on a call with him, similar to this one. I kind of wish it was recorded, because it was so interesting. But he built an app.
Speaker 2:I won't name it now, but if you were interested, I could put you in touch with him, because I think you'd have a really interesting discussion with him. He's a very technical guy. I used to work with him years back, and he became a contractor in London and ended up, as I say, making this really good startup product. But he did it all according to these principles. What he was explaining to me was, he began with a very throwaway thing. It literally was like five days of work or something that he knew he'd throw away, and it was a prototype.
Speaker 2:But, um, he said, like they built this prototype, they um spoke to people in in the pubs where they were in london. It was the car, it was an app for contractors basically to do with like finances for contractors in the pubs where they were in London. It was an app for contractors basically to do with finances for contractors in the UK and they got people to trial it and basically got feedback really quickly. And then, when he knew he was onto something or felt he was onto something, he then threw the code base away because it was like shit code that served his purpose and he followed these principles really strictly the TDD thing and all of that and he built out this product that ended up doing really well and he sold it off and now he was explaining to me how, like, he's still contracts now, but he's got. He was like, yeah, I can't shop, I've got proper fuck you money, now I can just do whatever I want.
Speaker 1:And it's like, yeah, you know it's, but if you're interested then I just think you'd probably find him very interesting, you know yeah, I'd love that yeah, that would be great, okay, before we before we go is there anywhere, paul, where you'd like to send people online to find out more about you and that kind of stuff.
Speaker 2:I mean, the site that I mentioned is online. It doesn't even mention me by name, actually, and at the moment I don't have any articles on there, but there is a little newsletter you can sign up to. It's feedbackdriven.dev; I'll send you the link. It took me ages to think of a name for it, right?
Speaker 2:I bought front-end TDD, I bought React with TDD and all of this stuff, and I kept thinking: I don't just want to talk about TDD.
Speaker 2:There's a whole process and all kind of theory behind what I want to get across. So my thinking behind this is that ultimately, hopefully, if I've got time, because I have two young kids, it's quite hard for me with time but what I want to do is to write a lot about this technical stuff. I do want to have some good content on there about front end TDD in the end, but also much more. If you read the little stuff that I've written so far just on the landing page, it's much more about the stuff we were talking about. Do you know what I mean? And I will say, by the way like this took me, like just writing the content on it. It's really hard to write succinctly what you want to say and I'm still not sure it's quite right. But like, yeah, I must have spent about four days just writing the content on there, going back and forth, and I don't know. Nobody can accuse me of using ai to generate that content and put it that way because it's driving me mad.
Speaker 2:But yeah, if people are interested, that's where I'm hoping to write, and I'm kind of halfway through writing the first article now, which you can probably guess what it's about. It's not going to be a technical article, the first one. The idea I have is to try and convey what it's actually like to work in an environment where TDD has been employed from the beginning, because one of the things that I find a bit frustrating is that there's always this kind of communication barrier between people who have experienced it and know what it's like, and somehow relaying that to people who haven't. When you're teaching the technique, it's always done with these short examples or whatever, because you're showing a technique, but there's a pleasure to the experience, the developer experience, of working in an environment where things have been built up according to those principles. So what I'm hoping to do in the first article is step back a bit from the technical side and just talk about what it's like day-to-day working in an environment like that. That's my first idea.
Speaker 2:But there's nothing on there yet other than a bit of a blurb. Still, if people want to sign up, it might give me a bit of a push. I might be able to get more time to work on stuff; my partner might give me time if she sees people sign up. I was just going to say that, okay.
Speaker 1:Well, we'll put that in the show notes, um, and I'll plug something of mine. Actually, um, I write a monthly snail mail programming newsletter it's called nonsense, monthly um, and I send it all over the world. I have recipients in england, um, a whole bunch of recipients in the US send it to Mexico, canada, france, all over the world, malaysia, and if you're interested, dear listener, in receiving a programming newsletter in your actual mailbox and I have to emphasize, because a lot of times people don't get it it's a piece of paper that comes to you in the mail. If that sounds interesting to you, in the mail, um, if that sounds interesting to you, you can go to codewithjasoncom and there's a link there that says snail mail. You can click there and sign up. It's, it's a an annual subscription. 50 bucks a year gets you, uh, this, this delightful newsletter in your mailbox. That's awesome, yeah. So, again, paul, thanks so much and we'll talk to you next time.
Speaker 2:Yeah, amazing, thank you.