Arguing Agile

AA229 - Build First: The Dumbest Take of 2025?

Brian Orlando Season 1 Episode 229
Join Product Manager Brian and Enterprise Consultant Om as they peel back the sticky veneer from the "Build First" trend sweeping through product development.

Listen or watch as we debate the message being projected by AI tool vendors (who all have their own flavor of tool to sell you) and ask: "Are we encouraging teams to skip crucial validation in favor of rapid prototyping?"

Stick around for our discussion, which includes:
- How "Build First" creates organizational dependencies on AI vendors
- The death of institutional knowledge and strategic thinking
- Effects of skipping user research and stakeholder conversations
- The token economy trap and hidden costs of AI-driven development
- Practical ways to leverage AI tools without falling into these traps

Whether you're a product manager, developer, or leader navigating the AI hype cycle, this episode offers balanced insights on how to use AI tools responsibly while still building products people actually want.

#AIProductDevelopment #BuildFirst #ProductManagement

product management, agile development, AI tools, build first, user research, token economy, product strategy, tech leadership, AI in product development, product validation

LINKS
YouTube https://www.youtube.com/@arguingagile
Spotify: https://open.spotify.com/show/362QvYORmtZRKAeTAE57v3
Apple: https://podcasts.apple.com/us/podcast/agile-podcast/id1568557596
Website: http://arguingagile.com

INTRO MUSIC
Toronto Is My Beat
By Whitewolf (Source: https://ccmixter.org/files/whitewolf225/60181)
CC BY 4.0 DEED (https://creativecommons.org/licenses/by/4.0/deed.en)

Om, have you been on LinkedIn recently? Sadly, I have. Have you noticed there's a flood of AI tool vendors preaching "build it first"? Yeah. And it's not just the tool vendors; a whole bunch of people have jumped on the bandwagon. Yeah. Well, that's the subject of this podcast: the "Build First" great AI token money grab of 2025. Also, maybe this episode will serve as a nice little time capsule, 'cause I'm sure this episode is gonna upset the apple cart. It wouldn't be an Arguing Agile podcast if it didn't. Did you like that euphemism? I had to go back to 1725 to pull that one. Upsetting the apple cart. Yes, that's right. I feel like the real truth of this stuff is that they've repackaged something that was previously a big thing: just build fast, faster time to market, faster proof of concept. Faster time to market is always a good thing, right? Because you're in a competitive environment, so of course you want to be ahead of the game compared to your competitors. But now where we've come to is: let's just build it and they will come. I'm gonna start arguing early in this podcast. That's right. I don't think it's fair to say this is purely "build it and they will come," just like I'd say it's not fair that it's purely a solution in search of a problem. But it is: skip doing user interviews, skip doing research, skip talking to humans, and jump straight to paying me for tokens. There you go. Paying me for tokens. I know we're gonna talk about this during the podcast, but when you say it's not "build it and they will come," it's really "build so many of them, and hopefully something will stick and somebody will come." That's more like what's going on now.
Well, that's probably the first topic to get into. Topic number one: the great human bypass. AI vendors are promoting a culture of build first. Skip user interviews, skip stakeholder conversations, skip subject matter expert validation, and any other kind of process you have, because process is bad. Knock out a working prototype, throw it up there, get it in front of your customers, and then, magic: profit. Sorry, I had to close the loop; it was missing. It's still a big stretch between those last two steps. So these vendors are saying forget about everything else, including all the things you talked about: involving the users, figuring out what the problems are that you're trying to solve, all of that. And the reason behind that is, they don't stand to gain anything if you're going off on a tangent doing all those other things, validating the need and so on. They only stand to gain when you are feeding their machine. Yeah, right. Their token machine. Yeah. So that's why they're advocating build quickly, speed is of the essence. Well, let's jump right into the best point I have in this category, which is that speed beats perfection in a competitive market, especially if your solution is sexier than the other solutions, more well packaged, your sales pitch is better, you have more people at the customer sites than the other vendors, whatever. Speed trumps everything. There was a saying; I can't remember where from. I might be remembering Boiler Room, or I might be remembering Margin Call: be first, be better, or cheat. I think I'm actually remembering a movie. Be first, be better, or cheat, or die trying. I don't know.
So I think there's something to be said for trying to accelerate your product to market, right? But that said, there's an assumption here that you've built a product to satisfy a need. What I'm seeing now out there is: forget about all that stuff. Just make something and immediately launch it, then immediately go make something else and launch that, and make something else. It's basically akin to throwing mud at a wall and praying that something sticks. The odds aren't with you, but the cost of producing these multiple variants is much lower now, supposedly. And again, this is a bit of a dichotomy I want to talk about in this podcast. Yeah. I mean, the immediate pushback there is about the LLMs and whatever you're building this with (theoretically Cursor, Claude Code, whatever) to get to the demo quickly. What they'll always say is, well, you didn't do it right: if you feed the LLMs the right context, they can understand patterns that maybe humans will miss because of human bias or whatever. And you want to be putting something functional that people can click in front of people way faster than anybody going through a more, quote, traditional process would be able to keep up with. That's the underlying theory here, and I'm gonna get behind that a little bit, 'cause I do believe in it to a point. The pushback right away in this category is the normal stakeholder feedback cycles, especially in larger organizations. I feel like that's where the pushback lands. Yeah. One of the loudest people here was Madu Guru from Google, one of the first people who kicked off this discussion of build first. And of course he's selling AI tools for Google, right? There's that.
But then a bunch of other people selling AI tools piled on. And again, this is why we're looking at this, because we're not selling you any AI tools, you know? The other points for this category, if I were to represent their side, the Madu Guru side of this: first of all, humans might miss patterns, and because the LLMs can chew through so much data so quickly, they can find the patterns. And I would also say you could still do your interviews and everything you do, and just have the LLM read from that and act as an assist to help you. But that's not what they're saying. That's not the way they're presenting it. I've got one more piece of feedback before I turn it over to you: the interpretation of what comes out of user interviews could make those interviews unreliable as data sources, especially in large organizations where it's not just purely responding to what the user says; it's politics as well, you know? Yeah. And there are other nuances there too. Like, which representation did you get from the user set? Did you get users across the board, or just a specific type of user? So there's that. But all of that said: ignore the input of users you actively solicited at your peril, basically. At your peril. Now it's time to cast off the Arguing Agile "let's represent both sides" hat and talk about solving problems that don't exist. Because I would have thought we got our fill of this in the nineties and the early two thousands: knocking out code for multiple months, holding the code, not deploying it, not showing it to anyone, and then coming out like, look at my nine-month-long release. Yeah. Big bang release. Yeah, yeah, yeah. You know, we had a whole movement about this.
We had a whole twenty-year movement just on this subject: you're solving problems that nobody ever asked for. I know I've said this on other podcasts before: as a product manager, no one will ever tell you. I mean, unless you're getting feedback from a product manager, and then you're gonna get ripped to shreds. If you ask me, "Brian, what's your true unfiltered feedback," I will give you feedback. But the typical user of your software, B2C or B2B, especially B2B, the feedback they're gonna give you that lets you know they hate your product is: "It's pretty good. It's okay." Yeah. We often hear that. "It's okay. It's good." That's it. That's pretty tepid, right? That's what you're gonna get from them. Yeah. If you are solving problems that don't exist, you could be led down a real bad road with that kind of feedback. Indeed. So two things come to mind here. If you're not even validating what you're solving for, and whether that's a real issue that needs to be remediated in some way, you could miss the market opportunity completely. Your competitor is gonna overtake you, and they're not necessarily the ones churning out a hundred prototypes a day. Maybe they're the ones taking their time and getting it right, and they will overtake you. To me, that's who is going to win the race. The other side of this one is that user interviews basically cost you zero. I mean, I guess you could be at some companies where you give people Amazon gift cards or something like that, but it's a pittance. It's minor compared to the learnings that you get and the connections that you make. And you're not really getting a lot of stakeholder buy-in in the world where you're just punching out prototypes.
But all those are minor to me. The bigger one, the one I just can't get over, is that first of all, you could easily go down the road of solving the wrong problems. Even that one I could put aside, because in normal software development you could do that too, with politics and inexperience driving a bad decision, no subject matter experts involved, that kind of stuff. But the user interviews cost you near nothing. And then you, as a product manager, you vibe coded this tool and you're thinking, oh, this tool basically works, my developers just gotta pick it up and throw it into AWS or whatever; I don't understand what the blocker is here. Now you're building stuff, spending real money, and you're completely on the wrong path. You are really trying to hit a bullseye on a dartboard from 30,000 feet. You'll be lucky to land in the same zip code. Now that the product manager can churn out barely functioning front-end code, their biases just get amplified to a million, because before, at least a development team would be there checking you, or your UX person, or the holy Marty Cagan trinity with the designer. Yeah, yeah. Now it's just you getting prototypes in front of the customer and saying, but the customer told me this is what they love; they put their name on the waiting list for Brian and Om's cool new website. Forget Brian and Om's existing website where we have all the development; no one works on that anymore. Everyone wants our cool new website. It's just mind-numbing to me that people are actually thinking about this as a viable approach to doing business. If it's got guardrails on it, I'll quit my complaining. But don't pitch it like it's free. Don't pitch it like, oh, the development team costs so much money and this is completely free. It's not free.
There are tokens attached to this, and that token usage could potentially bankrupt you if you're not careful. Seriously, it could be a lot of money. Yeah. And if you've got a few product owners doing this by themselves, it could really add up quick. So, feeding the token machine: that's where we're at, Om. We're feeding the slot machine, AKA the token machine. So "build first" is ten out of ten for generating vendor tokens, I guess, or the royal flush of "give us all your money and we'll flush it down the royal toilet." I guess. And then one point for building things people actually want. I don't know how well we represented both sides on that one. We tried. Yeah, we tried. I'm not saying we tried very well, but we did try. So, point two: the token economy trap. You already heard me talking about tokens, and I'm not gonna stop harping on this one. "But Brian, give it up. Tokens are cheap. The token costs are only gonna come down." When new technology comes out, this is what they always use to get their foot in the door. "The cloud's only gonna get cheaper, Om." Right. Okay, there are several CTOs out there who would argue with you on that front and say, what about the total cost of ownership? "We just need to mass-produce these CPUs and they'll get less expensive." Maybe they will and maybe they won't. Think about it from the perspective of the manufacturers of these things, be it hardware or the AI engines: why would they keep making things cheaper and cheaper? Bananas. It's absolutely bananas to even think that. Another version of this I've heard AI folks use is: well, as better models get created, the tokens will get cheaper for the older models.
And it's a great arguing point, and it would be a great arguing point with me if it were true for any model that ever existed, in any instance. It's never true. The new Claude Opus model is a factor more expensive than any of the other models, and the newer ones are even more expensive than the 3.x Opus models. So it's not true. This is the old factory thinking: if my factory can punch out a thousand widgets a day where previously it was a hundred, then my cost of manufacture drops to ten percent, so I can cut what I'm selling them for by a significant percentage and undercut all my competitors. That's true in manufacturing, when you're punching out widgets. With tokens, it's not. And I'm not even bringing in my heaviest-hitting point in this argument, which is that companies are just greedy. They just need more money. So they're never gonna give you a discount; they're gonna make you pay maximum price. That's the point. And if they look like they're doing something out of the goodness of their heart, like letting every government employee access their site for a single dollar (the entire government has access for one dollar), then for the unmetered token usage beyond that, they charge you at the token rate under the hood. Oh man, the marketing is so good. I think it should be Brian and Om's marketing company. Oh yes, I like that: Brian and Om's AI marketing company. We haven't had a company since we sold our yachts. Ooh, it's a sore subject. Yes. So, to your point about the newer models costing more: more tokens, more this, more that. The way the vendors justify it is, these models are better. They're bigger, they're more chocolatey, right?
Oh yeah, it's now a thirty-gazillion-billion... not dollars, but you know, this model is so much bigger and better, so that's why we're charging you more. That's right, it's got turbo engines on it. This is how the selling point happens. Bill Gates told me Windows 95 was gonna be better and faster with better access to the internet, and I'm still waiting. I was lied to! Yeah. So the bigger models are more expensive, but that doesn't necessarily mean more for your purpose. Maybe a small model is sufficient. You never know. Yeah. But I think the whole ethos of trusting the vendors to make things available to you at a cheaper cost because technology evolves, I think that's a fallacy. Just think about it from their perspective: what do they have to gain? Well, what would you say if I told you that, yes, you are paying more for token use, but the return you're getting is that your team is a factor more productive? Even though I don't have any stats about how productive your team is, there should be a straight-line extrapolation there. That's the payoff, right? We're getting the speed, et cetera, so we're getting more efficient. Again, there's not a whole lot of evidence for that. Well, it isn't true, because the result you get is directly proportional to the quality of your input. Actually, I'll tell you what you get from more token use: you get more lines of code. More lines of code, yeah. And if your measure is software lines of code, SLOC... For some people it is their measure, exactly. Yeah, and those people are the ones that are gonna go around saying this is great. Okay, while we're at the casino and we're paying these tokens, let's talk about vendor lock-in, because vendor lock-in is something that's fallen out of the normal nomenclature; nobody talks about it on podcasts anymore.
But once all your developers are onboarded with Claude Code or whatever they end up using, good luck changing that out. It sits in the heart of your system and has access to everything; good luck swapping it once you're married to it. It's huge to change that. This is the equivalent of going to a casino and having different chips at each of the tables: you get different chips at the craps table versus roulette versus blackjack. The chips you get at the roulette table, you can't just walk over to the blackjack table and use them there. You're stuck with the roulette table. That's really what this is. Yeah. There's a lot of stuff I could bring up in this category! I think the main argument against here is that you're gonna need a bunch of iteration, burning tokens, to get to where you want. And because the technology is new, you're probably not tracking token use against actually producing value. Traditionally, when people try to measure development productivity, they have a bunch of terrible measures, but nobody has a pure "number of hours that went into this feature, and number of dollars this feature got us." Maybe token use can be connected: this is the feature that brings us the most value; here's how many tokens went into producing the code around it, or something like that. I don't know. It's a simple equation; the complexity is the variables that go into it. For example, the token usage and the value you get back are directly a function of the quality of your input: how are you prompting the thing? If that's good, great. If it's mediocre, not so good. If it's poor... So it's all over the map, right? It just depends on your maturity on the prompting front.
So would you try to connect the accomplishment of your business outcomes, or maybe some sort of user satisfaction, to the token-to-value ratio? That's kind of what I'm talking about. I don't even think there is a way to connect this, but: whether tokens were spent to achieve a business outcome, what that business outcome was, how much people like the feature, or how much revenue the feature brings in, you know what I mean? I'm trying to figure out a way to say, hey, this AI coding is a valuable tool to my developer team. Again, that's not the way the people pushing this stuff would want it portrayed. Sure. They'd say it's a replacement for all your... No, it's not a replacement. It certainly isn't. But is there a metric there that I can use to ask, is this stuff being useful to me? That's a great question. Is it a feeling? I think for the most part it is. So how is it different than just the cost of doing business, meaning the cost of electricity and utilities you're using to get to that point? What is the direct correlation between token usage and value achieved? It's very difficult to quantify. Yeah. One thing you could do is say: if you had done something similar in the past, before using tokens and AI and all that, how long did it take you to get to a first release? Compare it to now and say we got there three times quicker, or two times quicker, or whatever it might be. That'll give you some idea. But I don't think you can derive a formula that you can apply to future endeavors and say the same formula will hold true if we use AI in the future. For one thing, your experience curve with AI is getting better, right? Mm-hmm.
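If you did want to attempt the token-to-value ratio being fumbled toward here, a minimal sketch might look like the following. Every feature name, token count, and price below is invented purely for illustration; as noted in the episode, attributing revenue to a feature is the hard part, not the arithmetic.

```python
# Hypothetical per-feature ledger: tokens burned building it vs. revenue attributed to it.
features = {
    "checkout-redesign": {"tokens": 4_200_000,  "revenue": 18_000.0},
    "csv-export":        {"tokens": 900_000,    "revenue": 1_200.0},
    "ai-chat-widget":    {"tokens": 12_500_000, "revenue": 2_500.0},
}

PRICE_PER_MILLION_TOKENS = 15.0  # made-up blended token rate, in dollars

def value_per_dollar(feature):
    """Revenue returned per dollar of token spend: a crude token-to-value ratio."""
    spend = feature["tokens"] / 1_000_000 * PRICE_PER_MILLION_TOKENS
    return feature["revenue"] / spend

# Rank features by how much revenue each dollar of tokens bought.
ranked = sorted(features, key=lambda name: value_per_dollar(features[name]), reverse=True)
```

With these invented numbers, the ranking surfaces exactly the caveat from the discussion: the feature that burned the most tokens ("ai-chat-widget") returns the least value per dollar.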
So things will be expected to be different, and that's if the assumption holds true, because there is an assumption that token usage will become cheaper, right? So you factor that in as well. But like I said, it's not true. It's not true, but I like to dream. Yeah. So it's difficult to have an equation that says this is the return based on the spending. It's very difficult. Let's wrap this category. I think it's up there with developer productivity and these things that can't really be measured. But if you armed your developers with these tools, you probably have some kind of customer satisfaction or speed-to-delivery metrics that you could look at before and after. Hopefully. If you have these metrics from before, it makes it a lot easier to measure afterward. Which is a terrible thing to have to suggest, because there are people out there right now who are like, well, we don't have anything. I mean, if you have lines of code, you're living the life, 'cause you can generate all the lines of code in the world. It's so interesting: if you have lines of code, one of the things you could gravitate to is the number of defects per hundred lines of code, or per thousand, whatever. Pick your baseline: the code being generated by AI versus the code being generated by human brains. Oh no, see, we fired all our QA people, and now we have no defects to count against lines of code. Yeah, no problem. I'm gonna say the token optimists, the people who say just pay for the tokens, just believe in it, it's gonna set you ahead: they get eight out of ten for believing that the math works. Eight out of ten tokens. They walk into the casino and they just believe that they're gonna be lucky, that they're manifesting it. They're the "if they dream it, they can make it happen" type of people, you know what I mean? Right.
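The defects-per-KLOC baseline floated above is trivial to compute if you tag code by origin. A sketch with invented counts; note the catch it exposes, which ties back to "more tokens buy more lines of code": equal defect density over a much larger generated codebase still means far more absolute defects.

```python
def defects_per_kloc(defects, lines_of_code):
    """Defect density: defects per thousand lines of code (KLOC)."""
    return defects / (lines_of_code / 1000)

# Invented counts, purely for illustration.
human_density = defects_per_kloc(defects=12, lines_of_code=8_000)   # 1.5 per KLOC
ai_density = defects_per_kloc(defects=45, lines_of_code=30_000)     # 1.5 per KLOC

# Same density, but the AI-generated codebase ships nearly 4x the absolute
# defects, because the extra tokens bought nearly 4x the lines of code.
```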
They're that kind of people. They're very woo-woo, and they get eight out of ten because they just wanna believe in it. Three out of ten for the people who understand probability; you don't want to be near them at the casino. Right. Let's see, token skeptics: they get seven out of ten for recognizing the scam. They're ready to play three-card monte along with the AI companies. And then four out of ten for you and I, for looking for alternatives in the mix. Cool, I'll take the four. Yeah. We're not gonna make money at that, but maybe we'll get free drinks and break even at the end of the night. Always worth playing for. The validation shortcut disaster is the next category. The build-first culture encourages teams to skip the cheapest validation methods, which is just talking to people, and, in lieu of talking to people, doing surveys and gathering usage data from logs to back up our bets. You can skip all that; you don't need it. Just demo. You don't need any feedback. If they don't like it, we'll just give them something else, because it's so fast to produce something. Produce five demos. It's asinine to think that you could dispense with feedback from people who actually will be using, or are using, the product. To me, it's just a non-starter. I can already hear the Brian-CEO feedback, the Brian CEO who was a former salesperson. He'll tell you, in that confident, slightly Southern accent: "Om, you're wrong, and I'm gonna tell you why. Because market timing requires speed over perfection. Okay? We can't be talking to all these people. We need hands on keyboards, Om." Boy, it's been a long time since that guy showed up on the podcast. Yeah, it's been a while. Great.
If you are really hanging your hat on that one, I'm gonna keep my resume updated, because honestly, I look at the medium to long term, and I don't know how you survive by simply throwing so many products out into the marketplace, hoping that somebody somewhere is gonna like something. And what happens if you are correct? Let's say out of every ten products you put out there, five, six, seven get some level of traction, maybe not the same level. What are you gonna pursue now? What are you chasing? You have no idea. I'll tell you, I'm gonna stay on the sales side of this one, because it's way too fun and you have an expense account. Listen, if you get out there and vibe code me something that brings revenue into the company, you're more valuable than any developer or any engineering manager or any solution architect, any of these technical people I pay. And the fact that you did it without knowing how to program? Extra bonus. That's going straight to the top. You're gonna be the engineering manager tomorrow, kid. Put you in, coach. Never coded a day in his life? That's not required for VP these days. Everybody who's already here, they're all part of the problem; they can't think outside the box. You're out there contacting customers and throwing out prototypes and letting them log in and click stuff, and they love it, and they're signing up on waiting lists and signing up to give me money hand over fist. That's what I'm looking for. Spoken like a salesperson who's gonna get their commission before the ink is dry on the SOW or contract. Listen, long term, this is a hollow strategy. It will implode on itself, because when you put something out there that has been vibe coded, can you stand behind that product and support it? Is it going to stand the test of time when a user starts to click a little bit outside of what they see?
You don't have any QA, so nobody tested any boundary conditions, and nobody tested edge cases. Yeah, great, 'cause those don't happen in real life. I mean, come on. But you know when the pain comes? Later. Yeah, and not that much later. You, Mr. Salesman, will be out in Maui spending your commission. So yeah, it depends on your motivation! I want to say that a better version of me had well-thought-out pushback for that, but no, you're not wrong. The organizational goals have been met; our salespeople got paid. The real pushback here is: you can do all the user testing and user research you want, but you need to get me to revenue. I'm thinking about businesses I worked at in the past where the business gets in trouble and then the pressure is on. I rail on this podcast all the time that your product people and your salespeople should always be together, holding hands. It's a little weird; arm in arm going to the customer, then. You don't necessarily have to hold hands. I mean, you can if you want, but the salespeople and the product people should be one and the same. You're both doing pitches, you're both trying to represent the product, you're both trying to solve pain points. There really shouldn't be a division. Whatever gets us to revenue: that's where the real signal is. That's the best pushback I have in this category: I'm getting us to revenue. What we had in the past is dying off; what we had in the past is not hitting with customers anymore. And if I can vibe code something up that is maybe still similar to our software, but people can use it and see the vision and say, I absolutely need that...
"I would definitely sign on the bottom line; you let me know when it's live and I will give you the check." If I can get to that kind of commitment, then this is a good thing. Under those conditions, I would say: if you're leading with doing the discovery, figuring out the right problems to solve, and validating that there is a need in the market for the solution you're trying to create, and then using vibe coding or whatever it might be as a tool, I'm okay with that. Yeah, these are just tools. Before spreadsheets, people were doing things manually on calculators, and before calculators they were using the abacus. So these are just tools, and I'm okay with the approach of using tools to accelerate time to market. But I'm not okay with dispensing with all that discovery up front and saying, just build it, 'cause it's cheap to build; if we get it wrong, it doesn't matter, it was cheap, so we'll build something else. That doesn't sit well with me. Yeah. One argument I'm not gonna have a counter for in this category: if your product requires regulatory compliance or anything like that. Not governance in the sense of the four-letter word that I normally don't like, but actual "you might kill people with this vibe-coded solution" regulation. If your product needs actual regulation, this obviously fails for that. And if you are segmenting a market, you can vibe code something up, but anyone who has ever launched a product trying to segment an already crowded market, saying hey, our product does what everyone else's does, but ours does something special, knows there's a certain amount of being in the market, doing interviews and research, that is part of your marketing effort to educate and try to divide and segment the market.
To carve out a niche for your new product, or maybe your slightly redesigned product, or relaunch of existing products. This is the Coke Zero thing, right? It is, yeah. So you have Diet Coke and you have regular Coke. Yeah. Coke Zero also has zero calories. Exactly, exactly, yes. But in an appeal to those people that object to Diet Coke. Yes. So, yeah, absolutely agree with that. So you're gonna vibe code up Coke Zero. I don't know, there's like a 50% marketing, 50% user engagement type of activity that's going on. Yeah. But it's your real product team, your real development team that's working on the software, right? It's not smoke and mirrors. It's a real effort. And the more people you get out there and talk to, and do, like, the long version of user research, the more opportunities you have to start segmenting the market, to be like, oh, this is the hot new term in the market, and everyone's gotta have it, and your product already has it. Zero sugar is the new, you know what I mean? If we're gonna stay in the beverage space. Yeah. Like, zero sugar, zero calories is the new hot marketing thing that everyone's gotta have, and then everyone starts copying you or whatever. You could say, like, well, you can still do the vibe coding and stuff, and still do all the traditional research, and get to the findings of traditional research faster with happier users, because they have stuff they can put their hands on with these vibe coded solutions. I guess you could say that, but the issue with the nuance I just threw out, the issue is, like, the way the message is being projected by all these people trying to sell you an AI tool. They're not projecting nuance, and they're not projecting what I just threw out, which would actually try to help you: vibe code your thing, but have strong UX research chops. Yeah. And then move forward. It's not being projected that way.
Maybe I'm reading too much into nuance. No, I don't think you are. I think it's being projected as: buy my book. Yeah, yeah, yeah. Come to my TED Talk. Get rid of all of this stuff that's distracting you from buying tokens from me. No, no, that's what it is: keep vibe coding, because the more you vibe code, the more tokens you're gonna need. And yeah, my stuff's going higher and higher and higher, my stock price and all of that. It's just extremes, really, I think, at the end of the day. So, validation checkpoints: implement validation checkpoints before major token spending. Set dollar thresholds, like the same thing you do in AWS where you set alarms based on spending, stuff like that. Thresholds. Yeah, exactly. Yeah. This is a good one. Like, be all in on AI tools, that's fine, but set some limits, set some thresholds. At least set some check-ins to be like, when we're spending at this rate, what are we getting? I mean, that's basically what this category is. Like, hey, these lessons that you're getting from the market, especially when you're doing market segmentation. If I had my sales team out there, just deployed around the globe. Or let's say I was just selling in the US, right? And I had my sales team deployed all around the US. Like, I hired a sales team per region in the US. Maybe I split the US into, like, eight regions, six regions, whatever, and I hire a sales team per region so they can be within a two-hour flight of any client site that I have, or whatever. Some companies, this is a pretty typical thing. Yeah, that's very standard, yeah, a pretty typical thing. Like, the money that I pay to power my sales team, and then all their expense reports and all that kind of stuff, like, there's a budget there. There will be a norm that emerges of, like, if I want to engage prospects, it costs this much, generally, to engage a prospect. This is the same thing.
If I want to go do market research, punch out prototypes, interview people, it already costs a certain amount. Like, if you're not tracking how much it costs to do the job without the AI tools, you should do that. First of all, to say: this is discovery work. Like, go read Inspired, read all the work of Teresa Torres, figure out what discovery work is, then figure out how to budget, to say: this is my budget for discovery work. Not like, only this many dollars can be spent, just, what do you spend on it, you know what I mean? I'm trying to say, figure out what you spend on it, and then when you figure out what you spend on it, your AI stuff will add onto that budget, okay? And then set up your alarms. Yeah. I think if you are in a company that is not even tracking this, and they're just giving their employees a free hand to use these tools as they see fit, it could be a slippery slope. This could be the gambler's fallacy, right? Yeah. I mean, you just keep gambling 'cause you're gonna win the next time, and it never happens. So, yeah, I think it's good advice to make sure that you're metering this in some way. But also, when this product reaches maturity and you've already got traction in the marketplace, you could easily say what component of that was AI spend. Whether it's in doing the initial analysis of user requirements and things like that, the validation of need, or it's the use of tokens by developers to try and accelerate the product development, right? You can combine all of that, and now you have a revenue number to go in the other column, and for the future, this informs what your investment should be. Because obviously there are other variables, right? You know, if the market potential is bigger, you might wanna up that, but yeah, it's better to have a starting point. Yeah. That's a great point.
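The "set thresholds and alarms" idea from the conversation above can be sketched in a few lines. This is a minimal illustration only, not any vendor's real billing API: the class, the per-token pricing, and the threshold fractions are hypothetical stand-ins, and a real setup would hang off your AI vendor's usage reporting rather than manual `record()` calls.

```python
class TokenBudget:
    """Track AI token spend against a discovery budget and fire alerts
    at fractional thresholds, the way AWS billing alarms do."""

    def __init__(self, budget_dollars, thresholds=(0.5, 0.8, 1.0)):
        self.budget = budget_dollars
        self.spent = 0.0
        self.thresholds = sorted(thresholds)
        self._fired = set()  # thresholds already alerted on

    def record(self, tokens, dollars_per_1k_tokens):
        """Record a usage event; return any newly crossed thresholds."""
        self.spent += (tokens / 1000) * dollars_per_1k_tokens
        crossed = []
        for t in self.thresholds:
            if t not in self._fired and self.spent >= t * self.budget:
                self._fired.add(t)
                crossed.append(t)
        return crossed


budget = TokenBudget(budget_dollars=500)
# 2M tokens at a made-up $0.15 per 1k tokens = $300, crossing 50% of budget
alerts = budget.record(tokens=2_000_000, dollars_per_1k_tokens=0.15)
print(alerts)  # -> [0.5]
```

The point of the sketch is the shape, not the numbers: the AI spend is metered against a pre-agreed discovery budget, and crossing a threshold is a check-in trigger ("what are we getting at this spend rate?"), not an automatic shutoff.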
The size of your market, like, you definitely should adjust the budget, because you're fishing in a much bigger pond. Sure. With bigger fish, and/or mammals. Whales. If you're not thinking this way... this is very businessy. We got into a real business-y kind of episode. I don't think you can avoid it. Just look at the money you're putting in versus how much it takes to bring a prospect in. This is the stuff that customer acquisition cost is made of. Yes. The CAC is a certain slippery value, of like, oh, people onboard and do whatever. No, it's like, you can put a hard dollar to all of the activity that went into getting the typical customer, and get your real costs. Because again, you're still paying all that, but the AI is adding onto the top of it. In a remote-first world, those costs are cheaper, 'cause you're not paying for hotels and stuff like that, depending on your industry. Yeah. But again, with the uncontrolled token use, it could still get, I mean, it can run rampant, actually. Yeah. It can run wild. Yeah, you wanna cut a Macho Man promo? Run wild, yeah. Scoring. Let's see, scoring for this category. This was a good category. I think the market validation through real use, versus learning these lessons that are expensive and you don't have a good handle on them anyway. We're gonna use the AI cooking show scale for this, to say that all of the people on the side of build first, they get a nine outta 10 for the presentation of their food. But because nobody is hungry, we're gonna reduce that down to two outta 10. So they fail the taste test. The people that are validating, they get a solid six outta 10. Nobody's, like, super happy, but also they're turning in a quality, consistent meal every time. So they're gonna take this category, I'm just saying. Circle gets the square. Cool. We talked about individual teams burning tokens like crypto bros in 2021, but let's zoom out for a second to get the bigger picture.
Let's talk about AI vendors and how they're systematically capturing entire organizations. Or are they? Like, that's the organizational dependency scam: AI tool vendors creating organizational dependencies to make it increasingly expensive to operate without their platforms. So I'm kind of on the fence about this one. This has strong 2000s IT budgeting vibes, of, like, needing to buy certain tools. We talked about it a little bit earlier on the podcast, like vendor lock-in. Which, again, the pros in this category of AI tools: they create a genuine competitive advantage, which I'll sign on to, I'll argue this one very easily, because personally, like, all my personal projects and stuff like that that I code, I use AI in the side saddle generating the code. I have some guidelines that I've been very successful with, and it is a development productivity multiplier. That's a very real thing. Like, you're shipping much faster than a team that was purely just writing everything the hard way. Yeah, yeah. There's no doubt about that. The machines can really do things very, very quickly. So if you're harnessing the power of AI, but not blindly, you're doing that intentionally, then definitely you have that competitive advantage. I think we're rapidly approaching a time when pretty much everybody's harnessing it. So now, where's the advantage? So now you gotta look beyond just using AI. It's harder. Do you really use it in a way that is either saving money or making money? I think that's where the advantages will come from going forward.
Point taken. The vendor switching costs, like, if I'm gonna push back on my own point about vendor switching costs, I would say, yeah, if you're all in on Claude Code or something like that, or if you're all in on Claude as your model, and it's in the middle of your programs. Like, you do something with Claude, you pass off to Claude for a decision or something like that, and then you keep going. And you haven't made this a business decision, like a person decision. Like, if you have an AI agent in the middle of your workflow somehow. Maybe I have some B2B SaaS application, like order processing, customer service, something like that. You know what I mean? Sure. I'm thinking something like that. Whereas they're like the traffic cop, of, like, oh, this ticket has all the right fields, I'll allow it to go through to the next person. Or, this ticket doesn't have the right fields, I redirect it. Or something. We do that with rules now, like real complicated business rules and stuff. But the AI could look at it and make a human judgment, you know what I mean? Or something near to a human judgment, things like that. Once you integrate things like that, with LLM templates and temperatures and stuff like that, now you're kind of locked into a model, and it would be difficult, just knowing the way that I've seen people adopt AI in business. Not a lot of people are building so that all of your prompts are completely componentized, where you could take them and flip, to be like, oh, OpenAI is our vendor today, and tomorrow we're gonna separate from OpenAI 'cause we got a better deal with Claude, so we're gonna flip over. Like, if you did that, you'd have to revise your prompts, revise your inputs and outputs. Let alone the actual passing off to the right, yeah, to the API. The, yeah, vendor lock-in.
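The "componentize your prompts" idea described here can be sketched roughly like this. The vendor keys are real company names, but the templates and the lookup shape are hypothetical illustrations; a real integration would sit in front of each vendor's actual SDK and handle inputs and outputs per vendor, not just the prompt text.

```python
# Prompts for one logical task ("triage a support ticket"), keyed by
# vendor, so swapping vendors is a config change, not a code rewrite.
PROMPT_TEMPLATES = {
    "openai": (
        "You are a support triage bot. Ticket: {ticket}. "
        "Reply with ROUTE or REJECT."
    ),
    "anthropic": (
        "Triage this support ticket and answer ROUTE or REJECT.\n\n"
        "Ticket: {ticket}"
    ),
}


def build_prompt(vendor: str, ticket: str) -> str:
    """Resolve the vendor-specific prompt for one logical task."""
    try:
        template = PROMPT_TEMPLATES[vendor]
    except KeyError:
        raise ValueError(f"No prompt template for vendor {vendor!r}")
    return template.format(ticket=ticket)


# The workflow code only ever calls build_prompt(); which vendor it
# targets comes from configuration.
print(build_prompt("openai", "Order #123 missing fields"))
print(build_prompt("anthropic", "Order #123 missing fields"))
```

The design choice being illustrated: the workflow never embeds vendor-specific prompt text inline, so "we got a better deal with Claude" becomes a change to a template table rather than a hunt through your codebase.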
Maybe, like, five years from now, everyone will be like, well, why would you ever customize your stuff to one model? Why would you not componentize it, to where you go to the box and it says, what model are you using? And depending on the model, it uses the proper prompt and models and all that kind of stuff. Most people don't do that. They build completely to the stack they're on. That's true today, yes. And in the future, yeah, people could perhaps go down to that level of componentizing the work, right? But I also think the other side of it is true, which is, some of these AI vendors will get smart and enable switching from other vendors to theirs. Right. You know, so they have to build specific tools that you can harness, right? Yeah. But they would make it easy on you as the user. Is that like the Android feature, where they know if you're moving from an iPhone to an Android, so pretty much they give you a stack of tools that migrates all your stuff? It's pretty much like that. I was gonna say, welcome aboard. You know, when you see that stuff emerge, you know that's the end, that there's no more ideas. Right. Yeah, that's the end of the product roadmap for me, I think. It's coming at some point. I welcome it. I think it would be a really cool feature. If you are an AI vendor, you wouldn't think about all this day one. This is not exactly in your MVP. It's also not sexy, this kind of work. It is not. But the payoff is big if you can get a certain number of users over. Mm-hmm. Like, poach them, whatever the actual term is. Yeah, convert them. Yeah, exactly. That's right. They may well do that in the very near future, I suspect. The takeaway for this category says: maintain AI-free zones in critical workflows. Ensure the teams can deliver core functions if vendor relationships end or pricing becomes prohibitive. Which I think is a great ideal. It's a great goal. It's a great shiny ivory tower to shoot for.
However, I just, I can't, I don't know anyone that designs this way or builds this way. Everyone basically selects one AI vendor and goes all in, pushes all their chips in. I dunno why the casino metaphor has just taken over this podcast. It's not been in any of the planning that we did. It's just taken over, the whole casino. It organically emerged. Yeah. It's so weird. So switching costs are interesting to me, though, because, you know, if you are a smart AI vendor, you'd wanna reduce the switching cost, right, from your competitors. And I haven't seen anything to that end yet, but I dare say you probably will. I don't think they've figured the tech out. Yeah. Yeah, that's true, this is moving so quick. Yeah, I don't think so. I don't think they've figured the tech out. But also, like, keep your resume updated. The other thing to talk about here is the death of institutional learning, which is the idea that your build first culture, combined with AI tools, is creating organizations that can produce code quickly, but they can never learn from their mistakes, or from talking to people, or, you know what I mean? Well, maybe it's not fair to say talking to people. They can never learn from their mistakes. Mistakes, meaning, like, build something and then correct, and go in a different direction, look at metrics, kind of be adjusting constantly. Traditional software development has it where you do learn a lot from your mistakes, every time a build doesn't compile. But with vibe coding, that doesn't happen anywhere near as often now. Yeah, right. And when it does happen, you simply do a little tweak and it fixes itself, and you don't really know what it did in order to remediate the defect that was there. So yes, I agree with this. I feel like this is an issue that's going to plague our industry in the next two to three years. Sure. It's the dumbing down of, you know, corporate software development. All right, let me push back again.
I want to. Sure. It's hot in here today. The AI tools democratize development knowledge. The pro side of this is like, we're democratizing software development. Om, you're wrong. First of all, we're democratizing this. What a great phrase, right? Democratizing software development. It is good marketing, isn't it? It is great marketing, but it's also a bunch of BS. Oh, because it doesn't do any such thing. Are you saying marketing is often not true? Marketing spins up the versions of reality they want you to see. So this idea about losing institutional knowledge, even domain knowledge, is valid. I think that's a real issue, and I think that's a big threat to the long-term survival of the industry as a whole, because you're gonna get people that all they know how to do is vibe code. Well, let me try to put into words, I'm gonna, you're welcome, Julie, I'm gonna try to repeat back to you what I think I heard. And what I think I heard was a couple points, actually, that you threw out. Yeah. In the rapid fire round: teams lose understanding of why things work. When you have the democratization of vibe coding solutions, where the people generate a function that goes and does thing A, the business output A, but then the team doesn't really understand how the function works, you know what I mean? It doesn't really encapsulate all the business logic. They feed it back to the machine when the customer complains, and then the machine gives them the new version of the function, and they copy-paste that in. Institutional memory becomes externalized to the vendors, because nobody really understands this code, and because nobody ever talks to the customers, and it's like, oh, it seems to be working this way, I don't really understand. And they put the plain English into the chat. Yeah, which, by the way, the chat is a session that one person has.
So again, assuming the people leave the team, or stay on the team, or leave the company or whatever, that chat session is gone. So you never know what the thinking was that went into vibe coding up that solution. So there's a couple things here. Number one, there's the understanding of why things work, which was never explored in the first place. Number two, the institutional memory that you get, you basically offshored that to the AI companies. And then, you didn't say this, but I wrote a note quickly while we were talking: if those two things are true, you can't debug any of that code. Like, good luck maintaining it. You're just gonna send it off to an AI and be like, debug this, I don't understand. You know what I mean? And then all bets are off. Again, a casino metaphor, by the way. All bets are off if those people don't even work at the company anymore, 'cause now not only do you not know how the code was created, you can't maintain it. Now you need to just throw it to the AI and be like, hey, AI, just figure this out, I have no idea. So the scary part about all of this is, you're at the mercy of the AI, right? So let's say, by the time you actually have an issue that needs maintenance, and you don't really understand it anyway, you pass it over to the AI. It's not the same AI that built it in the first place. It is now a different AI. Not a different vendor, a different model, perhaps. The new, beautiful, improved model. Yeah, maybe, but it works differently, logically speaking, so it's gonna come up with a solution that may not be the best solution to the problem at hand. That's a huge risk to me, that you're relying on something external, much less a machine. You're relying on something external for your core business, right? I mean, I don't think it gets scarier than that for me. Right.
I mean, what would you say if I said: listen, Om, the experimentation cost, the actual cost of developing a prototype we can just throw away, that cost is super cheap. Yeah, maybe it costs 20, 30, 40 bucks, but that's super cheap compared to our normal process that we go through. So if we score a home run, great, I'm willing to spend that money. And whatever your pushback to that is gonna be: this real-time feedback, with things that people can put their hands on, it's way faster to generate that stuff in the AI world than it is to go through the normal cycle with the retrospective analysis and everything that we do. Like, the process is so much faster that I'm willing to pay for inefficiencies, because it's just so much faster. So I'll switch analogies briefly here. You said home run, so we'll go to the baseball analogy. So basically what we're saying is, we can still bet on baseball, right? Yeah, we can. Sorry. Yeah. It's not illegal. You stand at the plate and you're swinging away with your eyes closed, as fast as you can, right? Mm-hmm. And you're just basically hoping you're gonna hit something. So for a home run, yeah, it's great, it's cheap. But for that home run, you've missed so many, right? Sure. There's been so many. Or the balls, strikes, I don't know what they are in baseball. Well, basically when you don't connect, or you hit it. Sure. So that's not the best way to go, because you don't have a sure bet. Now, do you have a sure bet if you're doing things the right way? Yeah, pretty much. It's a known thing, right? This has been a proven phenomenon: if you go figure out the right problems to solve, and then solve those, you have a much higher chance of success. I was hoping that we could get outta this category. I don't know why I'm talking like Morgan Freeman right now, like the voice of God.
I was hoping that we could get out of this category without you bringing up the point that your strategic thinking is really atrophied in this environment, where you're just doing this tactical vibe coding of a prototype and whatnot. And, like, the big picture, the systems view of how everything fits together? Boy, if we start pushing product managers into this box, of, like, just tactically get in there and code stuff, kid. Like, if you can't code, and we're gonna have coding interviews now for product managers, it's like, okay, well, who's doing the systems thinking? Where's my systems thinking interview? Oh, we don't have any of those, because all the people interviewing to hire people, they're not systems thinkers, because they're thinking this way, of, like, just gimme my next UI iteration or whatever. So, the strategic thinking. On the last podcast we put up, you were like: communication with the customer, don't offload that as a product manager, 'cause that's the main thing you should be doing. Correct. I would say, if that's the main thing you should be doing, the fast follower to that, the very second thing behind that, is the strategic systems thinking aspect. That should be the very next thing on your radar when you're done with all your customer outreach, communication, strategic talking points and whatnot: how does this fit into a larger picture? Especially the higher you get up, the senior product manager, director of product, stuff like that. Sure. The more of that kind of stuff you're gonna do. But if you're just focused on, like, what is the AI telling me? I mean, the myopic view you're getting is so close that I'm real worried that this skill is gonna atrophy, 'cause you're gonna do less and less of it. Yeah. And I agree. And I think not doing these is extremely dangerous. So think about what all of that entails.
We're not gonna break it down to everything, but strategic thinking: you're thinking about competitive analysis. Yeah. You're thinking about compliance with regulatory requirements. You're thinking about pricing strategies, you know, differential pricing based on different segments of the market. Yeah. These sorts of things. If you're not doing that, and you're simply churning out code, you're swinging away at the plate as fast as you can, like a propeller. It doesn't really help you. And actually, I would go further and say this could harm you. Because, yeah, you can have a different product every eight hours or whatever it is, but you're slinging mud at the wall here, you know? You can't dispense with the other stuff that is absolutely critical. The strategic thinking, speaking of, yeah, I agree. The AI explanation sessions, like, when you have your takeaway and AI is an assistant to you, an additional tool in your toolbox, that just becomes part of your demo, to be like, hey, AI helped us with this part. You know what I mean? It's an additional tool that you deploy. I don't see it as, like, well, just offload the whole thing to this. It can't be a replacement. And yeah, you can vibe code up stuff, but if you're gonna say, I don't need UX researchers, or I don't need, you know what I mean, whatever it is that we don't need this week. And again, people might listen to this and be like, well, Brian, nobody's saying you need to get rid of your UX people. Yeah. First of all, I understand what you're saying, and that you've not necessarily seen someone get laid off because AI is replacing their job. That's also a flashpoint online, if you were to go on online forums and stuff like that, or on Reddit or whatever. Like, nobody's directly lost their job because they've been replaced with a chatbot or whatever. That's not necessarily what's happening.
Companies are laying off because it's a good time to lay off, and they're using AI as an excuse. Sure. But it's not really AI. Yeah, they're doing some baseline automation, but they're really just doing this to make themselves more lean. It's a profit motive at the end of it, and that's really what's happening under the hood. But if you were to supplement your team members with AI, to help generate solutions, or to help generate ideas or whatever, it would be helpful in your demos to highlight what AI was assisting with. So at least you can get the perception out there of what you're actually using it for. I think that goes both ways, right? So if you're doing that in your demo, it builds trust with your customer, because they know what you're using. But inwardly, to your team as well, it's spreading awareness within your team of how you use the tool selectively and where it's helping. I think that's a great thing. It builds transparency, it builds trust. I'm all for it. Yeah. I'm gonna stick with the casino metaphor and give this nine outta 10 doggy coins for reaching their destination successfully. The problem is, we're gonna downgrade them to two outta 10 doggy coins, because their navigation skills are useless when the GPS is not working. So they're outta coins. The AI GPS. Yeah, sadly, sadly. And then the traditional teams, of course, are still maintaining that between five and six outta 10, because they're still doing the discovery. They've been trained to do that. That works every time. Talking to people, apparently, is a good, solid practice. I would've never guessed that would be the takeaway of this. Who knew, right? Who knew? Yeah, I know. I would've never guessed. The last point that I wanna make is less of a point that I want to talk about, and more a wrap-up of everything we talked about so far.
The ultimate irony of this build first culture that's emerging is that it's marketed as innovation acceleration. But by sending this stuff to the AI tools, it's not novel problem solving, when what you're really looking for is the big breakthrough. Like, it can get you the iteration of the idea that you fed it, but it can't get you the big breakthrough. You know, and I'm willing to get the pushback that says, that's just, like, your opinion, man. But I feel like the AI tools, if prompted right and leveraged right... that's what the people trying to sell you will say. By the way, I've got an AI tool here in my jacket I'm willing to sell you. They'll say, the token sellers, they'll say: well, you're just not being creative enough with the prompt, right? And, you know, it's a very low barrier to experimentation for anyone inside your business. They don't need to know how to code. They can just pick up the tool and start asking it to do things. And, you know, anybody can be a developer. That's what they'll say. And that's one side of it. I mean, the other side of it is, the actual innovation, the leaps ahead in your business, that requires human insight. It requires talking to people, and that stuff's not going away, no matter how many prototypes you can bust out. Maybe the prototype becomes the catalyst that helps the conversation move along, and that's where we wanna go. When we were planning for this podcast, Marty Cagan wrote a blog, in between when we were planning, that touches on a lot of this kind of nuance. But again, I don't feel this stuff is being projected with nuance. It's very much one side or the other. I think, you know, people out there saying, do or die, jump on the bandwagon or perish. Yeah. Yeah. You know, and that's a shame, because that's not reality.
Well, if I had to cut straight to a takeaway right now, before you even really talked about this whole category, I would say: your takeaway as a product manager, or honestly even a team member, developer, QA, whoever you are on the team, is that you should be reserving a certain part of your time at work to experiment with different solutions. And AI is one particularly different solution. It's like the old, again, my background's in QA, so I can speak intelligently, mm-hmm, about testing frameworks, because I did a lot of my testing frameworks by coding with Selenium in Java and hating life every moment of the way. Now Playwright's a thing, I can write in Python, I can write in whatever, and life is a lot better. But there's other technologies out there. Like, if you take AI outta the mix, I should always have a percentage of my time to probe the market, figure out what the trends are, try new technologies, try different things, you know what I mean? Learn new skills, that kind of stuff. Yeah. AI is the same way. Like, the real innovation is when you're not packed, absolutely packed, every little bit of space completely taken up, with no room for experimentation. The "we just need you to punch out the next widget, kid" thing kills innovation so fast. You're right. Yeah, I understand it's not necessarily an AI thing, but you're adding an AI tool on top of all this to say, well, it makes you faster at doing your job, where you're completely packed to the tippy top, over your head. Yeah. And now maybe you can get a little breathing room. I'm like, well, yeah, maybe. But also, maybe if your organization understood a little bit about how you need a little wiggle room to invest in new ideas, whether those ideas are AI or something else. You should continuously be experimenting, and that's your team members maintaining your products.
I think the companies that are adopting that kind of a modus operandi, and policies and whatnot, you're gonna see those companies overtake companies that simply emphasize speed and just churn out vibe coded products en masse. I think that's gonna happen pretty quickly. Yeah. I mean, the other one, if I'm gonna be on your side for a second, I'll say: motion is not progress. And a lot of people don't understand that. The larger the organization that you are in, I would say, the murkier it is to decode that motion is not progress. The size of the organization is like the size of the container in the chemistry lab when you are learning about Brownian motion. You still have bits flying around everywhere. So yeah, we've used another metaphor now, so I'm gonna stop. I like the casino one better. Here's the uncomfortable truth about the build first culture that I think we talked about in this podcast: it's not about better software development. It's about better vendor revenue streams. They're gamifying development in this pay-to-play token consumption, casino-floor kind of metaphor that we've settled on. Which, again, it's very weird that we settled on this, but it's a very apt one, because it fits so well. Like, is it build first? I mean, they're trying to compare build first versus plan first, to say, oh, when we're building something, you get something from it. Whereas when you plan, you get nothing but a plan, and the plan doesn't survive contact. That's the marketing, that's what they're saying. But what they're really doing is using that message to turn development budgets into subscription revenue, basically. Yes, exactly that. I think that's a bit of an extremist view, to say if you're planning, you get nothing. Yeah, it's just-enough planning, maybe, right? So, yeah. I mean, nuance doesn't, uh, put butts in seats, Om. No, that's very true. It doesn't do that.
So, just like the casino metaphor here, the house always wins. It's the token sellers that are winning here. I'm a big fan of the house on the old Arguing Agile podcast, is what I'm saying there. Well, listen, if you're in an environment where you're simply asked to vibe code all day long, or if you're in one of the environments where you're going with intent, let us know what you think about this podcast down below. And don't forget to like and subscribe. Oh, and keep that resume updated.