
Revenue Roadmap
revenue strategies for family law firms
Learn from the experts behind the growth of sterlinglawyers.com. Anthony Karls, president of Rocket Clicks and co-founder of Sterling Lawyers, and Tyler Dolph, CEO of Rocket Clicks (www.rocketclicks.com), interview the experts in all the areas that drive revenue and increase profits for family law firms.
Get technical knowledge and learn from the experience of those who paid the price to learn what it takes to grow from an idea to an exclusively family law firm with 30+ attorneys.
Testing vs. Optimization: Strategies for Paid Media Success
Join Anthony Karls and James Patterson from Rocket Clicks as they dive into the differences between testing and optimization in paid media. Learn how local entrepreneurs can balance large-scale tests with small optimizations to drive growth and improve marketing performance.
00:00 Introduction to Revenue Roadmap
00:23 Getting to Know James Patterson
02:02 Understanding Testing vs Optimization
03:45 Optimization Examples and Strategies
05:45 Maturity in Marketing Programs
14:19 Risks and Rewards of Testing
17:04 Conclusion and Final Thoughts
Check out our Blog!
https://rocketclicks.com/client-education/testing-vs-optimization-paid-media-strategy/
#digitalmarketing #ppc #businessgrowth
All right, here we go. This is Revenue Roadmap, where we talk about sales and marketing for local entrepreneurs. My name is Anthony Karls, president of Rocket Clicks, and I am with Mr. James Patterson again. So thanks for joining me again.
James Patterson:Yeah.
Anthony Karls:Paid media topic today: we're going to talk about testing versus optimization. So, James, before we get into that, hit me with something interesting about what you do outside of work.
James Patterson:Outside of work, I'm a pretty avid runner. Back in the COVID time period, I think a lot of people were looking for new ways to spend the time, and I really found myself missing the gym. So I wanted to find a way to stay fit and be active, started running outdoors, and it kind of turned into a passion of mine.
Anthony Karls:Do you do marathons, half marathons, or do you just run for fun?
James Patterson:Mostly for fun right now, but I have aspirations to do something more serious one day. I've got an Ironman in my ten-year picture, I think, as something I want to get to, but there are definitely a lot of milestones leading up to there. So I'll probably do a marathon in the next year or two and then go from there.
Anthony Karls:Okay. Time for honesty. When you and Lisa run, who wins the belt most often?
James Patterson:I'll give Lisa credit on the short distances. She's very speedy in those quicker ones. I definitely have a little bit more of the endurance, so sometimes on the longer runs I'll beat her out, but she definitely has my number when it comes to the shorter distances, for sure.
Anthony Karls:At least you can claim one of them. When I do anything with Chelsea, if it's over like four or five hundred meters? Not a chance. She swam in college. She just kicks my butt the whole time. Embarrassment at the CrossFit gym is for me.
James Patterson:Nice
Anthony Karls:All right, so let's talk about what we're covering here. What are we talking about when we say testing versus optimization? Tell us generally what we mean by that.
James Patterson:Yeah. So the biggest distinction between testing and optimization, since both terms get thrown around a lot in our industry, is really understanding the impact and the degree of what you're evaluating. In both testing and optimization, you're effectively trying to understand whether some change is performing better or worse than the original condition of whatever you're looking at. So when we weigh the two different elements between testing and optimization, we think about...
Anthony Karls:Yeah. Let's draw out a specific use case so we can create some clarity on the difference here.
James Patterson:Yeah. So for an optimization, an example might be something that's definitely a little bit lower in terms of degree of change. If you think about a search campaign, maybe you're tweaking one headline in a search ad and testing it against what you've had in market for a previous period of time. Or maybe something on the landing page: you're changing just the color of a button. So it's these more minor variances from the original version. They're still meaningful things to look at, and over time they can help you build up momentum. These are more minimal-impact but still beneficial learnings that, done on a consistent basis, ultimately get you to a bigger improvement long term.
Anthony Karls:Yeah. So in our last podcast, we talked about buying data. Is this an example of what we mean by that? How do testing and optimization play in? Talk a little bit more about that.
James Patterson:Yep, exactly. So this falls right into what we talked about in our last podcast, buying data. We're putting part of our dollars, our marketing spend, toward understanding whether we can get some increase in whatever performance metric we're looking at. That's why it's also important, between testing and optimization, to really understand what metrics you want to look at. We've talked a lot on the podcast about evaluating the waterfall and things like that, but generally speaking, especially on the optimization side, since those are smaller tests, you're going to be a little bit more focused on the platform metrics, or things like Microsoft Clarity if it's a landing page test. That's ultimately what builds you up, and then hopefully over time, as you're reviewing things against more impactful business metrics like your waterfall, you're able to draw a connection between the improvements you've made and the numbers improving in your business data as well.
Anthony Karls:Got it. So the concept here is: tests are large changes to a campaign, or potentially an ad or a landing page. Optimizations are small tweaks, like adjusting a headline a little bit or making a small change on the landing page, like the color of a button. So when you think about how to do this in the paid media landscape, what percentage of the time are you running tests versus running optimizations? And how much should we think about allocation in terms of percent of traffic or percent of budget? What's a good way to think about this so we're not shooting ourselves in the foot and doing unnecessary damage to ourselves?
James Patterson:Yeah, so basically the maturity of your marketing program is going to influence whether you're on the optimization side of things or the testing side more frequently. A more mature marketing program, or a business that's been in market for a while, is hitting its revenue goals pretty consistently, and has a proven process where we can look back at the financials and see that the marketing strategies are working to achieve the business goals, is probably going to fall more into the optimization bucket. It's basically not rocking the boat that's moving smoothly across the ocean toward the goals we want to achieve. Whereas a business that's a little bit newer, and maybe hasn't invested as many resources and as much time into looking at the whole marketing strategy, is definitely going to be more in the testing bucket, truly trying to understand these more significant changes. Maybe we've never used a landing page before; we've just been driving traffic to the regular old website. Does driving traffic to a landing page now, for the first time ever, drive a significant improvement in the results we're tracking? That's the way to think about it. Both mature businesses and those in that startup-and-growing phase are going to do a combination of optimization and testing at different times. But the greater the maturity of your business, the fewer of these massive tests you're probably going to be doing on a recurring basis; it's going to be more of those slight tweaks.
Anthony Karls:When you say maturity, is that more about achieving industry benchmarks for top performance? Or are you talking about "I've been in business for 10 years, so I'm a mature business because we've been running for a while"? Talk a little bit about that, because it sounds like it's the former, and it may be situational: say we've only run on Google but haven't run on Facebook, so are we testing on Facebook and optimizing on Google, or doing one versus the other?
James Patterson:Yeah, good distinction for sure. Just because you've been a business for 10 years doesn't necessarily mean you're at a point where you'll be mostly in this optimization phase. It really comes down to, like we talked about in our earlier podcasts, building the waterfall and understanding how our marketing program influences our business metrics and KPIs. It's really looking at it from the perspective of how long we've had these platforms active. If we've had them active for a number of years, we're generally going to be more in the optimization category, versus, to your example, Tony, if we just started our Facebook ads account last week. That's more what I'm talking about on the maturity front: just because you've been in business for a while, or have been doing different degrees of marketing, doesn't necessarily mean you've got it all figured out. Generally speaking, you get to the phase where you're mostly doing optimization when your vendor, or your in-house team, whoever is overseeing your marketing strategy, is at the point where the conversations are about monitoring: we're evaluating the performance on a particular channel and giving good feedback that we're hitting the benchmarks we've set, and all those types of things. That's generally a good indicator that you're in this more mature period, where you're looking for small wins that continue to move the needle just a little bit further up.
Anthony Karls:So if a business that's been in a mature place introduces a new offer on a channel, how would we think about that?
James Patterson:Yeah. If a business introduces a new offer, we would definitely consider that to be in the testing bucket. That's a pretty large shift in potential value to your customer, and obviously a key component of your marketing strategy and of driving new business. You'd want to look at it knowing that the difference between optimization and testing is really the length of time over which you're evaluating it, and the amount of resources and effort going into the test: making sure you've set down your goals and how you're going to measure them, then ultimately closing the loop and determining the result from there. The best form of this is: you go through a larger test, you set your goals, and over a period of time, usually at least four weeks if not longer, you determine whether the test was successful. A lot of times those larger tests will then inspire other, smaller optimization tests. We know this offer really works, but have we tried the offer in our email program? Have we tried the offer in social? So maybe it's just tweaking some of the strategy there to test new messaging around that offer on different platforms.
Anthony Karls:Yeah. One of the things I've used in the past is understanding the industry benchmarks for a related offer in a similar industry, and how far off my metrics are. If I'm achieving a 20 percent call rate on a landing page, I'm probably in an optimization place, for law firms specifically. If I'm at like 5 percent, I probably should do more tests on how that page is laid out, how it's designed, what my messaging is, and all of that, because measured against industry-excellent standards, there's quite a big delta, and opportunity that's being missed there. So that's another way to think about it. Again, we talked about a new channel and a new offer we're going to take to market. What are some other examples of when we would want to tweak this?
James Patterson:Yeah. Just to give some examples of the difference between the two: one of the law firms we work with has really built up their marketing program over time, and their paid search strategy is in a really great spot. Quite frankly, most of the time we're having conversations about whether we should leave it on or not, because lead flow is really great, and usually it's because of something in the sales process they're working out. From a channel perspective, we're driving really great lead flow and we're very happy with the program, so this is really a channel we're looking at for optimization. One of the things the teams are doing right now is looking for slight tweaks in the ad creative, as opportunities to see if we can drive a little bit more click-through rate and ultimately bring down CPCs. That's what I'm talking about: we always want to focus on the waterfall reporting to evaluate marketing holistically, but when we're talking about these smaller optimizations, you're generally going to pick out some of the more upper-funnel metrics, like click-through rate, however you want to go through it. Whereas a different law firm we're working with is basically net new to a lot of these platforms and hasn't looked at anything. We're actually working with them on the example I mentioned previously: they've been driving traffic right to the site all along, and honestly, from their perspective, they feel like their paid search program is going okay, and they're looking to us to go in and find ways to improve it. It's something they're really looking forward to.
So we've gone through a whole conversation about what we want the landing page to look like and which elements on it play into our strategy to ultimately influence conversion rate: getting folks to take their concerns or questions and reach out to the intake team, get an eval and consultation, and move through the funnel. This is a much larger program for them, going from just sending traffic to the regular website to building out a whole new place their customers potentially see as their first interaction with the brand. It takes a lot more work, from thinking through hero images to the structure of the page, even button colors, obviously matching those more to the brand look. Ultimately, those are two examples, one law firm that's net new to its marketing program and one that's high-flying, and that's the best illustration of the difference between the two types of tests that may exist.
Anthony Karls:Awesome. So, big picture: when we're in paid media, there are always opportunities to learn, and one of the ways we can buy data is through intentionally testing or intentionally optimizing and making sure we're balanced there. I guess one more question before we wrap up: what's the risk of doing tests? Because they don't always win.
James Patterson:Yeah, so it goes back to buying data, right? The way we always like to frame tests: it always sucks when you have this great idea, you think this ad creative is going to absolutely smash, you have some data to maybe support why you want to test it, yada yada, and sometimes it doesn't work. The most important thing to fall back on is that buying-data concept. By learning from it, even though it didn't go the way we wanted, we can now be more confident in our current strategy and look at what's next. If we're doing our jobs correctly, we're taking note of the test and why we feel it failed and doing a wrap-up on it, and that can help you take what isn't always the most fun thing in the world, telling a client this didn't work out the way we initially expected, into a more positive conversation moving forward. One of the cons with testing is that it's really crucial to go into a test and identify up front what the timeline and the goal are going to be. That's where these things often go wrong: sometimes you'll be having interactions with clients or vendors and they'll say, one week in, we're smashing it, looking at all these different things, and then by the end maybe it's not looking that way anymore. It's good to keep a pulse on things, but it's also important that, going into it, we agreed the end is when we're going to evaluate it.
So even though the start was strong, we have to stay true to the fact that we said we'd make our final decision when we've compiled all the data over the course of the next four weeks. That's really the con: a lot of discipline needs to be ingrained in your team when you're going through this, to actually do what you said you'd do. Because if we're being honest, all humans want their cool ideas to work out, and that's just not always the case. That's the biggest thing. And not testing too much, too. On the flip side, I think everybody's worked with someone who wants to test nine different things at once, or has one test going and an element of that test going in a different area, and that can effectively ruin the results because now you've blended different things together. So again, discipline is probably one of the most important concepts in testing, one that doesn't always come across when you're going through the wrap-up phase; you really need it to make sure your testing program is successful.
Anthony Karls:Awesome. So to wrap up: when we're thinking about buying data in the marketplace, obviously we can buy data to understand what's working and what's not. Drilling down deeper, some of the methodologies we use are tests and optimizations. Tests are big changes. When we do those, we're typically isolating them, so we're not throwing everything at a test; we're running a split test and splitting the traffic so we know we're not putting the business at risk. Then there are optimizations, which are smaller changes; again, we're going to run a split test on those and look at our results over time. We want to be disciplined, and we want to be accurate about the data and how we feel about it. There have been plenty of times in my career, and I'm sure in yours as well, where we don't like the results of the test: the test won, but we hate the design or whatever it looks like, and it's like, why do users like this? I don't know. It doesn't matter. Data wins, not my feelings. So anything else to add before we wrap up here?
James Patterson:No, I think it was a great discussion. My closing comment is really just that when you're going into testing, make sure you go through the process of identifying how you want to test and for how long. Keep it clean: be disciplined, fight the urge to test too many things or have something overlap. Really be intentional with how you set it up, and expect that a lot of times your tests may not win. If you're doing the right things and wrapping up correctly, it should ultimately lead you to the next test in line, where, fingers crossed, maybe that's when your golden idea comes across and ends up being a big winner for you.
Anthony Karls:Awesome. Well, thanks, James. Appreciate it, sir.
James Patterson:Yep. Thanks.
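The discipline James describes, agreeing on the metric and the evaluation window up front and only calling the test once the full window has elapsed, can be sketched as a basic significance check on a split test. This is a minimal illustration, not anything from the episode: the function name and conversion numbers are hypothetical, and the hard-coded 1.96 cutoff assumes a standard two-sided test at the 5 percent significance level.

```python
import math

def split_test_winner(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from control A's?

    conv_a, conv_b: conversions (e.g. calls) for control and variant
    n_a, n_b: visitors sent to each arm of the split test
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # 1.96 is the two-sided critical value at the 5% significance level
    return {"rate_a": p_a, "rate_b": p_b, "z": z,
            "significant": abs(z) > 1.96}

# Only evaluated once the pre-agreed window (e.g. four weeks) has elapsed,
# not on a strong first week. Hypothetical numbers: 5% vs 7.5% call rate.
result = split_test_winner(conv_a=80, n_a=1600, conv_b=120, n_b=1600)
```

The point of the pre-commitment is visible in the math: with only a few days of traffic, `n_a` and `n_b` are small, the standard error is large, and an early "we're smashing it" z-score rarely clears the cutoff once the full window's data is in.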