The Digital Customer Experience Podcast

Sibling Synergy and Machine Learning in CS with Maarten and Michiel Doornenbal of Churned | Episode 046

April 02, 2024 | Episode 46 | Alex Turkovic, Maarten Doornenbal, Michiel Doornenbal

Maarten & Michiel Doornenbal are not only brothers, but also co-founders of Churned! They join me this week for a fantastic conversation about the platform they've built, AI, scorecards and more! 

Unfortunately, my audio/video didn't record - so you'll hear me narrating the episode - but it's kind of fun to switch formats up here and there.

In the episode, we talk about:

  • Their history as brothers and how they work together as co-founders today
  • History of Churned and what they do
  • Using machine learning models instead of rules-based scorecards to create health scoring and to predict customer churn
  • Part of the role of a CSP is to highlight where data cleanliness issues exist
  • Verifying your churn risk alerts by back-testing the data against historical churned customers
  • Personalization based on user-personas
  • AI is not just Gen AI. The future of AI in CS combines predictive, prescriptive, and generative models to auto-generate content and free up time for CSMs.
  • What the human involvement will look like with these future AI models

Enjoy! I know I sure did!

Maarten's LinkedIn: https://www.linkedin.com/in/maartendoornenbal/
Michiel's LinkedIn: https://www.linkedin.com/in/michiel-doornenbal-98710067/
Churned: https://churned.io

Resources Mentioned:

++++++++++++++++++

Support the Show.

+++++++++++++++++

Like/Subscribe/Review:
If you are getting value from the show, please follow/subscribe so that you don't miss an episode and consider leaving us a review.

Website:
For more information about the show or to get in touch, visit DigitalCustomerSuccess.com.

Buy Alex a Cup of Coffee:
This show runs exclusively on caffeine - and lots of it. If you like what we're doing, consider supporting our habit by buying us a cup of coffee: https://bmc.link/dcsp

Thank you for all of your support!

The Digital Customer Success Podcast is hosted by Alex Turkovic

Transcript:

Speaker 1:

A predictive AI model telling you which customers are at risk, then a prescriptive AI model telling you what is the best action to take given its behavior, and then the generative AI actually does it. So it's connecting the different pieces of AI to have this autonomous AI vehicle in customer success.

Speaker 2:

And once again, welcome to the Digital Customer Success Podcast with me, Alex Turkovic. So glad you could join us here today and every week as I seek out and interview leaders and practitioners who are innovating and building great scaled CS programs. My goal is to share what I've learned and to bring you along with me for the ride so that you get the insights you need to build and evolve your own digital CS program. If you'd like more info, want to get in touch, or want to sign up for the latest updates, go to digitalcustomersuccess.com. For now, let's get started. Greetings and welcome to episode 46 of the Digital Customer Success Podcast. As always, it's great to have you back. One quick note: I'm not sure if regular listeners are aware of this, but on a weekly basis I distribute show notes and some other digital CS mini articles, you could say, as part of a newsletter that goes out on Thursdays. So if you'd like to sign up for that, just go to digitalcustomersuccess.com. On most pages there'll be a signup form where you can sign up to receive those weekly digests. You'll get the show notes for the week's show, as well as a preview of next week's show, and then usually a mini article. Right now I'm doing a series on how digital CS functions can cross-collaborate between departments. So if you're interested, go to the website digitalcustomersuccess.com and sign up for that.

Speaker 2:

Today's episode is a cool one. I've got Maarten and Michiel Doornenbal from Churned.io. They're actually brothers and co-founders, so we do get into that a little bit. As with most things digital, things don't always go right when you take on technical activities or anything having to do with technology, and this is one of those instances. The episode's going to be a little bit different because, luckily, both Maarten and Michiel's video and audio recorded just fine, but mine didn't. In fact, I had no audio or video from the session at all on my end. So it was an interesting exercise going back and editing this episode, because I had very eloquent answers and about, I don't know, 25 minutes of complete silence while they were listening to me. So what I've done is, essentially, I'm going to be interjecting myself into this episode and setting up the questions between their answers, so that you have some context for the conversation and how it went. It's a shame, because we lost some funny moments and funny exchanges.

Speaker 2:

That said, there's a lot of gold in this episode. If you're not familiar with Churned, it's essentially a CSP - a customer success platform - built from the ground up with AI in mind, and so a lot of what they focus on is churn prediction using AI models. It's pretty fascinating actually, and we definitely dig into that a little bit, as well as a little bit of their history together as brothers and then as co-founders of Churned. We definitely get into gen AI, but I think one of the most compelling elements of this conversation was the description of different AI models and how they can come together to be really effective in CS, specifically looking at predictive and then prescriptive and generative models to create a fully automated system. So lots of gold in this one.

Speaker 2:

I hope you enjoy my conversation with Maarten and Michiel Doornenbal of Churned.io, because I sure did. So in this first clip, I just asked for a little bit of background, specifically about them being brothers, their history as brothers, whether they got along well when they were young, and how that translates into being co-founders today and what that's like. So we get a little bit into that.

Speaker 1:

Sure, sure. So I'm the older brother - I'm Maarten, for the listener. I'm actually only one and a half years older than Michiel, and we grew up in a small village approximately 10 minutes from Amsterdam. In a small village you have to deal with a small group of people, and that also meant that Michiel and I were not only brothers at home, but we were also sharing everything.

Speaker 1:

We were in the same football team, we had the same friends, we were going out together, and as an older brother, in the beginning you don't really like that, right? You don't want your little brother to be everywhere you are. And there were also moments when he was mad at me and snitched to my parents, because he saw everything I did. So when we were young, I don't think we expected that we would have a business together - I was not very nice to Michiel when we were very young. At a certain point that changed. We both started studying, went in different directions, we both started living in Amsterdam but in different houses, and I think then things came back together, and eventually we even started the business.

Speaker 3:

To add to that, I think it's good that we are completely different. My brother is responsible for the commercial side - he's more, let's say, the extrovert - where I'm more the data scientist, doing other stuff. And I think that is important as brothers: if you start a business, you don't want to be too much in each other's space, so to say. I cannot tell him how he has to do his commercial side, because I don't have any commercial experience, and he has no clue about data science. So lucky me, he cannot bitch at me about that.

Speaker 1:

Still, sometimes I try to tell him how to build an AI model. But we're different - different in terms of personality and different in skills - and I think that's a good thing for us.

Speaker 2:

So, as you know if you've listened to the show, I like to dig a little bit deeper and not just ask the questions that we prepped for - I like to get a little bit more out of the guest. So I dig in a little bit and ask them about their entrepreneurial history, whether they were entrepreneurial in their youth, and how that has affected their business relationship today.

Speaker 3:

Maarten was definitely more entrepreneurial in his youth, I think, trading in clothing during high school, and I was definitely not busy with that stuff - although I founded the first startup, and Maarten always had the urge to found something. So it was funny that it was the other way around. Then for a while we tested it out, because we thought, okay, in our youth we had some, let's say, challenges - is it going to affect our relationship if we start a business together? So that's where we tested it out a little bit on a one-day-per-week basis, to see if we would, yeah, get in each other's hair. But that turned out to be good, and, yeah, that's how it developed.

Speaker 2:

After this, we spent about a minute talking about how they separate work and their personal life, especially around the holidays and things like that. So just one more quick minute on some personal details, and then we get into the nitty-gritty of digital CS.

Speaker 1:

I think we talk too much business - I think Michiel's girlfriend will agree with that. But no, we like it. It does happen that I've been with him all day and then, when he comes home, I start calling him. And like I said, when we were young we were almost forced to be friends, as we were in the same team, et cetera. But now, of course, we both still live in Amsterdam, which is a big city, but we still go on holidays together with the same friend group, we go to the gym before work. So we're actually very good friends, next to being business partners and brothers.

Speaker 2:

If you're watching this on YouTube, you'll notice they're drinking from pint glasses once in a while, and we did joke a little bit about whether they were drinking beer or not, because it was kind of late in the afternoon - actually, it was evening for them when we recorded this. But yeah, it was tea. It wasn't even spiked tea. Anyway, as regular listeners of the show will know, one of the things that I ask all of my guests is their definition of digital CS, because it does differ from guest to guest, and in this episode you get not one but two definitions of digital CS: one that's kind of current state and one that's forward-looking.

Speaker 1:

Yeah, so of course we listened to the podcast, so we knew this was coming. We discussed this, obviously. So my definition of digital customer success would be using technology to automate and deliver the right information to the right customer, or even the right user, at the right time - and obviously at scale. Why I say this is because it's not difficult to automate and deliver information to a customer or a user, but it is difficult to deliver the right information to the right user at the right time. I think that's the intelligence that is required in digital customer success, and what we claim to bring. But Michiel, I think you have a different one.

Speaker 3:

But, michiel, I think you have a yeah, so of course, it would be boring if I would give the same definition of digital customer success. So I thought how can I make it different? And that brought me actually to something to propose, let's say, a definition which I think we will have in a few years. So not the current digital customer success definition, but one that will be a future one. Of course, there are many experienced customer success leaders have joined this podcast and might be listening, so they will probably argue this and probably write so, but I'm still going to do it. So the future we believe that it will be autonomous, digital, personalized digital customer success. This doesn't mean, of course, that there is no interaction or that there's no one-to-one high touch approach from customer success managers, but we see it more in a way that the customer success manager determines the destiny and the AI will bring you there.

Speaker 1:

Yeah, so that's more the future as we see it. Currently, it's actually about freeing up their time, right, so that humans can focus on things that humans are good at.

Speaker 2:

So I love these definitions. I thought they were spot on, very much in line with previous guests. Also, the forward-looking statement is very much in line with what we've been talking about on this show, which is to say that we're trying to automate as much as possible so that the humans can focus on actually building strong human relationships. They're just thinking about it a few steps further along than we are, and rightly so, given the nature of their business. And that kind of led us into the next question. We try to be platform agnostic on this show - I try not to highlight one over the other - but it's always fun having CS-focused vendors and platforms on the show because (a) they're serving the CS community, (b) they know a lot about CS, and (c) they're working on cool cutting-edge technology. So I did want them to get a little bit into how Churned came to be and what market challenges they're trying to address.

Speaker 1:

So Churned is an AI-driven customer success management platform. Maybe a bit of history: Michiel, my brother, started the company together with his former professor. Michiel did a master's in data science at VU University in Amsterdam, and he wrote his master thesis on whether you can use machine learning to predict customer churn. That thesis was supervised by that professor in data science, and they found out in the thesis: hey, you can predict customer churn very well using machine learning. So they said, okay, shouldn't we do something more serious with this? But, as you can imagine, a professor in data science and a data scientist are not the most commercial people on earth - actually, in our case that's not really true, because they're surprisingly commercial - but that made them ask me: shouldn't we do this with the three of us? So we did, we started. That moment was about four years ago, January 2020. And in the beginning we spent some time developing these machine learning models, and the models worked well.

Speaker 2:

At this point I interject, as I often do, with the fact that hey, wait, this was during COVID, right.

Speaker 1:

Yeah, yeah, it was in the middle of COVID, that's true. It was a great time to develop models - in my case, sales presentations. So we developed these models and then thought, okay, who will benefit most from this solution? We knew, of course, that churn was a problem in SaaS, and we knew that customer success was responsible for the churn topic, so that made us research that market a little bit.

Speaker 1:

So we did some interviews with SaaS leaders, with CS leaders, with CSMs, and found out that the current software solutions used by CS teams - the customer success management platforms - were not yet using AI or machine learning models for the topic that we focus on, the portfolio management part. Their health scores were rule-based, right, which we don't see as a really smart solution - it's doing it based on gut feeling. That made us decide to do it in a different way, and that is the AI way: using predictive machine learning models to predict customer health. Maybe to clarify, we really started on the churn part.

Speaker 1:

Now it's actually more than that. It's a customer success management platform with an AI engine, and that engine contains predictive models that use customer data to predict what's going to happen - it can be churn or upsell; we predict customer behavior - and the thing that we started with was churn. So that is how we started the company, and we help customer success teams target the right people at the right time with the right information to prevent churn and boost NRR. That's it.

Speaker 2:

So, given that Churned is natively machine learning based, I wanted to make sure that we had this conversation about rules-based scorecards versus machine learning-based scorecards. With rules, either ourselves or an admin goes in there and configures all these rules to build a scorecard that tries to be as predictive as possible, and that continually proves to be exceedingly difficult, whereas machine learning has gotten to the point now where it is entirely possible to accurately predict churn with these scorecards. And so we get a little bit into the details of, you know, how the sausage is made, so to speak.

Speaker 3:

Yeah, indeed. I think if you have multiple technical people in a company, you can define health scores pretty well in a rule-based way. But I think the beauty of what we do is that machine learning can find a lot of nonlinear relationships, so to say.

Speaker 3:

So it's not always the case that higher product usage is better for adoption. What we often see is that right before the moment of churn, usage can actually go up a lot. That is unexpected behavior. So we basically predict what the expected healthy behavior of a customer is, and if it falls outside of those boundaries, then it's actually an alert. And it is, of course, a combination of all kinds of variables that a machine learning model can pick up. Also, I think what is interesting is that it should adapt very well to changes. If you make changes to your pricing, or if there are economic changes, a machine learning model should pick those up, whereas the rules you have in a rule-based platform need to change - that requires a lot of maintenance. A machine learning model should be able to pick up those changes and adapt the risk scores accordingly. And maybe Maarten - in rookie words, like the salesperson would - can explain it.
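
A rough way to picture "expected healthy behavior with an alert when a customer falls outside it" is a deviation check against each account's own history. This is only a sketch of the concept Michiel describes, not how Churned implements it; the file, column names, and threshold are illustrative assumptions.

    import pandas as pd

    # Weekly usage per account; columns: account_id, week, events (illustrative).
    usage = pd.read_csv("weekly_usage.csv")

    def flag_anomaly(group, z_threshold=2.5):
        baseline = group["events"].iloc[:-1]   # history before the latest week
        latest = group["events"].iloc[-1]
        z = (latest - baseline.mean()) / (baseline.std() + 1e-9)
        # Flag both sharp drops and sudden spikes (the pre-churn spike Michiel mentions).
        return pd.Series({"latest_z": z, "alert": abs(z) > z_threshold})

    alerts = usage.sort_values("week").groupby("account_id").apply(flag_anomaly)
    print(alerts[alerts["alert"]])

In practice a multivariate model would combine many such signals; this single-variable version just shows the "outside the expected boundary" idea.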

Speaker 1:

So it works just like a rule-based system in that you integrate all your customer data, but you don't have to set up the rules yourself. If the thing that you want to predict is churn, the model will look at the customers that churned in the past, find the patterns in the behavior and characteristics of those customers, and use that to predict it for the current customers. So it trains on historic data and predicts what current customers will do, without human help.
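
To make Maarten's "train on the customers that churned in the past, then score the current ones" concrete, here is a minimal sketch. The file names, feature columns, and model choice are illustrative assumptions, not Churned's actual implementation.

    import pandas as pd
    from sklearn.ensemble import GradientBoostingClassifier

    # Accounts with a known outcome (1 = churned, 0 = retained); names are illustrative.
    history = pd.read_csv("historical_accounts.csv")
    features = ["logins_per_week", "seats_used", "open_tickets", "days_since_last_login"]

    model = GradientBoostingClassifier()
    model.fit(history[features], history["churned"])

    # Score the current book of business; a higher probability means higher churn risk.
    current = pd.read_csv("current_accounts.csv")
    current["churn_risk"] = model.predict_proba(current[features])[:, 1]
    print(current.sort_values("churn_risk", ascending=False).head(10))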

Speaker 2:

Next we get into some topics related to data, which I know is a sore spot for a lot of folks in digital, mainly because it takes decent data to do anything of value in digital. One of the questions I had for them was around how data variability affects scoring with machine learning and what those outcomes really look like. One of the examples I used in this conversation was a scenario that I'm well familiar with: hey look, we've got both an on-prem product and a SaaS product, we get different telemetry from each one, and so we've had to design different scorecards to account for that. So I wanted to get a sense for how data variability affects scoring, and then we get into some other data things as well.

Speaker 3:

No, I think that is exactly the beauty. If there is no usage, but we also see a variable that states that it's an on-prem customer, that relationship is basically picked up. The model will recognize: hey, for on-prem customers, no product usage doesn't mean anything, because those factors cancel each other out, basically. That is how it will deal with those kinds of factors - if you give that information to the model, of course.

Speaker 2:

So that naturally led me down the path of: okay, what kind of data do we need here to get this stuff working correctly? And I don't think I quite asked the question correctly at first - I think I asked how long it takes to train the platform - but we got around to the fact that the actual question is how much data, and how much historical data, is required to properly train a scorecard so that it starts to show some accuracy in predicting churn.

Speaker 3:

So it depends a bit. It's not that we are building a machine learning model that learns over time - we can just look at the history and it retrains immediately. So it's not that it gets better every week until a certain moment; no, we can immediately have a good model. There's not really a threshold or a clear minimum amount of data that we need. What we do internally is that we have different models. If you have big data sets - a high number of customers, a lot of churn - then you have very advanced, complex machine learning models, like a neural network, that work very well. We also have customers, of course, that are very enterprise focused - a smaller number of customers, limited data - and then we just run different models on that. So there's not, let's say, a minimal requirement. It just requires different settings and different models.

Speaker 1:

And that's done in the platform, basically. We run the data through it, it tests multiple models, let's say, and it will pick the winner. But maybe also on the "how much data do you need" question - of course, a question we also get often in commercial discussions - maybe some examples. We also started in B2C. We started with, for example, an NGO. They had 150,000 customers, but they only have the name and the donation amount. You think you have a lot of data, but it's actually not that much. We have super-enterprise customers that have only 100 customers doing 100 million ARR, but have data on every click that a user does. So it's not really about how many companies or clients or users - it's more about the depth of the data that you have.
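
Maarten's "run the data through multiple models and pick the winner" can be pictured as a small cross-validated bake-off. A sketch under assumed data and metrics, not the actual engine; the synthetic dataset simply stands in for a prepared churn table.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Stand-in for a prepared churn dataset (imbalanced, as churn usually is).
    X, y = make_classification(n_samples=2000, n_features=12, weights=[0.9], random_state=0)

    candidates = {
        "logistic_regression": LogisticRegression(max_iter=1000),
        "random_forest": RandomForestClassifier(n_estimators=300, random_state=0),
        "gradient_boosting": GradientBoostingClassifier(random_state=0),
    }

    # AUC copes with class imbalance better than plain accuracy.
    scores = {name: cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
              for name, model in candidates.items()}
    print(scores)
    print("winner:", max(scores, key=scores.get))

A simpler model often wins on small enterprise datasets, which matches the point about different models for different data depths.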

Speaker 2:

And while we're on the topic of data, we then get into a conversation about trusting your data, because a lot of us in CS really rely on our data and insights to provide value - not just to our customers, but to our internal customers and stakeholders - by delivering really valuable insights on the health of customers. One of the interesting points that comes out of this next section is one of the side purposes of a CSP, which is using it to highlight where you do have gaps in your data.

Speaker 1:

Of course, we also get the question: how good are your predictions? So maybe to start with trusting the data - that is an objection we had to deal with, and how we dealt with it is through backtesting. And for people implementing health scores themselves, rather than with Churned, it is also a way to verify: can you actually predict what is going to happen? For example, to predict which customers are going to churn, we exclude the last year of data, predict which customers churned in 2024, and then actually measure which customers churned. So it is a way to verify how well you are able to predict. I think if you can explain that to the business, that's already a start, because if you show that you can predict well, there will be more trust and the health scores will be used.

Speaker 1:

I think that is the thing to start with, but maybe it's a bit technical for people listening to the podcast who don't know how to backtest themselves. How we deal with this internally is that we also report on it in the tool itself. So we predict which customers are at risk. Then, of course, it's up to the customer success manager or the digital motion around it to do something with those predictions and health scores - sending emails based on risk alerts and low health scores, for example. And then how we report on it is: did the companies actually contact the customers that were at risk? If not, then there's already proof that you should actually do something with it. But it is still a difficult thing, because even if you have 90% right and one is off, people will say: yeah, but see, it doesn't work. So it is also a mindset thing.
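
The back-test Maarten describes - hold out the most recent period, predict, then compare against who actually churned - could look roughly like this. The file, columns, year, and metric are assumptions for illustration only.

    import pandas as pd
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import classification_report, roc_auc_score

    # One row per account per yearly snapshot; names are illustrative.
    accounts = pd.read_csv("accounts_with_history.csv")
    features = ["logins_per_week", "seats_used", "open_tickets", "days_since_last_login"]

    # Train only on what was known before 2024, then "predict" 2024 and grade the result.
    train = accounts[accounts["snapshot_year"] < 2024]
    test = accounts[accounts["snapshot_year"] == 2024]

    model = GradientBoostingClassifier().fit(train[features], train["churned"])
    risk = model.predict_proba(test[features])[:, 1]

    print("AUC on the held-out year:", roc_auc_score(test["churned"], risk))
    print(classification_report(test["churned"], risk > 0.5))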

Speaker 3:

I think it's, indeed, a challenging one.

Speaker 3:

I think, then, what we deliver - at least what a customer success platform delivers - might actually be more on the side of: hey, we help you clean up your data, point you in the right direction, and prepare you for the journey that you can embark on. It shows why your ARR in the platform is completely incorrect - why is that? You basically highlight all the flaws that you have in your data. And also, if you run a machine learning model on the data, it will show which features are important, and that will also guide them. So it is important to store historical product usage data at a feature-type level, or it is very important to see if a customer has been upsold or has made certain plan changes historically - we should store that and not just continuously overwrite all the data. So I think, in that sense, you're taking the customer by the hand and showing them: hey, this is how you should build a good, trustworthy data set. But of course there are situations where customers just have crap data.

Speaker 2:

Yeah, that doesn't happen, of course. As mentioned earlier, I love having CS-facing vendors and platforms on the show, because they see some really cool stuff that their customers are doing in customer success. So I asked if there are some examples of digital CS that they've seen within their customer base, and they came back with this.

Speaker 1:

A nice example is one of our Dutch SaaS customers. So we just released a new feature, and we were very happy that the customer had actually tested it - and especially that they tested it with some commitment, which they did. What we've developed is this way of modeling the users. What we see in B2B is that the playbooks and next best actions are often quite generic, really at a company level, whereas what you see in B2C and e-commerce is very much at a person level, at a user level - super personalized and tailored. And that is what we just released in this model too. It's similar to an RFM model - I'm not sure if you're familiar with that; it's a model used in e-commerce - and what we've done is the following.

Speaker 1:

So, let's say, as a company you might have 10 users, but those 10 users might engage completely differently with your product. One uses it every day, one once every month, one is a CFO and isn't expected to use it every day. Every user has a different expected behavior and should actually be treated differently, and that is what this model does. It segments your users from champions to hibernating customers, new users, et cetera, and what this customer of ours did is connect all kinds of playbooks at a user level. So, for example, a new user should be in a welcome journey, while a champion could get a G2 request or even an upsell opportunity, and they had all kinds of playbooks for the different user segments. I think that works really well. I don't know if you have anything to add to this part?

Speaker 3:

No. So, yeah, the interesting thing to see there is what they did with it: they also looked at which types of product features a certain user role is expected to use. So, indeed, you compare the features that a CEO is expected to use with what they are actually using, and based on that behavior you can bring very personalized messaging. That's what I think they did very well, both in-app and in their automated outreach.

Speaker 1:

Exactly. It's almost like, when you have one customer with 10 users, you actually have 10 different customers and 10 different health scores, so 10 different playbooks.
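
The RFM-style user model they describe - segmenting individual users into champions, hibernating, new, and so on, then attaching a playbook per segment - can be sketched with simple recency/frequency scoring. Segment names, cut-offs, and playbook labels here are illustrative assumptions, not Churned's model.

    import pandas as pd

    # Per-user activity; columns: user_id, days_since_last_login, sessions_last_30d (illustrative).
    users = pd.read_csv("user_activity.csv")

    def segment(row):
        if row["days_since_last_login"] <= 7 and row["sessions_last_30d"] >= 12:
            return "champion"       # e.g. G2 review request or upsell play
        if row["sessions_last_30d"] == 0:
            return "hibernating"    # e.g. win-back / re-engagement journey
        if row["days_since_last_login"] <= 14:
            return "active"         # e.g. adoption nudges
        return "casual"             # e.g. feature education

    users["segment"] = users.apply(segment, axis=1)

    playbooks = {"champion": "advocacy_or_upsell", "active": "adoption_nudges",
                 "casual": "feature_education", "hibernating": "win_back_journey"}
    users["playbook"] = users["segment"].map(playbooks)
    print(users["segment"].value_counts())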

Speaker 2:

So this is where the rubber meets the road with these two, because they've built this platform from the ground up using machine learning. So I ask about the future of machine learning and AI in CS specifically, and also how that pertains to us humans, us mere mortals, and our interaction with these machine learning models.

Speaker 1:

One thing that I think is good to realize is that AI is not one thing, right? It's more like a family of technologies and methods that do something similar but different. Everybody knows gen AI as if that is AI, but it's only one of the forms of AI - gen AI is more the ChatGPT-like part. But you have predictive AI, which is using a machine learning model, like Churned does, to predict what a customer is going to do. You have prescriptive AI, which is telling you what to do: what is the next best action? So now we know this customer is at risk - what is the next best action to take?

Speaker 1:

You have NLP, natural language processing. So there are more forms of AI; it's not only the gen AI field. But I think our belief - Michiel will confirm this in a second - is that the future of AI is actually combining those different forms. So a predictive AI model telling you which customers are at risk, then a prescriptive AI model telling you what is the best action to take given its behavior, and then the generative AI actually does it. So it's connecting the different pieces of AI to have this autonomous AI vehicle in customer success.
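
The chain Maarten outlines - predictive tells you who is at risk, prescriptive tells you what to do, generative does it - could be wired together roughly like this glue code. Every function below is a stub standing in for a real model or service; none of the names are a real API.

    def predict_risk(account):
        # Predictive AI: a trained churn model would score the account here.
        return 0.83  # stubbed risk score

    def next_best_action(account, risk):
        # Prescriptive AI: pick the action that historically worked best for similar behavior.
        return "personal_check_in_email" if risk > 0.7 else "in_app_tip"

    def generate_content(account, action):
        # Generative AI: an LLM would draft the message here; stubbed for illustration.
        return f"Draft {action.replace('_', ' ')} for {account['name']}, referencing their recent drop in usage."

    account = {"name": "Acme BV"}
    risk = predict_risk(account)
    action = next_best_action(account, risk)
    draft = generate_content(account, action)
    print(risk, action, draft, sep="\n")  # a CSM still reviews the draft before it goes out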

Speaker 3:

Maarten, you said it very well. I think there's no need to go into the depths of where AI will go. This is indeed what is crucial for the future: combining all the different components and bringing it into one piece, basically.

Speaker 1:

Maybe many more different AI solutions will appear. But what a prescriptive model, or a machine learning model, is very good at is using historic data to predict what should happen. So if you have never logged a call, a model cannot predict a call as the next best action. There has to be a human saying: okay, we should do calls as a playbook - if a customer doesn't log in, we call them. And then at a certain point the AI can measure: okay, what did we see happening after doing that call? So it still needs the human, let's say, to create the playbooks and come up with new actions, so that the AI model can look at all the things that you've done and see: okay, for this customer that showed this specific behavior, this was actually the next best action.

Speaker 1:

So after performing this call, we've seen that feature X had a usage increase of Y. It's actually learning from the actions that you've taken, but it cannot create, let's say, new ideas itself. A human has to create the plan, and then it can verify it.
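
Measuring "what happened to feature X after we did the call" - the learning loop Maarten describes, where humans invent the playbook and the model grades it - might look like a simple before/after comparison. The files, columns, and 14-day window are illustrative assumptions.

    import pandas as pd

    # Logged CS actions (e.g. calls) and daily feature usage; schemas are illustrative.
    actions = pd.read_csv("cs_actions.csv", parse_dates=["action_date"])  # account_id, action_type, action_date
    usage = pd.read_csv("feature_usage.csv", parse_dates=["date"])        # account_id, date, feature_x_events

    def usage_lift(row, window_days=14):
        acct = usage[usage["account_id"] == row["account_id"]]
        window = pd.Timedelta(days=window_days)
        before = acct[(acct["date"] >= row["action_date"] - window) &
                      (acct["date"] < row["action_date"])]["feature_x_events"].mean()
        after = acct[(acct["date"] >= row["action_date"]) &
                     (acct["date"] < row["action_date"] + window)]["feature_x_events"].mean()
        return after - before

    actions["feature_x_lift"] = actions.apply(usage_lift, axis=1)
    # Which plays actually move usage, on average?
    print(actions.groupby("action_type")["feature_x_lift"].mean().sort_values(ascending=False))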

Speaker 2:

So I want to spend a little bit of time unpacking that. The first part is the fact that we're looking at this kind of flow of different machine learning models to get what we need in terms of customer-facing or internal-facing AI assistance: you've got the predictive element that's really looking for those things that are about to happen in the account or with the user; you've got prescriptive pieces that are telling you - or potentially telling a gen AI model - what to do; and then you have the execution of that, which could be human, could be gen AI. It's a very interesting progression and connection of different methods of machine learning. So I think that is fascinating and something that I've been digging into and trying to wrap my head around and understand a little bit more. And the folks who can wrap their minds around these sorts of concepts are the ones that are going to have a place within CS and within the workplace going forward. Because, look, this stuff isn't going anywhere, and the more that we can really understand about how these tools are going to help us and how they're here to augment what it is we do as humans, the better.

Speaker 2:

I think the other point here is that, at the end of the day, you know as a human how you want your customer experience to be. There isn't an AI model that's going to design it for you the way you want it. It may with some prompting, but again, the prompting comes from you, the design comes from you, and the inspiration comes from you - and that's the human element of this whole thing. So I think about these generative AI models and these kinds of things that we've been talking about, things that, quite frankly, a lot of people are scared about because they're going to come in and take all our jobs.

Speaker 2:

If you find yourself doing highly repetitive work, doing something that a machine could do, yeah, it's a possibility. But if you are a highly creative individual, you're used to using technology, you like adopting new tools and things like that, and, quite frankly, if you're someone who uses your skills as a human to help a business thrive or to create something that didn't exist before - those are the kinds of things that are primed for these types of AI assistants to come in and help with. So, I don't know, that's my two cents. As we start to wrap up this conversation, I ask, as I do of everybody, for their shout-outs and resources, and so this is a stitched-together montage, if you will, of some shout-outs that they'd like to give, as well as some podcasts, resources, and books that they're paying attention to.

Speaker 1:

Sure. So first of all, shout out to this podcast - it's actually a podcast that I do listen to. But also, as a SaaS founder, I like to listen to other people in the same situation telling about their problems and how they're dealing with them. And how this relates to CS is that I listened to - what's it called - the Big Exit Show. I listened to that yesterday: Robin van Lieshout telling how he exited inSided to Gainsight, so he sold his company to a customer success platform. I would recommend listening to that - shout out to Robin van Lieshout, great story. And just follow some communities. So Mick Weyers with Customer Success Snack - I don't know if you've ever heard of him, but he's doing very well. He has Connect. There are a couple of communities that I follow and events that I participate in now and then.

Speaker 3:

Yeah, for me, I think I just listen a little bit to random stuff, more on the technical side. So less, let's say, customer-success-related stuff and more data science - like what models can we use for customer success. And I think an interesting book to read is Outliers. It's not really about machine learning models, but it tells an interesting story that connects data points to events, so you can apply it a little bit to your own story. But yeah, I think, in general, the Impact Weekly and Scaling Up podcasts are also very interesting ones.

Speaker 2:

So there you have it. That was my conversation with Maarten and Michiel from Churned. A lot of good stuff in this episode - I really enjoyed it. It was a pain in the butt to edit, but the format is kind of cool; I just wish it wasn't so labor intensive. Churned puts on quite a few events and webinars and things like that, so keep an eye on the website, which will be linked in the show notes, and go have a look around there to see what upcoming events they have at churned.io.

Speaker 2:

Again, if you want to sign up for the newsletter that I mentioned at the beginning of the episode, go to digitalcustomersuccess.com. And yeah, I hope you enjoyed this episode, because I sure did. And here comes the outro. Thank you for joining me for this episode of the Digital Customer Success Podcast. If you like what we're doing, consider leaving us a review on your podcast platform of choice. It really helps us to grow and to provide value to a broader audience. You can view the Digital Customer Success definition word map and get more details about the show at digitalcustomersuccess.com. My name is Alex Turkovic. Thanks again for joining, and we'll see you next time.

Chapter Markers:

AI in Customer Success
The Future of Customer Success Technology
Predictive Customer Churn With Machine Learning
Machine Learning and Customer Success
Future of AI in Customer Success
Digital Customer Success Podcast Sign-Up