What's Up with Tech?

Revolutionizing Customer Service: Teneo's Cutting-Edge AI Solutions

Evan Kirstel

Interested in being a guest? Email us at admin@evankirstel.com

Discover how conversational AI is revolutionizing customer service with industry leader Per Ottosson, CEO of Teneo. Say goodbye to frustrating IVR systems and hello to advanced AI solutions that truly understand customer intent. You'll learn how Teneo's unique deterministic input and output systems are setting new standards in automated interactions. With millions of calls automated monthly, Teneo is leading the charge in transforming customer experiences. Per provides insights into the crowded conversational AI market, highlighting Teneo's innovative solutions that overcome common challenges organizations face when adopting these technologies.

Explore the advanced strategies and technologies that are shaping the future of customer interactions. From the powerful ML technology supporting thousands of agents to the shifting preferences towards voice channels, we cover it all. Hear about successful implementations in industries like healthcare, where automation is streamlining communication. We also delve into the implications of the EU AI Act on biometric technologies and how companies like HelloFresh are adapting to changing customer expectations. Data privacy, robust encryption, and compliance with security certifications are more crucial than ever. Get ready to understand the future landscape of customer service and how Teneo is at the forefront of this exciting transformation.

Support the show

More at https://linktr.ee/EvanKirstel

Speaker 1:

Hey everybody, fascinating chat today diving into what's next for advanced conversational AI with Teneo. Per, how are you? Fantastic. How are you this morning, Evan? I'm doing well. Thanks for joining at such an exciting time in conversational AI. Maybe introduce yourself, the mission at Teneo, and a little bit about your journey.

Speaker 2:

Absolutely. So I'm Per Ottosson. I'm Swedish, I've worked most of my life in US companies and in AI since 2009, so quite a long experience within the AI industry, and incidentally the company also has a long history in the AI industry. Our mission is to make people less frustrated when they call you. So when they're calling into your company, we want to end that frustration of, I mean, I'm sure you've seen this, Evan, right? You call a company, you just want to rebook something, or you have an invoice you don't agree with. What are you met with? You're met with press one, press two, press three. That's the most common thing that you see. Or somebody asks you to say something and all they're looking for is a keyword, and you end up in this constant loop of trying to understand what's going on to get to the person you need to talk to. That's what we get rid of. So Teneo gets rid of that pesky IVR that somebody's using today. Most people are using Nuance, and Nuance, incidentally, is also end of life: in most ecosystems as of the end of next year, in some ecosystems as of the end of 2026, depending on where you acquired your Nuance IVR. So that's what we do.

Speaker 2:

We've worked on this for a long, long time. We spent $200 million building the product, we have patents worth about $150 million on the product, and today we're the only ones that can deliver real deterministic input and output. So, knowing exactly what the customer said: not the text, but what the intent was, what the entity was, what the date was, et cetera. And then on the other side, delivering the message that really is the correct message, not a hallucinated message. In between, we can use LLMs and all the other stuff, which may be hallucinating or doing different things, but we always deliver a deterministic output from our system.
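
As an illustration of the deterministic input/output pattern Per describes (structured intent extraction on the way in, vetted response templates on the way out, with any LLM confined to the middle), here is a minimal Python sketch. All names, patterns, and templates are hypothetical, invented for illustration; this is not Teneo's actual API.

```python
import re
from dataclasses import dataclass, field

@dataclass
class ParsedInput:
    intent: str
    entities: dict = field(default_factory=dict)

# Deterministic front end: the same utterance always yields the same intent.
INTENT_PATTERNS = {
    "rebook": re.compile(r"\b(rebook|reschedule|move)\b", re.I),
    "billing_dispute": re.compile(r"\b(invoice|bill|charge)\b", re.I),
}

# Deterministic back end: callers only ever hear pre-approved text,
# so a hallucinated LLM generation can never reach them.
RESPONSE_TEMPLATES = {
    "rebook": "I can help you rebook. Which date works for you?",
    "billing_dispute": "Let's look at that invoice together.",
    "unknown": "Let me route you to the right team.",
}

def deterministic_parse(utterance: str) -> ParsedInput:
    for intent, pattern in INTENT_PATTERNS.items():
        if pattern.search(utterance):
            return ParsedInput(intent=intent)
    return ParsedInput(intent="unknown")

def respond(utterance: str) -> str:
    # An LLM could be consulted between parse and response (for reasoning,
    # drafting, or ranking), but its output only selects among vetted templates.
    return RESPONSE_TEMPLATES[deterministic_parse(utterance).intent]
```

The key design point is that the LLM, if used at all, sits between two deterministic layers and can never author the text the caller hears.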

Speaker 1:

Well, it's an exciting proposition, and change for good is coming, so I'm really excited to chat. Let's take the big picture to start. What are some of the biggest challenges organizations face when adopting conversational AI? It's a new technology for the vast majority of large enterprises. How do you help customers address these challenges and obstacles?

Speaker 2:

Well, the first challenge people have is they don't have any data. A lot of companies have transcribed phone calls, typically with the help of first some kind of AI layer and then humans, but it's really just text. It's not an understanding of the text. It's never been put into an engine that understands what the customer is calling about so you can actually query it and see, you know, what's the volume of my data. Sometimes that's done by humans, and sometimes it's not done at all, and that's probably the most common case, that it's not done at all. So what we do is install something we call Open Question, which is the first solution from us. All that does is listen to the customer: hey, welcome to such-and-such company, how can we help you today? You speak in natural language, in long-form sentences, and we start to break that down into why customers are calling and start routing them to the right group. So you get rid of the press one, two, three, four, five immediately. Then after a few months, let's say 60 days of that data, you start automating those phone calls. Today we deliver 25 automated interactions per second. So this is very big. We have several customers that do more than a million automated phone calls per month, and that is outsized, much bigger than anybody else has done. We also have a vision here. Everybody's been talking about this.

Speaker 2:

Google Dialogflow demoed this in 2017 or '18. They presented it on stage; Sundar Pichai had this great presentation where he called to book a barbershop appointment, a hairdresser appointment. But that technology was never really adopted. It never really worked, because it didn't understand quite well enough why somebody was calling. We've tested this against all the other tools that are out there, and we are the most accurate. We did that together with a testing company called Sierra, and you'll find all that data on our website.

Speaker 2:

But what that means is that once you have that data rolling in, you can now start automating, and that's what people really want to do, right? Instead of waiting in the queue. So our most advanced customer automates roughly 4 million calls a month, and that's 60% of the volume. So 60% of the phone calls, which are support calls, calls about bills, calls about what should I buy, which product do I need, license keys, et cetera, those are all automated now. So for customers, there's no queue. There's no queue to get to the right person, but there's also no queue to even wait for the right person.

Speaker 1:

It's instant service. Oh, fantastic opportunity. Well done. Tell me about the conversational AI landscape from your point of view. It's fascinating, but it's also very complicated, very busy and noisy. You have, of course, the big analyst quadrants, but you also have hundreds, if not thousands, of new entrants in this space: different bots, different approaches, startups. So how do you see your unique role in this space and your positioning?

Speaker 2:

Well, to start off with, the big thing now is everybody thinks they're going to solve this with an LLM, right? You put some lipstick on an LLM and off you go. And just these last two weeks, we first heard Jensen Huang, the CEO of NVIDIA, say we're not getting rid of the hallucinations, and then the CEO of Alphabet said the same thing the day before yesterday. And it's true that the transformer technology behind LLMs is always going to produce hallucinations. You've probably seen the testing: you try the same prompt every day, a few times a day, and you get different hallucinations, at rates anywhere between 6% and 22%, and there's no real way to control it. I think everybody listening to this will have seen that it's very difficult to address that layer. So if you want to use LLMs, you need a framework that can get rid of the hallucinations and deliver deterministic output. But you also need deterministic input, because you need to really understand what's being said. I was at the AWS Connect conference last week, and what was quite interesting was that Lex, the transcription technology behind Connect, was used to transcribe the different seminars as people were speaking. "AWS Connect" was consistently transcribed as "AWS Kinect", and obviously that's because in the LLM model, something that sounds like Kinect is probably Kinect. Now, that's going to make a big difference for you if you have a product called AWS Connect, which, of course, AWS themselves have tried to train into Bedrock: hey, when we say Connect, we mean Connect, not Kinect. But obviously that's something you can't quite control.
So you're going to need that deterministic input and output, because if you input Connect or input Kinect, there's going to be a huge difference in the output as well. So that's the first part, the LLM space.
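
One simple form of deterministic input handling for the Connect/Kinect problem is a closed confusion map over the deployment's product vocabulary, applied to the raw transcript before anything downstream sees it. A hypothetical Python sketch follows; the map entries are illustrative and this is not how AWS or Teneo actually implement it.

```python
import re

# Known ASR confusions for this deployment's product vocabulary.
# Entries are invented for illustration; a real system would learn
# these from transcription logs.
CONFUSION_MAP = {
    "aws kinect": "AWS Connect",
    "aws connect": "AWS Connect",   # also normalizes casing
    "amazon lex": "Amazon Lex",
}

def normalize_transcript(text: str) -> str:
    """Rewrite confusable product names to their canonical form."""
    for heard, canonical in CONFUSION_MAP.items():
        text = re.sub(re.escape(heard), canonical, text, flags=re.IGNORECASE)
    return text
```

Because the map is closed and the rewrite is rule-based, the same transcript always normalizes the same way, which is the deterministic-input property the conversation is describing.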

Speaker 2:

LLMs are a great background technology. We use them, for example, as part of our copilot to translate. If you build a solution, let's say in English: we had a customer go from English to 48 languages over a weekend, just using the copilot button. It went to Azure OpenAI, did the translation, came back, and then we had human reinforcement for the first two weeks to make sure those translations were accurate and good enough. Then they were off to the races, delivering in all these different countries, 100-and-something-odd countries off the bat. The second thing going on is voice. When you deliver a chatbot and you connect the chatbot to your backend systems and to an LLM, you now have a big security risk, because it's quite easy in a chatbot to do prompt engineering. However, if you deliver a voice solution, it's very difficult. You're not going to place 1,000 phone calls with different voice recordings to try to game the LLM system and get at the backend system.

Speaker 2:

So the big change here is that if you have voice at volume, which we do, then it's really not that interesting to build a chatbot anymore, because the chatbot is never going to be able to go into all the backend systems and do all the work in the background that you need. Our voicebot goes into your SAP or Salesforce, whatever it might be, and of course integrates right away into Avaya, Cisco, Genesys, Amazon Connect, and we then drive that conversation. So there's no way to game that, no way to gain access through Teneo back into the backend system. So there's the lipstick-on-an-LLM space; that's going to be your 1,200 "come to our webinar" emails that you get every day. There's the traditional CAI, which basically deploys chatbots that now use LLMs in the background as well.

Speaker 2:

And then there's the voice side. On the voice side there are quite a few players that handle, let's say, the 80% of potential queries that provide 20% of the value: is your store open, where's your store, where can I find my bill, can you send me a new password. The simple stuff. When it gets to more complex stuff like support, which we do quite a lot, for AT&T, for example, support for your home router or your home alarm system, the chatbots typically never get developed to go into your router and so forth. But through voice, absolutely. So yeah, that's our view of the landscape. If you really want to provide service to your customer, go to voice, don't think chat. And if you do, connect your LLM in the background, but not in the foreground.

Speaker 1:

Wow, really well said. Great insights. So let's talk product differentiation. You touched on this a few times, but I've had, gosh, 150 different conversational AI vendors on my show. Everyone has a unique selling point, let's say, but what sets your platform apart from this vast number of companies in the space?

Speaker 2:

It's voice at scale that sets us apart. As soon as you have a couple of hundred thousand calls a month, this really, really works, and it's the only solution that works at scale. I mean, you can call any company in the US today: if they're a large company and they have a voice-based solution, it's going to be looking for keywords. It's going to be based on Nuance or Google Dialogflow, potentially Lex, but it's not going to be able to carry on a conversation. If you really want to have a conversation, we're the only ones to deliver it, and the reason for that is the deterministic pre- and post-processing. This is called TLML, one of our patents. The other differentiator is the performance we have in the platform. Our platform delivers huge performance; I mean, we're doing 25 automated interactions per second in the SaaS platform, and there's no real ceiling that we've seen so far. We have peaks, of course, that are much bigger, and it just delivers.

Speaker 2:

Then there's the question of agents. Everybody now talks agents and agentic. Microsoft launched 12 agents last week, or was it the week before last, when they had their big conference? We have 17,000 agents already deployed at our customer sites. You give an agent an instruction in natural language and then it goes out and autonomously does something, right? Manipulates a record in your database, changes a setting in your router, sends you a QR code to go replace the power supply to your computer, whatever that might be. We have 17,000 such agents, actively in use, that have been built by customers already. So really, it's scale, and it works. Those are the big differentiators, and it's all about those patents. We started working on this 20 years ago, and the same team has been developing it since then. That's what we spent those $200 million on.

Speaker 1:

Well, money well spent. Let's talk about big tech and generative AI integration. All of the big tech players, the Magnificent Seven, have trained their interest on customer experience, which is kind of the low-hanging fruit, if you will. Do you see these companies, whether it's a Salesforce or Google or AWS, as partners, competitors, coopetition? I mean, how do you see the big giants in this space?

Speaker 2:

So I see those companies getting the initial business to build that chatbot, because the CEO says go do something with LLMs. That's going to be Google, Microsoft, and Amazon. Salesforce we don't see at all in our customers, because our customers typically integrate with SAP, Genesys, Salesforce, HubSpot, with tens of systems, not one. So being just in the Salesforce ecosystem is not very helpful, or just in the Zendesk ecosystem; not very helpful to our customer base. They want to be able to say: Evan is calling in. Well, Evan is a customer, he's a prospect, he has four devices in his home connected to us. We want to know everything about Evan when Evan calls in, and for that it's not enough to be integrated into one ecosystem. So that's why we see Amazon, Microsoft, and Google. On the flip side, they currently only power chatbots, and then the customer turns to us when they want to turn this into voice, and we partner up with them.

Speaker 2:

So in the case of Microsoft, we are a transact partner, so you can get us on their MACC agreement; customers buy us from the bucket they've already committed to Microsoft. Amazon, we're on their marketplace, and customers can buy us straight off of there. Genesys, we're on their marketplace too. Genesys is also really coming up here, especially with the Nuance end of life; Genesys has a lot of customers with Nuance technology deployed, and Genesys is obviously going to IPO next year, growing and pushing quite hard in the space with their new cloud offerings. So we also see them coming into this space now. They also have the same philosophy: we need to integrate with everything. So we're similar to the big cloud providers there. But we haven't seen any of the other ecosystems, you know, the ones being built around a single application. We just don't see those in the support or call-center space.

Speaker 1:

Well, good to know. You touched on some client success stories around scale. Any other stories or anecdotes on how your clients are using you to implement new ways of interacting with customers that you might be able to share?

Speaker 2:

The first is, of course, to take the inbound phone call, route it to the right person, and then start automating those phone calls. That's the prime use case. But what we're seeing now is also outbound use cases, where you might want to call, for example, if an appointment has been set. This is quite big in the healthcare industry right now, primarily in the UK with the NHS, but it's coming over to US companies as well; we have several in the pipeline. So let's say, Evan, you have an appointment on Tuesday for foot surgery, and there's lots of stuff happening in Boston that week. People are slipping and breaking bones and arms.

Speaker 2:

That's true, yes. So they have to postpone your surgery. Then that outbound call is something where it's quite important to get a confirmation that you're talking to Evan, and that Evan said: yes, I can move it to Tuesday the 24th of December; the day before Christmas is much better for me anyway, so let's do that. Those types of conversations are coming now. We also see people moving things away from chat and into the voice channel, to provide a faster experience and meet the customer where they actually are.

Speaker 2:

People have given up a bit on chat. This is the travel industry now, and also HelloFresh, for example, one of our customers; they see the same thing. If you have a chatbot experience, and everybody has now seen chatbots that aren't working very well, you sort of just click the chatbot away and want to get immediately to the phone. So if you can provide a better phone experience, that saves the customer experience; customers want to move to the phone. So we see a lot of customers deploying flows away from chatbot and into phone.

Speaker 2:

And then the third one right now, which everybody's trialing, and which we saw with a large tech provider in the US, is to go from one language to more languages, because everybody has that problem. If you're a large international company, you're going to have three people who speak Malay who need to be online at odd hours in the US, and of those three people, two are sick one day and you have a big problem. So if you can cover all the languages with just one implementation: that's something I would say all our customers are trying right now.

Speaker 1:

Yeah, huge opportunity. Let's talk about some of the challenges around ethics and security and related topics. There are big question marks around data protection and data privacy, something in Europe you know a thing or two about, and regulations in financial markets and healthcare in particular. How do you ensure all these requirements are met? And what are some of the security concerns, or opportunities, let's say? I know Amex just turned on voice biometric verification, so when I call in, they're trained on my name and voice, and it eliminates all of the know-your-customer checks and logins, which is wonderful as a customer. But where do you see all of this headed next year? Well, maybe just pick up on that first.

Speaker 2:

So we just turned that off for our European customers. Sentiment and biometrics are not allowed under the EU AI Act. Or they are allowed, but classified as high risk, and if it's high risk, you have to go and convince an entity, which still doesn't exist, in each country that this is a safe use of that high-risk technology. So we just turned it off.

Speaker 2:

But we do use it with customers in the US. The big thing there is what we call confidential compute, which means that data is encrypted not just at rest but also in memory. So as you're processing the data, it's actually encrypted. Evan calls in, but Evan is now given a hexadecimal code, and that's what travels with the information back and forth. So there's never a way, even if you hack the system, to say Evan has a problem with his home router; you can't connect the issue and the person. Which also means the data is much more mineable internally, because if it's personal data, you probably can't query it in the same way when you want to see what the customer trends are. That's happening right now. So that's definitely a trend.
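
The hexadecimal-code idea Per mentions is essentially pseudonymization: swap the caller's identity for an opaque token before the record is processed or logged. Here is a minimal Python sketch with hypothetical names; note that real confidential compute additionally keeps data encrypted in memory at the hardware level (for example, secure enclaves), which a sketch like this cannot show.

```python
import hmac
import hashlib

SECRET_KEY = b"held-in-a-key-vault-in-practice"  # illustrative placeholder

def pseudonymize(caller_id: str) -> str:
    # Keyed hash: the same caller always maps to the same hex token,
    # enabling per-caller analytics, but without the key there is no
    # way back from token to person.
    return hmac.new(SECRET_KEY, caller_id.encode(), hashlib.sha256).hexdigest()[:16]

def log_event(caller_id: str, event: str) -> dict:
    # Only the token travels with the record; the identity never does.
    return {"caller": pseudonymize(caller_id), "event": event}
```

This is what makes the data "mineable" in the sense described: you can count, trend, and join records per token without any record ever naming the person.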

Speaker 2:

So we provide confidential compute. We have three customers that demand that and are using it. It takes a bit more processing power, so it's a bit more expensive, but it's not big; we're talking 10% to 15% additional expense, and customers make $5.40 on average per automated call. So if you have 1 million such calls, that's obviously $5.4 million of ROI per month. It doesn't really matter if it costs $500,000 more because of confidential compute. So that's one. The second one we see: well, obviously ISO and SOC and HIPAA, et cetera. All those certifications are very important. You can't really do business without them today. So you need to tick all those boxes, and of course we do; we have all those certifications.
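
For the record, the arithmetic works out as follows. The per-call saving and call volume are the figures given in the conversation; the extra-cost number is the illustrative worst case mentioned.

```python
savings_per_call = 5.40          # dollars per automated call (quoted figure)
calls_per_month = 1_000_000      # quoted volume

monthly_roi = savings_per_call * calls_per_month   # about $5.4 million per month

# Even a generous confidential-compute uplift barely dents the saving.
extra_compute_cost = 500_000     # illustrative worst case from the conversation
net_monthly = monthly_roi - extra_compute_cost     # about $4.9 million per month
```
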

Speaker 2:

But the thing we really see now is this security threat on chatbots. That's something people are really attuned to right now. I don't know if you saw that people managed to hack Apple Intelligence and found that the intelligence, to a large extent, was a five-sentence prompt to OpenAI, and that was, of course, hacked through the chatbot, in that case through Siri. As long as you have a chatbot, you have every opportunity for prompt engineering, especially if there's an LLM behind it, and that's something we see customers really looking at: do I really want to connect that chatbot to my backend system? I'm not so sure that's going to happen.

Speaker 2:

I would say a fourth thing we see is hotel chains, and companies that sit behind something like a Booking.com; they're now preparing to get agentic phone calls. So: Evan has an agent, Evan talks to the agent, the agent calls Hilton to get the best rate at a hotel in London because he needs to visit there next week. People are preparing for inbound phone calls from agents, which also poses a bit of a security risk, as you will not be able to do voice biometrics or the like. How are you going to authenticate that agent? That's a new thing that's really happening. In Europe, most countries have a digital ID and most customer-care units use it, but the agent is not able to use a physical person's digital ID. So that's another security topic coming up more and more. But really: encryption, confidential compute, encryption all the way. That's the way to go. That's what customers are really going to demand, and it's the safest way to do it going forward.

Speaker 1:

Fantastic. Well, I'll be heading to RSA in the spring, the RSAC conference, and that'll be an interesting topic at the cutting edge, so thanks for that. Any advice for customers who are interested in your technology but really don't know where to get started? Maybe they're a legacy Nuance customer; they've had something installed in the back office since the '80s that hasn't been touched. How do you get started on this journey with minimal disruption?

Speaker 2:

So we have several partners, like Valcom BSL in Europe, and InterVision, Kenway, and AppDirect, who themselves have several partners, of course, and all these partners can replace that Nuance system within 60 days. That's pretty much a risk-free proposition. We can, of course, discuss with the partner how you want to structure it, but we've done it so many times, and we have so many reference cases, that it really is a risk-free proposition. What you get is, instead of press one, two, three, a voice-based IVR, and after that you can start automating; that's when the real ROI kicks in. But just the replacement is very much a cookie-cutter approach, quite simple, proven already in the Avaya, Genesys, Cisco, and Amazon Connect space, and RingCentral as well. We've replaced Nuance in all of those today.

Speaker 1:

Wow, well done.

Speaker 1:

Sounds like a great proposition for customers. I guess, final question: what's next for 2025? I just looked at my event schedule, and it has CES, Mobile World Congress, Enterprise Connect, IT Expo. I'm not going to be home much in the spring. But what about you? What are you looking forward to in the new year?

Speaker 2:

So we have a lot of events happening with partners in the US, very much related to AWS Connect and Genesys; those are the two biggest ecosystems. And yeah, like you said, it's all over the place now. We don't do Mobile World Congress; those events are a bit too broad for us, right? We're a bit more specific, very contact-center focused, let's say phone focused, even. So yeah, a lot of activities together with AWS Connect and Genesys primarily; that's where we see the big traction. A lot of people are looking at what to do in the Avaya space. We don't really work together with Avaya directly there, but those customers do come to us through these partners. And a lot of activities with AppDirect; AppDirect is distributing our software as we go forward now, and that's a big marketplace for us in the US.

Speaker 1:

Oh, interesting. Well, congratulations on all the success, onwards and upwards. Hope you get some time off and relaxation over the holidays before a busy new year. Thanks, Per.

Speaker 2:

I hope so too, Evan. I hope the same for you. So yeah, let's see if we catch up at one of all those events during the spring.

Speaker 1:

I'm sure we will. And thanks, everyone, for watching, listening, and sharing as always. Take care. Bye-bye. Thank you.