Hearing Matters Podcast: Hearing Aids, Hearing Loss and Tinnitus

How Omega AI Turns Hearing Aids Into Everyday Assistants

Hearing Matters

What if your hearing aid could listen to you, read the room, and fix the problem before your next appointment? We explore Omega AI’s newest leap—Telehear AI—and how on‑device intelligence lets people request help in real time when noise, wind, or tricky spaces get in the way. Instead of waiting days for a follow‑up, users can compare new settings with their originals, choose what feels best, and keep moving, while clinicians still see the changes and refine care. This feature, of course, does not replace the role of the hearing care professional.

We go deep on the sound engine first—directionality, spatial awareness, and DNN‑driven scene detection—because clarity is still the heart of hearing care. Then we widen the lens: fall detection keeps evolving, Balance Builder supports everyday stability, and translation is taking shape as a practical tool to bridge conversations. The dream of “Jarvis in your ear” starts to feel real when the hearing aid becomes a daily assistant that protects, informs, and adapts without adding friction. Comfort matters too; lighter, more discreet devices make it easy to forget you’re wearing them, right up until they save the moment.

Data is the quiet force behind these wins. Smarter data logging breaks listening lives into patterns that clinicians can act on—who’s in wind all day, who’s stuck in high noise, who’s wearing less than they say. Those insights reduce returns, raise satisfaction, and turn fittings into living plans. We also share a candid look at where this is heading: a convergence of digital and traditional care where professionals remain central, devices handle minor tweaks on the fly, and the ear becomes a true superpower for communication, safety, and independence.

If this vision resonates, follow the show, share it with someone who loves great audio tech, and leave a quick review telling us which feature you want next. Your feedback helps shape where we go from here.

Connect with the Hearing Matters Podcast Team

Email: hearingmatterspodcast@gmail.com

Instagram: @hearing_matters_podcast

Facebook: Hearing Matters Podcast

Blaise M. Delfino, M.S. - HIS :

This is the Friday Audiogram. Let's go. What recent advancements with Omega AI do you believe have truly changed the patient experience at the point of care?

Dr. Dave Fabry:

It's a great question. I mean, I think, first and foremost: thank you. I wrote it. Oh, okay. Well, a blind squirrel finds a nut every once in a while. The issue, I guess, as we started, is really all about personalization and customization to the patient, that relationship and that connection between the professional and the patient. We're all busy. One thing we learned, although we've had telehealth features for a very long time, is that people didn't realize until the COVID pandemic that they needed both synchronous and asynchronous telehealth to remain in touch with their patients, to be able to customize and optimize when they couldn't do face-to-face. Now, it doesn't replace face-to-face care. Where we've gone in a new direction with Omega AI is called Telehear AI. So if Telehear, synchronous or asynchronous telehealth, allows you to fine-tune the device to meet the individual's lifestyle or audiometric needs, Telehear AI provides an additional bridge if the professional is not available, or if the patient is in a challenging listening environment where even Edge Mode doesn't work, or where, with all of the DNN 360, they're still struggling in a particularly difficult environment. They can simply use this Telehear AI feature to state what type of trouble they're having, and the device, without intervention from the professional, will make adjustments based on the environment they're in and on the question the patient asked, provide them with new settings, and then say: try this, here's your original. They can always keep the original, or they can actually update the devices with the new settings. And to me, that's another way of meeting the patient's need. We'll still engage with the professional to say, here are the adjustments that were made. But the ultimate judge of our success is the patient. If he or she is saying, yes, that's better, I'm doing better in this situation, great. I always want them to go to the hearing care professional for the challenging situations, or if there's a physical adjustment that needs to be made, or some other issue that requires a face-to-face or even a virtual visit. But we find that this Telehear AI feature has been met with great enthusiasm on the part of patients, and even professionals, to help handle minor tweaks that can make a big difference in the patient's outcome.
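To make the adjust-and-compare flow Dave describes more concrete, here is a minimal Python sketch of how an on-device assistant might map a patient's stated complaint and the detected acoustic scene to a candidate settings profile, keep the original fit for comparison, and log the change for the clinician. Every name in it (Settings, propose_settings, the scene labels, the numeric scales) is a hypothetical illustration, not Starkey's actual API or algorithm.

# Minimal sketch, assuming a hypothetical API: the patient states a complaint, the device
# proposes new settings for the detected scene, the original fit is kept for comparison,
# and every change is logged for the hearing care professional. Not Starkey's actual code.
from dataclasses import dataclass, field, replace

@dataclass(frozen=True)
class Settings:
    noise_reduction: int = 4          # 0 (off) .. 10 (max), illustrative scale
    directionality: str = "adaptive"  # "omni" | "adaptive" | "fixed_front"
    wind_suppression: int = 2

@dataclass
class AdjustmentLog:
    entries: list = field(default_factory=list)  # later reviewed by the clinician

def propose_settings(current: Settings, complaint: str, scene: str) -> Settings:
    """Map a plain-language complaint plus the detected acoustic scene to a candidate profile."""
    if "wind" in complaint.lower() or scene == "wind":
        return replace(current, wind_suppression=5)
    if "noise" in complaint.lower() or scene == "noise":
        return replace(current, noise_reduction=8, directionality="fixed_front")
    return current  # nothing obviously actionable; keep the professional's fit

def telehear_style_session(current: Settings, complaint: str, scene: str,
                           patient_prefers_new: bool, log: AdjustmentLog) -> Settings:
    candidate = propose_settings(current, complaint, scene)
    chosen = candidate if patient_prefers_new else current  # "try this, here's your original"
    log.entries.append({"complaint": complaint, "scene": scene,
                        "original": current, "chosen": chosen})
    return chosen

# Example: restaurant noise, and the patient keeps the new settings.
log = AdjustmentLog()
fitted = Settings()
active = telehear_style_session(fitted, "I can't follow speech in this noise", "noise", True, log)

The point the sketch tries to preserve from the conversation is that the original fit is never discarded and every on-the-fly change remains visible to the professional.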

Blaise M. Delfino, M.S. - HIS :

Brandon, as it relates to the latest AI features that we have with Omega AI, what are you most excited about?

Brandon Sawalich:

You know, first, as we mentioned, it's all about the sound, because what AI and DNN and our 360 and spatial and directionality can do is incredible. Putting that all together in that ecosystem, and putting it on the chip, which our engineers have done a great job doing: there are so many features I could go down, but it's the ones we keep improving on. We keep improving our fall detection. We now have the Balance Builder, and also the language translator. That's one where there's a long-term goal; we keep improving, but our intent was simultaneous translation, where the hearing aid becomes the personal assistant for your hearing, your health, and your day. And we're seeing those dots start to connect. Many have heard me mention I'm an Iron Man fan. I watched all the Marvel movies with my son, and it's Jarvis in the ear. It's going to be talking to you as you need it to and helping you throughout the day. So each generation, each release, they're going to be seeing a lot from Starkey. I mean, we're moving fast, and that's the way the world and the way technology is going. So if it's better, we're going to get it out. And it's getting there quicker and quicker.

Blaise M. Delfino, M.S. - HIS :

Brandon, you always say Starkey will be five years, ten years ahead. What does that look like? You are absolutely a visionary leader. You're, again, the most tenured CEO in the industry, and you're making these bold decisions. What does that look like ten years from now? Well, what you can share.

Brandon Sawalich:

I think there will be a convergence, because you have the digital patient and the traditional patient now. I don't believe, as Dave was just talking about with our Telehear AI, that you're going to be able to mail these to the patient or have somebody fit themselves. I don't see that anytime soon. Those are revenue plays by other players that are just trying to make money. When you focus on the patient, again, it's that ecosystem where we're helping them throughout the day, not just with hearing. So it really is a hearing enhancer. Well, it's a superpower, right? I mean, we're making the ear a superpower. The devices are incredibly intelligent; they can adapt to the individual's personalized needs throughout the day. And to be ten years ahead, it's the sound too, because people have to forget they're wearing it. Right now, there are days, well, most of the day, unless I think about it, I forget I'm wearing mine. It's really when that fitting and the product are so light and invisible, even though there's a lot of technology packed in. You know, if they forget it, they're going to turn around and go home and get it. That's kind of the goal: something you just can't live without throughout your day.

Blaise M. Delfino, M.S. - HIS :

What I'm personally most excited about, not only with Starkey's continuous innovation, but growing up in a hearing home: I remember the Hi-Pro box and all the cables, and just connecting these devices to all these different intermediary devices to really power them and program them.

Brandon Sawalich:

I was smiling because, with the Hi-Pro, I remember a product people would sell to make sure all the cords didn't get tangled up. Sorry, when you were saying the Hi-Pro box, that popped into my head.

Dr. Dave Fabry:

Boy, we really had them hanging on the wall so they wouldn't get tangled up like that.

Blaise M. Delfino, M.S. - HIS :

And then all the manufacturers ask, do you need more cables? No, no, I need fewer. I need fewer and I'm okay. So for me, I love to see this innovation, having fit patients with Livio and then Evolv, and I'm excited to help some friends and neighbors maybe get into Omega, because having that passion for connecting people to people through better hearing is what you've said for many years, Brandon. The technology is a third of the equation. And Dave, hearing care professionals, again, you hear AI and it's like, I don't want to be replaced by this technology. We've had data logging for many years, but now Starkey's really taking data logging to a whole other dimension, if you will. How is Starkey leveraging the data, not only for better fittings, but really for broader health outcomes, like the Balance Builder Brandon mentioned?

Dr. Dave Fabry:

Sure. Well, first of all, yeah, indeed.

Blaise M. Delfino, M.S. - HIS :

Even for cognitive and emotional wellness. Right.

Dr. Dave Fabry:

Data logging has been included in our software for a long time, and I really wonder how many professionals look at all the information that's in there. First and foremost, when people use data logging, they're double-checking when the patient says, yeah, I'm wearing them all the time, and then they see they're wearing them a few hours a day. Fortunately, we don't see that anymore; now it's routine that the patients I'm working with are at 14, 16 hours a day or more on average. But beyond that, we can break down whether they're in quiet or noisy environments, and how often they're in windy environments. Those are the things I look at on the data logging screen; they help me determine what sorts of adjustments and adaptations I'll make based on the feedback the patient is providing me. There's a wealth of information in there about the average level of the environments they're in, what percentage of time they're in each of the different acoustic classes, quiet, noisy, windy, et cetera, and what other types of environments we can look at to customize and personalize based on how often they're in noisy or a variety of different listening environments. That's the first thing I use data logging for. And then we're also using it to help predict a patient who might be at risk of returning devices. We've helped professionals use data logging to lower their returns for credit and increase their patient satisfaction.
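As a rough illustration of the data-logging summary Dave describes, here is a small Python sketch that computes average daily wear time, the share of time in each acoustic class, and a simple flag for patients who may be at risk of returning their devices. The field names and thresholds are invented for the example; they are not Starkey's actual logging schema.

# Hypothetical sketch of summarizing hearing-aid data logs: average wear time,
# percentage of time per acoustic class, and a simple follow-up flag.
from collections import Counter

def summarize_log(days):
    """days: list of dicts like {"hours_worn": 13.5, "class_minutes": {"quiet": 400, "noise": 250, "wind": 60}}"""
    avg_hours = sum(d["hours_worn"] for d in days) / len(days)

    class_minutes = Counter()
    for d in days:
        class_minutes.update(d["class_minutes"])
    total = sum(class_minutes.values())
    class_pct = {c: round(100 * m / total, 1) for c, m in class_minutes.items()}

    # Illustrative heuristic only: low wear time or a very noisy listening life
    # prompts the clinician to follow up sooner.
    flag_for_follow_up = avg_hours < 6 or class_pct.get("noise", 0) > 50

    return {"avg_hours_per_day": round(avg_hours, 1),
            "environment_pct": class_pct,
            "flag_for_follow_up": flag_for_follow_up}

# Example: a patient who reports wearing the devices "all the time".
print(summarize_log([
    {"hours_worn": 3.0, "class_minutes": {"quiet": 150, "noise": 20, "wind": 10}},
    {"hours_worn": 4.5, "class_minutes": {"quiet": 200, "noise": 50, "wind": 20}},
]))

A summary like this is the kind of pattern the show notes mention: who is in wind all day, who is stuck in high noise, and who is wearing the devices less than they say.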

Blaise M. Delfino, M.S. - HIS :

I always said in practice, we don't want to see the patient walking in with their bag at the first follow-up, because that was always like, oh no.