Unspoken Security

The Future is Human

AJ Nash & Galya Westler Season 1 Episode 54


In this episode of Unspoken Security, host AJ Nash sits down with Galya Westler, Co-Founder and CEO at HumanBeam. They explore how advances in AI, digital identity, and holographic technology are reshaping the way organizations interact with people—while raising tough questions about privacy, ownership, and trust.

Galya shares how her work began in health technology, connecting patients to care during pandemics, and evolved into building secure, lifelike AI avatars for real-world use. She explains why protecting personal likeness and voice matters more than ever, especially as AI tools become more convincing and accessible. Galya stresses the need for consent, encryption, and clear boundaries to keep digital identities safe and organizations accountable.

Together, AJ and Galya dig into the risks and rewards of merging human presence with AI. They discuss how thoughtful design and strong security practices can support experts instead of replacing them, and why education and authenticity are key as we build a future where technology and humanity work side by side.


Unspoken Security Ep 54: The Future is Human

[00:00:00] Galya Westler: I never gave my real voice. And even if you give a real voice, we have to distort it or make it slightly different, because people use their voice for banking, people use their voice for... and we can talk about the whole

[00:00:11] Galya Westler: abduction of voice and abduction of likeness, which is also very, very dangerous. Right? And that's why we always do security first.

[00:01:02] AJ Nash: Hello, and welcome to another episode of Unspoken Security. I'm your host, AJ Nash. I spent 19 years in the intelligence community, mostly at NSA. I've been building and maturing intelligence programs in the private sector for, uh, about 10 years now. I'm passionate about intelligence, security, public speaking, mentoring, and teaching.

[00:01:19] AJ Nash: I also have a master's degree in organizational leadership from Gonzaga University (go Zags), so I continue to be deeply committed to servant leadership. Now, this podcast brings all these elements together with some incredible guests to have authentic, unfiltered conversations on a wide range of challenging topics.

[00:01:33] AJ Nash: It's not gonna be your typical, polished podcast. My dogs make occasional appearances. I don't think we'll see them today, but you never know. People argue and debate here. We can even swear, I sure as hell do, and that's all okay. So I want you to think of this podcast as a conversation you'd overhear at a bar after a long day at one of the larger cybersecurity conferences that we all go to.

[00:01:51] AJ Nash: These are the conversations we usually have when nobody's listening.

[00:01:54] AJ Nash: Now today I'm joined by Galya Westler. She's co-founder and CEO of HumanBeam Technologies. It's a company with unique AI tech that I'm really excited to explore, but that's just part of her story. Gallia is also a serial entrepreneur.

[00:02:06] AJ Nash: I mean, she wears many hats. She's a co-founder and CEO of GO Health Technologies, co-founder and CEO of Plazus Technologies. Her entire career has been about smarter, safer, faster, more efficient tools, and she's done a lot of podcasts and webinars and public speaking over the years, including TED. So I'm honored to have her on Unspoken Security today.

[00:02:24] AJ Nash: Is there anything you wanna add to that bio, Galya?

[00:02:27] Galya Westler: Oh my God, this is amazing. This is well, well above everything. It is the best bio; I've never been introduced so well. It's amazing, and the fact that you came up with this by yourself, without me sending it to you, is even more amazing, so...

[00:02:40] AJ Nash: I did. And, uh, full transparency. I did it without AI.

[00:02:43] Galya Westler: amazing. Without AI.

[00:02:44] AJ Nash: yeah, that's a hundred percent me. And, and, and I now realize, as I say that I, I'm proud but also kind of embarrassed. I was in a hurry. I probably should have had AI just write it. instead I went through your LinkedIn and like stitched it together.

[00:02:55] AJ Nash: 'cause you know, it's old habits still. but AI is, you know, becoming a more useful tool. I use it. A lot of people do, obviously. And there's so many different, you know, AI tools out there right now. So with that, you know, the topic for today is actually that the future is human, which I find a really interesting topic, which you came up with.

[00:03:11] AJ Nash: Uh, but it's a really interesting topic coming from somebody who's got this technology that's very much not human, it's very AI, right? Although I shouldn't say that, 'cause we're gonna get into it and it's a little bit of both, so I don't wanna take away from it. Let's just jump in. So, HumanBeam, right?

[00:03:24] AJ Nash: HumanBeam technologies. So why HumanBeam? Like what are you, what are you doing? What's different about this? Let's, I don't normally open up with like products, but in this case I think you have to, 'cause it's just really interesting stuff.

[00:03:36] Galya Westler: Sounds good. Well, thank you very much, AJ, for inviting me to this podcast. It's a pleasure to be here today. Like you said, I'm a co-founder in a few companies. It's been a journey of 16 years now. This particular company, HumanBeam, has been around for one year, but it comes as a permutation from another company that we have, where we develop different health-technology solutions for different governments around the world.

[00:03:57] Galya Westler: And it actually started by us beaming doctors into rural areas. So it started before this by developing a telehealth solution for different health organizations and hospitals and clinics around the world. And essentially, it actually started well before that, when we were able to collaborate with different governments to try and eradicate HIV, which is, you know, a pandemic.

[00:04:19] Galya Westler: Uh, quite similar to the COVID pandemic, where the only way to eradicate the pandemic is to connect people to testing and then to care. So we did this a few years ago. That's how we started, because we come from a world where Plazus Technologies is actually implementing enterprise-level software solutions.

[00:04:36] Galya Westler: We also do cybersecurity. We do penetration testing. We are gonna talk about responsible AI. So we also implement the backend of Azure AI and any kinds of AI tools that are available for the organization. So we come from the world of enterprise and custom software development. And when we did these different, uh, software development projects, we actually started with connecting people to testing in order to be able to identify whether or not they have a certain disease.

[00:05:01] Galya Westler: And together with different partners, they were actually distributing self-tests all around. They started in Canada as a nationwide project initiated by the Health Authority of Canada. And while they're distributing the test, you might ask, okay, how is this related to AI? And I'll tell you the journey.

[00:05:18] Galya Westler: It's got nothing to do with AI, a few years ago. In fact, it has to do with democratization. It has to do with blockchain, which is our background. And it has to do with a thing called self-sovereign identity, where you're able to verify the credentials of a person, particularly a patient. And if you remember the world during COVID, the world was very scary because

[00:05:38] Galya Westler: we had to give away our personal information just for being able to pass a certain test, to pass a certain border. And if you remember, there's been a lot of security leaks back then. They've known, like, everything: where the person lives, their status, their name, their information. Everything was leaking and everybody was okay with it.

[00:05:56] Galya Westler: So. We were not okay with it, because we come from the world of blockchain and we believe in democratization. We believe in protecting, and also monetizing by ourselves, our own data and our own information. And so we actually started a few years ago with GO Health, where we called the project Go.

[00:06:13] Galya Westler: We wanted to get the economy going at the early start of 2020, when the pandemic just started, and we actually developed this for COVID. You were able to do verification of your ID, and then we connected you to testing, very early testing of COVID. And essentially you go from zone A to zone B totally anonymously, because the only thing that they need to know about you is: am I infected or am I not?

[00:06:34] Galya Westler: And if you're not infected with the disease, you can go to zone B, you can hang out with people. So our vision was to get the economy going and to be able to open up the economy. Of course, that didn't work. Nevertheless, there comes the permutation. We ended up winning a project with, uh, with different parties from the Canadian government, and they wanted to distribute tests for, uh, for HIV, which is very similar.

[00:06:56] Galya Westler: So they distributed a test, and then using our technology, we're able to anonymize bringing in this test. So it's almost like picking it up at a pharmacy, or even like an Uber, kind of like an anonymous Uber, where you can grab the test and test yourself totally independently. And then they're able to access the test results to be able to run the analysis.

[00:07:16] Galya Westler: How is the pandemic spreading, and how do you stop the pandemic? You connect them to care. And then comes the telehealth solution, where you're able to get on a call and essentially have the consultation that you need to treat the disease or treat the condition that you are at, and then connect them with different things such as PrEP, if people are aware of this disease.
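(For readers who want to see the shape of the self-sovereign identity idea Galya describes, here is a minimal sketch, not HumanBeam's implementation: an issuer signs a minimal claim about a pseudonymous holder, and a verifier checks only the signature and the status, with no central database lookup. The field names and the use of Ed25519 via the Python cryptography package are illustrative assumptions.)

```python
# Minimal sketch of a self-sovereign-identity style check: a clinic issues a
# signed credential ("status: negative"), and a venue verifies the signature
# without needing the holder's name or any central lookup.
# Hypothetical example only -- not HumanBeam's implementation.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The issuing clinic holds a private key; its public key is published.
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()

def issue_credential(holder_pseudonym: str, status: str) -> dict:
    """Sign a minimal claim: no name, no address, just a pseudonym and a status."""
    claim = {"holder": holder_pseudonym, "status": status}
    payload = json.dumps(claim, sort_keys=True).encode()
    return {"claim": claim, "signature": issuer_key.sign(payload)}

def verify_credential(credential: dict) -> bool:
    """The venue checks the issuer's signature; it learns only the claim itself."""
    payload = json.dumps(credential["claim"], sort_keys=True).encode()
    try:
        issuer_public_key.verify(credential["signature"], payload)
        return credential["claim"]["status"] == "negative"
    except InvalidSignature:
        return False

credential = issue_credential(holder_pseudonym="anon-7f3a", status="negative")
print(verify_credential(credential))  # True -> the holder may move to "zone B"
```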

[00:07:32] Galya Westler: Then what we did is we started to give this solution to different doctors in rural areas, and we decided to integrate this with a 3D holographic display. So when you go to HumanBeam.io, and that's where HumanBeam was built, we wanted to beam in humans, and that was the first use case, right?

[00:07:50] Galya Westler: When you're able to give these to more remote areas, doctors and different healthcare providers really loved it, because we had our scheduling system. They were able to create this immersive experience and bring in the doctor with any 4K phone that everybody has. You don't need any sophisticated, expensive 4K camera, you just need to have a white-background studio.

[00:08:10] Galya Westler: And then just like that, you're streaming in 4K, totally live, in very, very high quality. Then came the request, and they said, what about the mundane tasks that are happening with the front desk? And that's where we're getting into the world of agentic AI, but it's not just agentic AI, right? You can see this a lot also in hospitality, right?

[00:08:28] Galya Westler: A lot of them are implementing different conversational AI that has, like, an audio. But before we go into conversational AI, the question is: why not just implement OpenAI? Well, we started using OpenAI at the beginning, the first permutation of the project. And OpenAI, it's a wonderful LLM solution, but unfortunately it's not enough.

[00:08:46] Galya Westler: And you can even see it as an end user that it's not enough. Which means that if you are using a ChatGPT, for example, to write my biography, and you tell it, take it from LinkedIn and, and then it's gonna write the thing for you, right? But is it going to be a companion or is it going to be a chat bot?

[00:09:01] Galya Westler: What do you think?

[00:09:03] AJ Nash: Uh, it's definitely not a companion at that point. No. It's very much input, output, right?

[00:09:07] Galya Westler: Exactly. Which means that it's the first form, and a lot of people are using it like this, and that's not even conversation. The way to get into conversation is if you're able to enable the audio on your phone, and then you're able to hear back what ChatGPT or the different AI solution is saying back to you.

[00:09:25] Galya Westler: But still, it's a chatbot, which means that it tells you really, really amazing information that it gathers from all around the internet, but then you have to go and do the job. And then comes the point of an agent. So we don't just do agentic, we do it in an AI companion approach, and we give it a body and we put it in 3D, and then you're like, oh my gosh.

[00:09:46] Galya Westler: But why should people use it? Why should people use it? Well, why do you need an engagement for the mundane task? People used it, right, in order to be able to automate things like the front desk in a clinic, in order to automate things such as a concierge at a resort, in order for you to even capture information.

[00:10:04] Galya Westler: So for example, you sent me a form, right? You wanted me to provide some information. Oh my God, who wants to fill out a form? Who has time to fill out a form? No one has time. So if you talk to the AI, and it looks maybe like myself or like you, you're able to capture all the information simply by having a conversation, which means that you can move people to action much faster.

[00:10:29] Galya Westler: And as a business, being fast when you provide this input and output is crucial for the business to do their job. It is crucial for the business to have this back and forth really, really fast and capture all the data. And it's something that I say all the time when I meet with clients. Right now, we are having a conversation online and it's being recorded. Easy, right?

[00:10:50] Galya Westler: Put Scribe, put whatever, put the Zoom, put the Riverside, whatever AI tools they have, that's great. But what happens when you meet people in person? Usually you don't record them, right? They're gonna think it's pretty odd, right? But that's because these are people, right? Unless you're having maybe a business meeting, and then

[00:11:06] Galya Westler: still, for you to put down a recorder, it's a little bit odd. It's a little bit weird, right? But what if you can have an AI that looks like you, or looks like one of your representatives? And when you speak to the AI, when you speak to our AIs, you can download our app and try it out, not only do they look real, they look exactly like the likeness of the person that gave us their likeness.

[00:11:28] Galya Westler: Not only do they look real, they feel real and they talk real. And it really feels like you're having a real conversation, which means that you are able to, if you want, record this information. The AI can understand how you look, how you feel, and record everything, and based on that, tune the way it responds. Or maybe later on, this can become the analytics that are very, very valuable for the organization that's using it.
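(A small sketch of the "capture a form through conversation" idea described above. The model call is stubbed out as a placeholder `call_llm` function, and the intake fields and prompt are illustrative, since the actual model and schema HumanBeam uses aren't public.)

```python
# Sketch of turning a free-form conversation into structured intake data.
# `call_llm` is a placeholder for whatever model the platform uses; the
# field list and prompt are illustrative, not HumanBeam's actual schema.
import json

INTAKE_FIELDS = ["full_name", "reason_for_visit", "symptoms", "preferred_time"]

def call_llm(prompt: str) -> str:
    """Placeholder: send the prompt to an LLM and return its text reply."""
    raise NotImplementedError("wire this to your model provider of choice")

def extract_intake(transcript: str) -> dict:
    prompt = (
        "Extract the following fields from this conversation as JSON "
        f"({', '.join(INTAKE_FIELDS)}). Use null for anything not mentioned.\n\n"
        f"Conversation:\n{transcript}"
    )
    raw = call_llm(prompt)
    data = json.loads(raw)
    # Keep only the expected fields so stray model output never reaches the record.
    return {field: data.get(field) for field in INTAKE_FIELDS}
```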

[00:11:54] AJ Nash: All right, so listen, this is fascinating. As I said, I mean, we had a prep call and talked about it and I was like, man, I gotta have Galya on the show. This is amazing stuff. And it's very interesting. You've said a lot, so I wanna unpack a few things. I wanna back up just for a sec.

[00:12:07] AJ Nash: 'Cause there were some pieces I actually had missed, that all along, you started with healthcare and it was anonymization, which is where the blockchain comes in, if I understand correctly. So it's an interesting migration from anonymization, you know, using blockchain so that people could get their results and figure out where they needed to go, if they could move up into the next stage of socialization, without having to give away too much information, which is a very interesting and very important security technology.

[00:12:31] AJ Nash: Right. And then from there you said, you know, okay, we also got into the healthcare space, and it was more the health concierge piece, which is a separate use case, sort of a related use case, but actually a bit of a separate use case. But that got you into, as you said, beaming. Right. So I wanna dig a little bit into that.

[00:12:46] AJ Nash: When we say beaming, what do we mean? Because in my head I go beam, I'm thinking like Star Trek, and that's obviously not what we're doing. So when you say, you know, I assume it's not, I shouldn't say obviously, but I assume you're not teleporting people. That'd be very cool. That's a whole nother show. Um,

[00:12:59] Galya Westler: We are, we are actually teleporting people, if you think about it, because there's two features. There's the live call. With the live call, you can be sitting in a studio and on the other end, or maybe the other way around, there is a hologram, right? Let's imagine you're invited to a talk show.

[00:13:13] Galya Westler: Let's imagine you're invited to some sort of an event and you can't make it, right? You can teleport yourself using the hologram. So of course you don't go boop, boop, boop, boop, boop, and

[00:13:20] AJ Nash: right, right. Your whole body actually isn't there, but, but a digital version of you.

[00:13:24] Galya Westler: a digital version of you that is in 3D is actually there, which means that you are kind of teleporting someone.

[00:13:30] Galya Westler: And because it's 4K streaming, there's no lag. There would be a lag at the beginning as it's buffering, but if you've got good internet, you can beam in the humans right away. And that's where the name HumanBeam comes from. But actually, because our tagline is "because the future is human," even when we add the AI to compensate for the human, the human compensates for the AI, because the AI can have certain variables that say, okay, you're done talking to me, or you wanna speak to a manager, or you have something that's unclear, I can beam in the person instead of me.

[00:14:02] Galya Westler: This is the AI speaking. And then you're gonna be beamed in so that your staff is able to give that support that is more like an immersive experience. And if the person is not available, and that's the problem, right? Shortage of staff. You wanna automate it with the AI, so that the AI, and, oh, actually the AI is beamed in as well, because the AI is not flat.

[00:14:20] Galya Westler: The AI is in 3D as well. We are making them in web format, in app format, in streaming format,

[00:14:28] AJ Nash: Mm-hmm.

[00:14:28] Galya Westler: because it's cross-device. It's not really... it's device agnostic.
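(Galya describes the AI watching for signals, "speak to a manager," something unclear, and then beaming in the live person. A toy sketch of that kind of hand-off logic might look like the following; the trigger phrases, confidence threshold, and responses are all hypothetical.)

```python
# Toy hand-off logic: the AI front end watches each user turn for signals that
# a live human should be beamed in. Trigger phrases and the hand-off behavior
# are illustrative only, not HumanBeam's implementation.
ESCALATION_TRIGGERS = (
    "speak to a manager",
    "talk to a real person",
    "this is unclear",
)

def needs_human(user_message: str, ai_confidence: float) -> bool:
    """Escalate on an explicit request or when the AI is unsure of its answer."""
    text = user_message.lower()
    return any(trigger in text for trigger in ESCALATION_TRIGGERS) or ai_confidence < 0.4

def handle_turn(user_message: str, ai_confidence: float) -> str:
    if needs_human(user_message, ai_confidence):
        # In Galya's description, this is where the live 4K stream of the
        # staff member would replace the avatar.
        return "One moment, I'm bringing in a member of our team for you."
    return "ai_continues"

print(handle_turn("I want to speak to a manager", ai_confidence=0.9))
```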

[00:14:33] AJ Nash: I mean, that's fascinating. So you have the ability to really expand your capacity, right? So, it's probably a terrible example. I'm not a doctor, but I'll go ahead and say I am for the sake of this discussion. So I'm a doctor and I've got six patients and I have to triage, right? And they're virtual.

[00:14:46] AJ Nash: Obviously these will be virtual meetings in this case. So I could send AI in to five patients and triage them while I'm working with the first patient on my list, right? And I could send either AIs that look and sound like me, which I assume I would tell 'em it's not really me, or I could send other versions, you know, AI nurses, AI whatever, right?

[00:15:04] AJ Nash: Intakes. So they can all be getting, you know, all the intake things taken care of in an interactive way, not just fill out a form and wait, you know, an interactive way. And then, I don't wanna speak for you, I'm gonna guess and then have you answer, I guess. But I assume we're able to develop these AI personalities, right?

[00:15:21] AJ Nash: So that they have a body of knowledge, however limited or

[00:15:25] Galya Westler: They have a body and they have knowledge.

[00:15:27] AJ Nash: Exactly right. But however limited or wide you want it to be. Right. So, I mean, if I'm this doctor in this scenario, I wouldn't want my intern or my intake person to suddenly start actually answering medical questions, probably, because that's not their job.

[00:15:40] AJ Nash: Right. So you put those guardrails on, but, but they could answer things that are beyond just whatever forms filled out. Right. You know, could,

[00:15:45] Galya Westler: That's right. Absolutely. And also symptoms, right? Another thing that they do is they capture the symptoms. They also capture some information with the intake form, and then there comes a point where, you know, PHI comes into play, if they don't wish to have PHI, or it's just up to the point where it's like a front desk person

[00:16:02] Galya Westler: that's able to answer different questions. But actually it could even amplify that a little bit more, because you can make the AI almost like have a certain authority in a certain area. I'll give you another example to try and complement this. We are also working with universities and also coaches.

[00:16:18] Galya Westler: So anybody that does, for example, leadership coaching. We've got one example of Dr. Joe that's in the app. He actually wrote a book about different leadership styles that he has. He also wrote a book about how you communicate, and he's a communication and a leadership coach.

[00:16:33] Galya Westler: So when you speak to his AI, he's gonna give you coaching advice, which allows him to have residual income, because instead of seeing the different students that he has, he's able to give them his AI, and then the AI is able to give the material to them in terms of leadership. With regards to his book, he is now creating a course.

[00:16:51] Galya Westler: So what's an AI-led course? AI-led courses are basically, instead of you watching a video of the instructor explaining different things, even if it's an interactive video, right? What's an interactive video? Some elements and some, whatever, videos and other pictures and whatever overlays that they can do. That is not interactive video.

[00:17:11] Galya Westler: The only way to do interactive video is if he himself, somebody hires him, you know, for some sort of an event, or maybe it's a Zoom, maybe it's a workshop. It still requires the person to be there, so it's one-on-one, right? What if you can amplify that person? You can create a replica of them as AI, and then the AI is able to teach the course in an interactive way.

[00:17:29] Galya Westler: So what he does now with the launch of his book, he actually trained it on the first chapter, and he actually used it in order to lure people into the course. And it's kind of like when you write a book and you let somebody read the first chapter, a

[00:17:43] AJ Nash: a

[00:17:43] Galya Westler: of it, and people are like, oh my God, what about the rest?

[00:17:45] Galya Westler: And then, and then it's not just you reading the book or listening to the course, you are interacting with the instructor about the course in the AI way,

[00:17:54] AJ Nash: hmm. Yeah. I mean, you're, you're creating scalability on the individual level, really. Like, I mean, obviously it becomes organizational if you do a lot of these, but individually, so for solopreneurs, you know, now you could, you can act as a, as a larger organization, you can get more accomplished, you can be more efficient, right?

[00:18:09] AJ Nash: So, some questions come to mind on this, and they do for all of this, but especially for this, I think. For some of these, you know, you talked about having to upload your image, for instance, or having to upload your voice. So I'm sure a lot of people are gonna ask questions about, you know, the security of that, right?

[00:18:24] AJ Nash: There's a lot of concerns. I have concerns. I see ads regularly now on LinkedIn, among other places, for voice actors, right? And if you look and you dig in, they have ads for all sorts of different skills. You know, if you're good at this, good at that, know this, know that, for a few hundred bucks an hour. And you realize when you look at the ad, it's to train AI, right?

[00:18:40] AJ Nash: And so the idea is great: for a couple hundred bucks an hour, I can do some voice acting. But now you have my voice, and now I don't have my voice anymore, and that's it, I've given away my voice forever. So, you know, how do you address that and make sure that, you know, hey, if I decide to work with HumanBeam and I set it up and I give my voice, and I give my face and my body and all these different pictures, so you can create this perfect doppelganger of me, how do I make sure I don't lose myself now?

[00:19:04] Galya Westler: So, that's a really good question, and that's actually a valid concern that we get from different customers. So the one thing that people have to know is that we do not do anything with their likeness or their voice without their consent. So legally, we have to cover ourselves and we have to get consent, number one.

[00:19:19] Galya Westler: Number two is we do not share the likeness, unless, of course, they're joining us as, like, our actors or representatives, and then they have a deal going on with HumanBeam, and of course there's licensing costs and so on and so forth. But actually, if we do this for your business, we train the AI on your particular voice, on your particular image, on your particular motion, and on your particular knowledge base.

[00:19:44] Galya Westler: We don't share, so we don't do a crossover. We don't do this, right? The second thing: it gets stored and everything is encrypted, always. Good, right? But you asked about usage, you asked about license, and you asked about the way it's actually being implemented. So it's only implemented for the organization.

[00:20:02] Galya Westler: So what we do, also from a technical perspective: it's not really multi-channel, but it's kind of like the way Slack works, right? You go into the HumanBeam platform, and we're adding this also to the app, because we want people to have access to the app. Or if it's a physical place, then they add it per hologram.

[00:20:20] Galya Westler: So it's a physical place, or they can add it to the website. We're now working on a container option. The container option is, if I'm using the right, uh, I don't wanna say the wrong term here, I forgot the name of the container. But anyways, we're building a container where it's essentially like a zip file that you're able to install as an app running on your Linux, on your Mac, and on your Windows.

[00:20:41] Galya Westler: And essentially that launches very similar to the way Slack works and Zoom works.

[00:20:46] AJ Nash: yeah. Okay.

[00:20:47] Galya Westler: So you can have web access, you can have app access, you can have hologram access. And the way we do this is we separate this based on organization. So every organization's got their own channel. So you can imagine that if you are, for example, an author or a university, you've got different characters.

[00:21:02] Galya Westler: The characters, the avatars, belong to your organization. And that's the way we work, because we do it as a B2B. When you think about the model of B2B, B2C is a little bit different. So with Dr. Joe, for example, because he's a coach, he's a very good example of how you do B2B2C. So he has a B2B deal with us, and then he sells it to the C.

[00:21:22] Galya Westler: So we're also wanting to create a marketplace, because a lot of these solopreneurs don't have the resources to build a marketplace website. That's where they use a Coursera, right? And we want to be able to provide them this marketplace, where either we can put it in a marketplace of a bunch of different coaches,

[00:21:43] Galya Westler: or it can be his own or her own marketplace. So there's different ways to do it. But the point is that it is a B2B solution, which means that if you gave us your likeness and your voice, then we're able to create avatars with your likeness. And it can be different characters, right? It can be you in different outfits.
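(A minimal sketch of the partitioning and consent rules Galya describes: each likeness is registered with explicit consent, stored encrypted, and only ever served back to the organization it belongs to. The Fernet symmetric encryption, field names, and storage model are illustrative assumptions, not HumanBeam's architecture.)

```python
# Sketch of per-organization partitioning of likeness data: consent is recorded,
# the voice/image assets are stored encrypted, and lookups are scoped to one
# organization so a tenant can never pull another tenant's avatar.
from dataclasses import dataclass, field
from cryptography.fernet import Fernet

@dataclass
class AvatarRecord:
    person: str
    organization: str
    consent_given: bool
    encrypted_likeness: bytes

@dataclass
class AvatarStore:
    key: bytes = field(default_factory=Fernet.generate_key)
    records: list = field(default_factory=list)

    def register(self, person: str, organization: str, consent_given: bool, likeness: bytes):
        if not consent_given:
            raise PermissionError("no consent, no avatar")
        token = Fernet(self.key).encrypt(likeness)
        self.records.append(AvatarRecord(person, organization, consent_given, token))

    def fetch(self, person: str, organization: str) -> bytes:
        # Scoped to the requesting organization: no cross-tenant sharing.
        for record in self.records:
            if record.person == person and record.organization == organization:
                return Fernet(self.key).decrypt(record.encrypted_likeness)
        raise LookupError("no avatar registered for this organization")

store = AvatarStore()
store.register("Dr. Joe", "coaching-academy", consent_given=True, likeness=b"voice+image bundle")
print(store.fetch("Dr. Joe", "coaching-academy"))   # works
# store.fetch("Dr. Joe", "another-org")             # would raise LookupError
```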

[00:22:01] Galya Westler: And another thing that I would say: I would not recommend ever using the exact voice. And that's, for example, with my AI, right? I lend my AI to different conferences, because they ask me to do, uh, presentations and I don't always have time. And I'm lazy. And because I'm lazy, I'm laughing, right?

[00:22:20] Galya Westler: Because I'm the, I'm the least lazy person in

[00:22:22] AJ Nash: Yeah. Yeah. I don't think the person who's simultaneously CEO of three companies is lazy, if I had to guess, but all

[00:22:29] Galya Westler: But, but do you know, I do have a bad hair day, or like, there's no filters on this Riverside. What the hell? All of my, like...

[00:22:35] AJ Nash: Uh,

[00:22:36] Galya Westler: Thank God I combed my hair. But, not a lot, just a little bit of roots, right? I am an aging person. But you know, as, as an aging

[00:22:44] AJ Nash: but your AI probably is not an aging person.

[00:22:46] Galya Westler: my AI is an Amazonian woman.

[00:22:49] Galya Westler: You should see her on the hologram. My goodness. People came to our booth, and they're like, who is this? It's me. They're like, oh, okay. Oh, okay, okay, okay. And I was like, okay, she's a little bit taller than me, but come on, she could be me. We made her two meters tall.

[00:23:08] Galya Westler: And I'm, like, a meter and a half, basically. So the point is that I never gave my real voice. And even if you give a real voice, we have to distort it or make it slightly different, because, you know, people use their voice for banking. People use their voice for... and we can talk about the whole

[00:23:27] Galya Westler: abduction of voice and abduction of likeness, which is also very, very dangerous, right? And that's why we always do security first. Because we come from the security world, we always do security first, which means that if people wanna use their own voice, then we use a slight variation of that, and that's what's being used particularly for their organization.
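(One simple way to make a cloned voice "slightly different," as Galya describes, is a small pitch shift, so the synthetic voice no longer matches a voiceprint enrolled on the real one, for example with a bank. This sketch uses the librosa and soundfile packages; it is an illustration of the idea, not HumanBeam's actual processing, and the file names and shift amount are arbitrary.)

```python
# Apply a small pitch shift to an avatar voice sample so it no longer matches
# the speaker's real voiceprint (e.g. one enrolled with a bank's voice login).
# Illustrative only; paths and the shift amount are arbitrary assumptions.
import librosa
import soundfile as sf

def slightly_distort(in_path: str, out_path: str, semitones: float = 0.7) -> None:
    audio, sample_rate = librosa.load(in_path, sr=None)  # keep the native sample rate
    shifted = librosa.effects.pitch_shift(audio, sr=sample_rate, n_steps=semitones)
    sf.write(out_path, shifted, sample_rate)

# slightly_distort("real_voice_sample.wav", "avatar_voice_sample.wav")
```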

[00:23:48] AJ Nash: Yeah. That's good to know, 'cause that's a big concern for everybody, I'm sure. I mean, I'm not gonna lie, obviously I'm thinking of myself 'cause I'm selfish. You know, I'm sitting here right now, I'm mic'd and on camera, that's a big chunk of my job, and I'm thinking, well, if somebody can copy me, then I don't exist anymore.

[00:24:00] AJ Nash: And the very limited value I provide to the world suddenly

[00:24:03] Galya Westler: Wait, well I'll tell, I'll tell you something. Anybody can copy you. Anybody can copy me, right? Because it's online.

[00:24:09] AJ Nash: Yeah. It already

[00:24:09] Galya Westler: You just synthesize the voice and you've got the voice, and then there's tools out there where you can just replicate it. But then you go into the legal problem that if you didn't give consent to that platform, then of course you can sue them, right?

[00:24:23] Galya Westler: So.

[00:24:24] AJ Nash: Sure. Yeah. Yeah. I'm thinking more along the lines of criminals outright, and what they can do with it as opposed to, to somebody just violating

[00:24:29] Galya Westler: But, but also in the film industry, right? The whole thing with OpenAI and Scarlett Johansson, right?

[00:24:36] AJ Nash: Yep. Exactly. Yeah. It's a big concern. So one of the other big concerns I have, again, and I, we talked about this a little bit, you know, when we prepped for all this, when we talk about the future is human, right.

[00:24:45] AJ Nash: So I wanna come back to what I will say is probably my largest concern across the board with AI. And I suspect it's for a lot of people jobs, right? You know, there's all this talk about, you know, AI is gonna take jobs, it's gonna replace people, it's gonna get rid of people, et cetera. And I've, I've seen experts on both sides of this discussion.

[00:25:01] AJ Nash: I've seen experts that say, no, no, no, it's definitely not gonna take jobs. It's the human and the AI together. It's gonna be superhuman. That's the solution. And others have said, oh yeah, it's absolutely gonna take jobs. There's plenty of experts to say millions of jobs will be lost. And I, I tend to fall in the middle because I think it can be either right.

[00:25:14] AJ Nash: And so I'm curious about your thoughts on the corporate prioritization of volume versus efficiency. And I'll give you an example. Say you have a small company with 10 people, right? I know that company could say, hey, let's bring in AI, let's, you know, multiply ourselves, and we could be a 10-person company that now does a hundred people's worth of work, and we could really punch above our weight and work against the big boys.

[00:25:36] AJ Nash: And that's great. The alternate version is to say, I have a 10 person company. Lemme bring AI in. I like the volume of work we have now. I just wanna do it with one person instead of 10. So instead of trying to raise my productivity and to increase my revenue, I'm just gonna lower my costs, and make my margins huge.

[00:25:52] AJ Nash: And I'll have the same revenue, but a lot more margin. And then of course you can apply that at a very large scale to companies like Amazon, et cetera. What are your thoughts on where this is and, and which makes more sense and what are we really gonna see in this area?

[00:26:04] Galya Westler: Okay, so I would like to approach it from the different companies. Okay. I don't wanna talk about the conglomerates, because the conglomerates are making their buck on top of our backs regardless of what you do, whether it's AI, whether it's robots, whatever it is, right? They are quite shady, right?

[00:26:19] Galya Westler: These are shady companies, despite the fact that they're trading in the stock market. Totally public companies, they're extremely shady. They're taking advantage. This is me putting on the blockchain hat, telling you that the entire democratization, it's not something that they do, it's not something that they care about.

[00:26:33] AJ Nash: Mm-hmm. Mm-hmm.

[00:26:34] Galya Westler: We care about the solopreneurs, we care about the small businesses. We care even about medium-sized businesses. And if this is an automation that's going to help the business owner still have a roof over his head and his employees' heads, then this is the way to go, right? And you can refer to this in customer support and everything that they're doing right now with conversational AI, with audio, that could be beneficial when you're doing different phone calls.

[00:26:58] Galya Westler: And for that, they're mimicking the whole customer service approach quite well. But there are situations where conversational AI with a body makes more sense. And this is where you talk about the amplification, which means that if we have a shortage of staff, and, you know, shortage of staff, it's not always the business's fault.

[00:27:16] Galya Westler: The shortage of staff actually started in COVID. It accelerated in COVID. Why? Because the markets and the economy shut down, right? People stayed at home, people went into mini jobs or half-time jobs, and so they had no choice but to go on EI. And when they were on the unemployment insurance, they realized that they'd better stay on unemployment insurance than get the job, because they're actually gonna earn the same thing.

[00:27:40] Galya Westler: They're gonna have more free time. So when you think about the future, and there's lots of people that wrote articles about this, right? In a utopian world, a few really well-known people talked about this, in a utopian world, you are able to replace the humans for the mundane tasks, which, if you think about it, makes sense.

[00:27:59] Galya Westler: Nobody wants to do a mundane task. Like when we talk about this in hospitals and clinics, we talk about the sad nurse, and the sad nurse, male or female, they're so overworked and they don't wanna do this job, right? So they don't wanna do this job. They don't wanna show up.

[00:28:13] Galya Westler: They do a lousy job, where actually they need to go back to what they do very well, which is care. And how do you give care? Only with a human. Which means that the way we see it is the AI is the front-facing part. I'll give you another example in different areas, where we've got the problems with the homeless population.

[00:28:32] Galya Westler: A lot of the time... we actually have a homeless guy, we call him Michael, in our app. And we use him for simulation. He has a very, very unique personality, but his personality was actually trained on real personalities of folks that are coming to the ER. They're flooding the ER, and a lot of the time it's very dangerous

[00:28:51] Galya Westler: for the real person, in real life. And that's why you can see, ever since COVID, right, a lot of security stuff that's there to make sure that they don't become violent, right? And why do they become violent? Speak to Michael, our AI. He'll show you, if you're not able to get him to a doctor.

[00:29:08] Galya Westler: So that's where the simulation comes into play. We do it for, like, universities. Michael, the homeless guy in the ER, he has a respiratory disease. He can't breathe. And if you tell him, you're not gonna see a doctor, he's gonna become really

[00:29:21] AJ Nash: freak out. Yeah.

[00:29:22] Galya Westler: he literally freaks out and he literally, he will not get off your back until you show him a doctor.

[00:29:27] Galya Westler: And if you think about this, right, we do this for simulation. This is the reality. So in some cases, you want to protect people from having too much stress in their job, doing a job that they don't need to do, when they really should focus on their core skills.

[00:29:46] Galya Westler: And now I'm gonna come at it from the perspective of the person, right? Every single person that would like to be part of the workforce. The first thing that we say, and I've been saying this for years, right? As a person who studied software engineering, I started my job as a project manager, a product manager.

[00:30:02] Galya Westler: I was even QA. I was a security guard when I was working to pay for my tuition when I was in university. Like, I've done it all, I was a tutor as well, right? And I am always saying to people that they need to get the education to become experts, right? 10,000 hours. How do you become an expert? The AI is not going to replace you as an expert, because guess what?

[00:30:21] Galya Westler: When you use Lovable, which is a tool, or Claude, which is a tool for you to code, if you look at the way it goes with somebody that's not an expert that's trying to code with this, it's a disaster. And never mind coding, right? Try to use the no-code tools, try to use other no-code AI tools. Hey, try to use tools that will create an image or an avatar or a video for you.

[00:30:44] Galya Westler: You tell it to do this, and then it does that. This is why, you know, when we do the whole automation of taking a picture of someone, converting it into motion, converting it into a video, then training it into becoming the AI, it looks so real. If you look at the models, the way they look in real life, and you look at them as our AI models, it's almost exactly the same.

[00:31:05] Galya Westler: Of course, I didn't make mine the same because, you know, why not make me an Amazonian

[00:31:10] AJ Nash: sure. Right? Not,

[00:31:11] Galya Westler: Um, but it is. And why, how come the quality is so high? Because we do what I call handcrafted AI.

[00:31:19] AJ Nash: Mm-hmm.

[00:31:20] Galya Westler: Handcrafted AI. It means that we pay attention to every single pixel, every single jump, right? Every single...

[00:31:27] Galya Westler: So we have to do upscaling, and many tools out there tell you that they do upscaling, but they don't, right? And if you are a business and you care about quality, the way they look, the way they act, the way the platform is, it has to be extremely high. And think about it, right? We aim for holograms, 4K screens in 3D, which need to be very, very high quality.

[00:31:47] Galya Westler: That's why, when you translate it into the app, translate it into the web, into the container that we're building, it looks so good. And because it looks so good, people are so excited. And in fact, people think that it's real, and they have this back-and-forth conversation as if it's a real person.

[00:32:03] Galya Westler: So I don't know if I answered your question. I think that the revolution is coming, not because it's taking people's jobs, but because it makes people ask themselves: is this the job I wanna do? Can I become an expert? And actually, for people who are themselves experts, they should use AI to become, you know, superhumans.

[00:32:28] Galya Westler: That's essentially what it is.

[00:32:30] AJ Nash: And you make a good point about AI. I'm not an expert on AI, I don't claim to be. I've got hundreds of hours of, you know, prompt engineering at this point, accidentally, sort of. I just started playing with it and I've learned a fair amount at least. And one of the things I did learn is, yeah, AI needs a lot of help.

[00:32:45] AJ Nash: So it is very cool and it can do a lot of things. And the current versions are,

[00:32:48] Galya Westler: Infant. It's still very much in its infant stages.

[00:32:50] AJ Nash: It is, yeah. The current versions are much better than the originals, but AI needs a fair amount of help still. You know, you mentioned you gotta be an expert to really get expert results, and as an intel guy I can say that I can talk to people, and I've seen people say, I'll even have AI do my intel reports.

[00:33:02] AJ Nash: And then they come out with this complete garbage, uh, that shouldn't be trusted, shouldn't be sent to anybody, should just never be used. If you know how to build an intel report and you know how to work with the AI, you can get a lot further down the road. I still would never have anybody plug things into AI and say, that's your intel report.

[00:33:17] AJ Nash: But you can get a lot closer, 'cause you can teach it a lot of things, and you can't teach the AI until you know them yourself. So there is something to be said for the AI, like every computer: it's only as smart as the inputs it's given, right? And so there's something there, which I do find interesting.

[00:33:31] AJ Nash: I think as far as the corporate journey, I think we're gonna still run into, you know, as you said, the conglomerates are gonna do what they're gonna do. I think it's interesting that this does give the smaller companies, you know, people in smaller companies, a fighting chance to punch above their weight, to do more with less.

[00:33:46] AJ Nash: I think it's great to be able to get rid of the mundane, and, you know, the medical example you gave with nursing is fantastic. Yeah, let's get rid of the stuff you hate doing so you can really focus on patient care, 'cause AI's not taking that job. Nobody's gonna get their medical advice from a robot and be comfortable about it anytime in the near future.

[00:34:00] AJ Nash: Or their nursing care. Like, AI isn't changing IVs. Hell, doctors don't even do that. So it's nurses all the way. So there's something to be said for that, I think. And the teamwork, you know, that you talked about, about how these things can work together, is interesting.

[00:34:13] AJ Nash: I think there's some very, very cool use cases. That training use case you mentioned, I think is fantastic, actually. The opportunity to simulate, like, as you mentioned, this AI of a homeless person coming into the ER. I mean, I could see a whole lot of opportunities to train medical personnel, emergency services on traumatic scenarios.

[00:34:31] AJ Nash: Yeah. And that's fantastic without having to have the traumatic scenario, not having

[00:34:34] Galya Westler: Well, I

[00:34:34] AJ Nash: know, be in a

[00:34:34] Galya Westler: Exactly. So, the use case is that this is where they save money, because they hire actors and actresses and they have to pay for their time, to be able to, as you know, they have to practice. And then they don't always get into the character correctly, because they don't come from the medical space.

[00:34:50] Galya Westler: And so they end up having another person there that is kind of like the director. So actually they might pay them 30, 40 bucks an hour, but with all of the stuff around them, it's actually $200 an hour. And why would you pay $200 an hour when you can have, you know, a simulation AI that looks and feels real?

[00:35:06] Galya Westler: And that's how he reacts. And not only that, the AI has perception. The AI can examine, the AI can take notes. The AI is able to say, oh, you know what, you, you should really work on this angle and you know, you should be a bit more compassionate because it's a person without a home. You know what I mean? So it's, there's so many possibilities.

[00:35:23] Galya Westler: Another thing I wanted to touch upon, since you said the future is human: we saw a big dissonance. We just came back from CES, the big CES 2026 that happened in Las Vegas. It was insane. We got invited there and had our booth, and also we did different tricks; I was just talking about this on another podcast.

[00:35:38] Galya Westler: We did different tricks. I can send you some videos if you somehow can add them into the

[00:35:43] AJ Nash: Sure, sure.

[00:35:44] Galya Westler: It was amazing. So we had, like, the large 86-inch, uh, beam box. We called it a hologram display. And then we had a 21-inch one. And then we also had an app, because the internet there is terrible. You pay extra for the internet.

[00:35:55] Galya Westler: That's if it works. Ah, who does a live demo at a big conference?

[00:35:58] AJ Nash: Oh, it's never easy. No. Oh, you decided to though?

[00:36:02] Galya Westler: Nobody except us. Like, totally. Oh my God, it was crazy. Like, we had huge, huge, huge companies come there, right? Like, you really had to... and then media comes, and I have a video there where the guy, Jesus Christ, was, like, filming my nostrils, because I didn't realize, I was so busy, and he had, like, a camera here and I'm trying to talk to him.

[00:36:20] Galya Westler: So you see me in the video like... so embarrassing. So I'm like, okay, fine. Take one for the team. That's fine. You don't have to look anyway, so people get to know my nostrils. Yeah, thank God it was clean at the time. So, but the dissonance was that we were in the robotics and enterprise AI area at the big convention center, and behind us was the area that was very popular.

[00:36:49] Galya Westler: If you type CES 2026, all the videos there are about the robots,

[00:36:53] AJ Nash: Oh yeah.

[00:36:54] Galya Westler: robots, like the dog robots, oh my God. People playing, like, ping pong with them. And it was all about the robots, right? Robots. And we were all about the human. And then we made AI look and act like a human, because the future is human.

[00:37:08] Galya Westler: And when people came to our booth, they were like, oh. This is amazing, and we did a little trick. We wanted to get people in, so we did like a massive, like SEO campaign. We

[00:37:18] AJ Nash: Sure. Sure.

[00:37:19] Galya Westler: in advance, right? We wanted to get a lot of attention. We got attention before that. But then when people started to come, all the passersby, because it's like almost 200,000 people,

[00:37:28] AJ Nash: Hmm.

[00:37:28] Galya Westler: people came because we did two things.

[00:37:31] Galya Westler: We rented a studio and we wanted to shoot the models. I was also one of the models, you know, you have to lead by example, and I did some spray tanning. Oh my god, I really looked like an African woman. It was really hilarious. Hilarious. Like, when you look at the videos, you're like, why is she so dark?

[00:37:50] Galya Westler: Anyways. Scary, right? Scary moments online. Okay. So we hired a studio, because actually, when you wanna make it really fast, you take a full-body picture with a white background to create the shadow effect, and that's how you are able to render it into the 3D effect. That's the fastest way, because we didn't have a lot of time to prepare.

[00:38:10] Galya Westler: CES happened at the beginning of January. We had to do everything during Christmas. Insanity. Who works during Christmas? Nobody works during Christmas except for us. And the Apple and Google stores, trying to also push the app through, it was insane. They rejected it like 5, 6, 7 times. Last minute, they approved it, and we needed the app as a backup to show how it works.

[00:38:29] Galya Westler: Anyway, so we hired a studio, we had the models come in, and we did different dancing videos. So, real people dancing in and out. I'll share with you the video; you see the 3D effect phenomenally. And then we made from these models, including from myself, the AI version. So when you come to the booth... and also we put on popular music.

[00:38:51] Galya Westler: Right, remixes. Not original music, remix music.

[00:38:56] AJ Nash: Yeah. You gotta avoid the copyright issues.

[00:38:58] Galya Westler: Yeah. No, no need, no need to sue. We're good people. So then they came into the booth and they saw like the AI dancing like this, right? And then pop up the music, blah, blah, blah, blah. And I'm, I'm just there with my team, like dancing, right?

[00:39:12] AJ Nash: Oh

[00:39:12] Galya Westler: The things you do for your business.

[00:39:12] Galya Westler: People loved it. And then they came in, they tapped on it, boop, boom, boom, the AI starts to talk. They click on live demo, when the internet gods were helping us, and all of a sudden the AI spoke back and forth in 28 languages.

[00:39:27] AJ Nash: Wow.

[00:39:28] Galya Westler: You should have seen the Koreans and the Japanese. They were like,

[00:39:31] AJ Nash: Oh yeah,

[00:39:31] Galya Westler: the AI is just like

[00:39:34] AJ Nash: Huh. That is, I mean, that's very cool. It's very interactive, right? And that's just what people want. Like I said, with all the robots, I'm sure you did stand out, 'cause, you know, robotics is very, very cool and it's doing a lot of things. And I have great fear for the future when you plug AI into robots and make 'em look like people.

[00:39:48] AJ Nash: And I don't know where we're gonna be in, you know,

[00:39:49] Galya Westler: They don't look like people, but

[00:39:51] AJ Nash: they don't yet.

[00:39:51] Galya Westler: of them, some of them try to make them look like people, and it

[00:39:54] AJ Nash: Yeah, it's creepier. It's even creepier when they do that, but they'll figure it out. But I'm curious, so, the underpinnings of this, right? You mentioned way back a while ago, when we started, you were talking about how you started with ChatGPT, but you know, ChatGPT needs help, right?

[00:40:06] AJ Nash: And it's not to pick on OpenAI. You pick any of the platforms, they all need some help. They all waver on who's where. I've gone through several; ChatGPT is not my number one choice today, but, you know, things can change. But where do you see, and this is a frustration I have about all the AI platforms essentially,

[00:40:22] AJ Nash: and how they've been sent out into the world, the responsible AI journey, right? So, AI is massively interesting and potentially very, very powerful. It's also been released to the world with no training, no guidance. And frankly, a lot of it came with no security.

[00:40:40] AJ Nash: Uh, people were happily pouring stuff into these AI platforms without really considering: where's my content going when I've asked this question? Where's my personal information going? Because it's just so interactive, it's so intuitive, that people lose track of the fact it's not a person, it's a machine.

[00:40:53] AJ Nash: And it's collecting whatever you give it. So, you know, what are your thoughts on this, what I would say so far has been kind of an irresponsible AI journey, where this massively powerful technology's been put out? And thankfully, most people are still kind of in this infancy where they really just treat AI the same way they treated Google.

[00:41:08] AJ Nash: Like, they just ask, you know, it's a smarter search engine to a lot of people. Most people I think,

[00:41:12] Galya Westler: Well, you know, you know what they say, right? If you're not paying for it, then you are the product.

[00:41:16] AJ Nash: you're the product. Exactly. And, and a lot of

[00:41:18] Galya Westler: might, you might wanna pay for things. Yeah.

[00:41:20] AJ Nash: It's, it's very true. Whenever you're not paying, you're the product, right? I mean, social media

[00:41:22] Galya Westler: But even, even when you do pay, you're still the product. Right?

[00:41:25] AJ Nash: That's, yeah, that's the next phase: they've been smart enough now to charge us and still make us the product,

[00:41:30] Galya Westler: Well, I mean, you do go into the whole model of advertisement, which I can talk about as well. And usually you don't pay if there's advertisement involved, unless of course they double-dip on you, which means that they serve you with the ads

[00:41:45] AJ Nash: Mm-hmm.

[00:41:46] Galya Westler: and you end up getting, they get monetization from the ads and they grab your data.

[00:41:51] AJ Nash: Sure. Exactly. Well, then you've got subscription services, which become the next thing: you pay to make the ads go away. And so there's the revenue anyway, right? And they're selling your data, and they've got the revenue coming from other places. But this technology has been, you know, let loose, and we've seen some negative results.

[00:42:03] AJ Nash: We've seen, and I'm starting with the easy things of, hey, you know, people are writing research papers that aren't real. And, you know, lawyers have submitted documents to courts of law that turned out to be based on totally fake references. You know, they've been chastised by judges.

[00:42:15] AJ Nash: But then all the way to the deeper aspects, like AI psychosis and some of the mental health issues we've seen, it's really, you know, put us in this position. So, how do you see that journey as a whole? How does it affect what your business is? 'Cause there's obviously underpinnings to your business in this.

[00:42:29] AJ Nash: How are you being a responsible producer of AI-based technologies, to ensure that people don't end up in a position where I'm talking to this avatar that looks

[00:42:40] Galya Westler: So I'll explain the entire architecture. There's two ways: I'm gonna answer as HumanBeam Technologies, and I'm gonna answer as Plazus, right? So when we do this at HumanBeam, like I said, we do partitioning of the avatars and the information and the likeness and the guardrails and the way it sounds and the way the personality is, based on the organization.

[00:43:02] Galya Westler: We also have a way to partition the PHI. When you talk about PHI, which is, you know, the highest level of personal information, we are also able to containerize it and then have them run it on their servers, which means that it's an enterprise-first, security-first solution. Which means that if you are conversing with an agentic AI that has been approved to be implemented with the organization's knowledge base, whether it's a form of conversational AI or something else, then you know that it is safe, because you signed a contract with that clinic, with that hospital, even with that resort, with that university, right, with that coach, right?

[00:43:38] Galya Westler: You usually have a contract with them. And you are interacting B2C; we do B2B. And the way you do B2B, you absolutely have to do it securely, right? We welcome people to do penetration testing to evaluate the architecture. This is how we deal with businesses.

[00:43:54] Galya Westler: And also the integrity of the information needs to be handpicked by the organization, which means they are going to be responsible for whatever the organization means, which means that we don't do a general knowledge base unless they ask us to provide one. But even then, that comes with guardrails.
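
To make the partitioning idea concrete, here is a minimal, hypothetical sketch of what a per-organization configuration could look like. None of these class or field names come from HumanBeam's actual product; they simply illustrate keeping likeness, knowledge base, and guardrails scoped to a single tenant, with no general knowledge unless the organization opts in.

```python
from dataclasses import dataclass, field

@dataclass
class OrgAvatarConfig:
    """Hypothetical per-tenant config: everything stays in the organization's own partition."""
    org_id: str
    likeness_uri: str                  # consented likeness asset, stored in the org's partition
    knowledge_base_uri: str            # org-curated documents only
    allow_general_knowledge: bool = False
    guardrails: list = field(default_factory=lambda: [
        "Answer only from the organization's knowledge base.",
        "Refuse requests outside the approved scope.",
        "Disclose that responses are AI-generated.",
    ])

def build_system_prompt(cfg: OrgAvatarConfig) -> str:
    """Turn the tenant config into a system prompt for whatever model backs the avatar."""
    scope = ("You may also use general knowledge." if cfg.allow_general_knowledge
             else "Use ONLY the organization's knowledge base.")
    rules = "\n".join(f"- {r}" for r in cfg.guardrails)
    return f"You are the assistant for organization {cfg.org_id}.\n{scope}\nRules:\n{rules}"

if __name__ == "__main__":
    cfg = OrgAvatarConfig(
        org_id="example-clinic",
        likeness_uri="s3://example-clinic-partition/avatar",
        knowledge_base_uri="s3://example-clinic-partition/kb",
    )
    print(build_system_prompt(cfg))
```

The point of the sketch is simply that likeness, PHI, and knowledge never share storage across tenants, and the guardrails travel with the configuration rather than living only in the model.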

[00:44:12] Galya Westler: For example, Vegas AI, the Vegas coach AI, she was Demi, a real person, a model. And when you look at Demi, she's the one with the orange suit. She was trained to tell you all the different events happening in Vegas, right? And this is something that we wanted to do with the CES AI that we also had there.

[00:44:28] Galya Westler: And Danny, she was trained on, like, 3,000 exhibitors and different sessions, and she was trained to help people with the CES fatigue, and she was able, with the conversation, to tell you what, so everything is trained based on the organization, right? When we do a different implementation of AI as Plazus, we actually take the

[00:44:46] Galya Westler: OpenAI API, which is also known as Azure AI if you are an Azure organization, as an example. And this has got nothing to do with HumanBeam. We're gonna connect this to the document base, which usually is SharePoint. When you deal with Azure, we connect Azure AI to their SharePoint, and we're able to gather the information for just a regular chatbot experience that is plugged into their environment, in their enterprise.
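
As a rough illustration of the pattern described here, an enterprise chatbot grounded in the organization's own documents, the sketch below calls the Azure OpenAI chat API with retrieved text placed into the system prompt. The endpoint, deployment name, and the fetch_documents stub are all placeholders; a real integration would pull SharePoint content through Microsoft Graph or a search index rather than a hard-coded list.

```python
import os
from openai import AzureOpenAI  # pip install openai

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
)

def fetch_documents(query: str) -> list:
    """Placeholder: a real system would query a SharePoint index here, filtered per user."""
    return [
        "Vacation policy: employees accrue 1.5 days of leave per month.",
        "IT policy: report suspected phishing to the security team immediately.",
    ]

def ask(question: str) -> str:
    context = "\n\n".join(fetch_documents(question))
    resp = client.chat.completions.create(
        model="gpt-4o",  # the name of your Azure deployment; placeholder
        messages=[
            {"role": "system",
             "content": "Answer only from the company documents below. "
                        "If the answer is not there, say you don't know.\n\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    print(ask("How many vacation days do I accrue each month?"))
```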

[00:45:11] Galya Westler: And in addition to that, we add the security layers, which means that even if you had a private AI, a private, uh, ChatGPT or a private Gemini in your organization that is fed from that particular knowledge base, also Salesforce and other programs that you have, so software tools that you have.

[00:45:32] Galya Westler: You are able to add security layers to it, because whatever the HR person is gonna use it for, she or he should not access anything that has to do with finance. And finance people should not deal with HR, and developers shouldn't deal with high-level corporate stuff, right?

[00:45:46] AJ Nash: Mm-hmm.

[00:45:47] Galya Westler: And when you do this, the only way to do this is if you hire a company that is able to implement it from the API and security perspective, and then you run the penetration testing to make sure that you haven't been poisoned through documentation.
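
The role-based layer described here can be as simple as filtering which documents are ever eligible for retrieval, so off-limits content never reaches the prompt at all. A minimal sketch follows, with a hard-coded access map standing in for whatever the identity provider would supply; this is illustrative, not HumanBeam's or Plazus's actual implementation.

```python
# Hypothetical role-to-department access map; a real deployment would derive this
# from directory groups in the identity provider, not a literal dict.
ACCESS = {
    "hr":        {"hr"},
    "finance":   {"finance"},
    "developer": {"engineering"},
    "executive": {"hr", "finance", "engineering"},
}

DOCUMENTS = [
    {"dept": "finance",     "text": "Q3 budget forecast ..."},
    {"dept": "hr",          "text": "Compensation bands ..."},
    {"dept": "engineering", "text": "Deployment runbook ..."},
]

def retrievable_docs(role: str) -> list:
    """Only documents the caller's role may see are candidates for retrieval."""
    allowed = ACCESS.get(role, set())
    return [d for d in DOCUMENTS if d["dept"] in allowed]

# An HR user asking a finance question simply gets no finance context,
# so the model cannot leak it, no matter how the question is phrased.
print([d["dept"] for d in retrievable_docs("hr")])       # ['hr']
print([d["dept"] for d in retrievable_docs("finance")])  # ['finance']
```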

[00:46:00] Galya Westler: Now, this is what you do for businesses, right? What do we do as individuals, right? We have to trust the companies, with ChatGPT, with Gemini, right? You should not trust them, which means that you should never, ever upload personal information there. And when you ask the question, ask it in a way, always think about it in your head:

[00:46:19] Galya Westler: they're using my data, they're gonna use it. And you know what else? I should be really careful about the different documentation that I've been injected with. Right, the different things that you can inject

[00:46:29] AJ Nash: Sure. Absolutely.

[00:46:30] Galya Westler: that you can inject hidden behind documents. And so people have to be very careful, because when they use, for example, the research mode, because it brings in a lot of different things, it could actually contain false information.
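
One way to act on the "never upload personal information" advice is to scrub obvious identifiers before a prompt ever leaves your machine. This is only an illustrative sketch: a few regular expressions catch emails, phone numbers, and US-style SSNs, but they will not catch everything, so the habit of pausing before pasting matters more than the code.

```python
import re

# Illustrative patterns only; real PII detection is much harder than a few regexes.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b(?:\+?\d{1,3}[ -.]?)?\(?\d{3}\)?[ -.]?\d{3}[ -.]?\d{4}\b"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scrub(prompt: str) -> str:
    """Replace obvious identifiers with placeholders before sending text to a hosted model."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label.upper()} REDACTED]", prompt)
    return prompt

if __name__ == "__main__":
    raw = "Summarize this: reach me at jane.doe@example.com or 555-867-5309, SSN 123-45-6789."
    print(scrub(raw))
    # Summarize this: reach me at [EMAIL REDACTED] or [PHONE REDACTED], SSN [SSN REDACTED].
```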

[00:46:42] Galya Westler: Another thing that I would tell you: do you know what GEO is? I think it's called GEO. G-E-O.

[00:46:46] AJ Nash: mm-hmm.

[00:46:47] Galya Westler: GEO is the SEO of ChatGPT.

[00:46:50] AJ Nash: Oh, yeah. Okay. Yeah, I know what you're talking about now.

[00:46:51] Galya Westler: I didn't know that, you know, we had such good results. I only heard about the good results because companies are coming to us totally online. Turns out we're number one in the search results.

[00:47:02] Galya Westler: When you're searching for, I think it's like 3D or conversational AI, visual conversation, oh, conversational AI with a body, which is kinda like a term that we've introduced. Turns out that when you ask ChatGPT and Gemini, and by the way, they give you different results, we're number one in the search results along with the other

[00:47:18] Galya Westler: competitors, right? And that has to do with the SEO work that you do. But if you think about poisoning, if you think about manipulation of data, which is happening now when the elections are coming, right? We're not gonna talk about the different countries that we're from, right? I do not support the way they run their administration; it's absolutely ridiculous.

[00:47:35] Galya Westler: And you know what they do? They use tools to be able to poison all of this. Poison is not the right word. The right word is manipulation. And actually it's false information they feed. So up until now, they used to feed social media with false information.

[00:47:52] AJ Nash: Sure. Mis-, dis-, and malinformation is a huge problem.

[00:47:55] Galya Westler: Well, the AI is now learning from social media, from Google, from all the different search results. And if you have, like, a website that is trending and another website is linking to it, this is how GEO works. So all of a sudden you can't even trust what ChatGPT says, which means that OpenAI and the other companies now need to be responsible to make sure that they give us an authentication of approval.

[00:48:16] Galya Westler: Right. Do you remember HTTPS,

[00:48:18] AJ Nash: Sure.

[00:48:18] Galya Westler: do you remember how it started? Right. This is exactly the same point in time. It is now 1990, right? It's the beginning. It's the beginning of the internet era, right? And the fact that we are creating, like, private AI implementations, it's us telling people: you can have your own website, you can have your own AI presence in this amazing internet world, right?

[00:48:41] AJ Nash: Mm-hmm.

[00:48:42] Galya Westler: And this is the beginning of the internet world. And some people, they don't know how to build websites. That's why another joke is you can see when someone crafted something with a really, really lame AI. Even, like, the posts on LinkedIn, right? Even when you see, like, the different icons, oh,

[00:48:56] AJ Nash: Oh, yeah. You know, right away. Yeah,

[00:48:58] Galya Westler: what is

[00:48:58] AJ Nash: can spot, you can spot the AI posts

[00:49:00] Galya Westler: Exactly.

[00:49:01] AJ Nash: very, very

[00:49:01] Galya Westler: But you should say "created by AI," just like, this is not me, it's created by AI, right? You have to disclose that. And the companies also, the ones that are providing the AI tools, they need to always, and actually they're starting to do this, right? So when you create pictures, for example, with Gemini, it has the watermark.

[00:49:17] AJ Nash: Yeah. Which you can of course edit out very easily. Uh, I've done it several times. But one of the challenges, you know, you point out, like, the poisoning of the well, so to speak, right? It's the mis-, dis-, and malinformation. And so I've written about and talked about MDM for years.

[00:49:31] AJ Nash: I've talked about it, you know, I dunno, four or five years ago. I wanna say I coined the term, though I'm sure somebody else did, but "post-truth world" was something I started talking about years ago as something I thought was coming, which I think is here at this point. And it was because of this mass effort to poison social media, which is the largest global megaphone in the history of humanity.

[00:49:48] AJ Nash: So that now it's not just, hey, I'm gonna whisper to, you know, a few people. Now it's not just, you know, crazy aunt Martha in the attic; it's millions and millions of people. And whether they believe in conspiracy theories because they just believe in them, or because they've been told to believe in them, whatever it is.

[00:50:02] AJ Nash: And so,

[00:50:03] Galya Westler: they, they've been fed, they've been fed with this information. Right.

[00:50:06] AJ Nash: And so nefarious people have done

[00:50:07] Galya Westler: like, it's almost like it's not even their fault, right? They,

[00:50:09] AJ Nash: A lot of it isn't. Yeah. I mean, we don't teach critical thinking at young ages and we don't teach people about sources, you know, and as an Intel person, it's frustrating 'cause you know, we go through all this training, so you know, a lot of discussions about, you know, digging back on sources.

[00:50:21] AJ Nash: But that's been a huge problem. That's changed perceptions. We have a lot of people who believe a lot of things that just simply aren't true, and it's very hard to convince 'em otherwise. I'm really concerned about it because, as you said, AIs are trained on this stuff. Like, one of the things, you know, that people may not know about these LLMs: a lot of 'em are just built on hoovering in all this information from the internet, from social media.

[00:50:39] AJ Nash: And whether it's real or fake doesn't matter. You know, it hoovers it all in. It doesn't know the difference. Theoretically, the companies are supposed to help teach it the difference, but I don't know that there's any accountability on it. They just put a little disclaimer that says, hey, you know, the AI could be wrong.

[00:50:50] AJ Nash: You

[00:50:50] Galya Westler: They're, they're gonna have to certify. There's gonna have to be some certification on this

[00:50:54] AJ Nash: I mean, the flip side is they currently don't have to, 'cause just like social media platforms aren't responsible for the content on their platform, I suspect, you know, the LLMs, or the AI companies, are gonna try to make the same argument, maybe a little different.

[00:51:05] AJ Nash: 'cause it's not actually user injected data, it's, it's their

[00:51:07] Galya Westler: But you're gonna have another AI tool that's gonna be able to certify. So some kind of, like, a plugin that you're gonna be able to install. And then based on that, the users are gonna be able to, like, antivirus whatever content they're getting. So

[00:51:17] AJ Nash: Yeah, we're gonna have to have some arbiter of truth, which is gonna create a whole new set of political problems. Who creates that arbiter of truth and whichever, you know, politics that company has is gonna be the new thing. And so we're gonna have this, this endless, well, maybe not endless, but hopefully not, but this spiral and this cycle of, you know, what is truth, right?

[00:51:32] AJ Nash: Which is why I said we were gonna get to this post truth world, and then truth ceases to exist. And it's whatever people tell you it is and whatever you choose to believe. and we have a lot of significant problems that are gonna come along with that. I'm very concerned about, but we'll see, right?

[00:51:44] AJ Nash: Is, I hate to put it that way and, and kind of dumb it down to we'll see. But we will, you know, I think people, most people inherently wanna know the truth. I think we see when post-truth goes too far, we see people that tell you specifically not to believe what you just watched. And people have pushed back against that and said, no, that's real.

[00:51:59] AJ Nash: Now as AI video gets

[00:52:00] Galya Westler: and also the body of regulations, right? So the same thing that happened in the blockchain space, right? When they regulated all the different places where you can buy and exchange different tokens and different coins. They had to regulate this, right?

[00:52:14] Galya Westler: And they really, like, they went right into, uh, the different companies, and, you know, a lot of these companies didn't survive. Unfortunately, the scams were happening quite a lot and people got hurt. And I think you'll see the same thing happening with AI. So if OpenAI doesn't provide this service to the public, then the regulation parties are gonna make them, you know, uh, be regulated and give this to the public.

[00:52:37] AJ Nash: Yeah. I mean, I hope so. There's a lot of platforms out there right now. They won't all survive, you know, competition's

[00:52:42] Galya Westler: But I mean, it depends. It depends, right? I think, I think that the social networks are being more regulated in Europe, for example. More than they are, uh, in the States and other parts of the world, right? There's a whole discussion around TikTok and how TikTok is being used in the US and who's gonna buy it and this and that.

[00:52:55] Galya Westler: So, I mean, if it, unfortunately, it kind of weaves into the political

[00:53:01] AJ Nash: It does. Yeah.

[00:53:02] Galya Westler: which is by itself very corrupted. So, you know, expectations need to be, you know, maybe a little bit lower than that. And at the end of the day, people need to be educated. They need to get the education to know and understand that they have to be very careful with the information that they're feeding it and the information that it gives back.

[00:53:19] Galya Westler: And always kind of become the expert, right? Because when you're the expert, you know when it's not correct, and you're able to refine it and make it something that will be very, very useful for you and for your company.

[00:53:31] AJ Nash: And I think that's, I think that really is the key point, is become the expert. If you just think you're gonna become an expert on everything, because you have AI sitting next to you, you're not, uh, and, and frankly, when you run into real experts, you're gonna look really stupid and not know why, because you just trusted this box.

[00:53:42] AJ Nash: It gave you a whole bunch of, you know, garbage that looks and smells good. You know, I tell people all the time, well, it reads really well, it sounds great, but

[00:53:49] Galya Westler: then it breaks, it froze

[00:53:50] AJ Nash: Yeah, it's all fiction. It's nothing. You check the sources and they go no place. And if you choose to interrogate the AI, it will tell you, by the way. But down this path, if it gives you a bunch of fake things, it is not hallucinating, by the way.

[00:54:00] AJ Nash: People call it AI hallucination. It's AI fabrication. But if you talk to it, it will actually tell you why it did it. It's not trying to; it's just, uh, "I didn't have that, so I just filled in the blank." It's just not, you know, very transparent about it unless you talk to it for a while. Uh, and you'll find out it

[00:54:12] AJ Nash: doesn't know how to say no. So it will give you answers that are garbage. And if you don't know how to check those answers, and you don't actually have the knowledge to begin with, some working knowledge of the topic you're on, uh, you're gonna produce a lot of very bad things, and people are gonna figure that out, and you're gonna lose jobs, and bad things will happen.

[00:54:24] AJ Nash: So listen, we could talk about this forever. I, I also could do another hour on this, but I don't know if anybody will listen, for two hours straight. So I do find it fascinating. I think where we are in AI is fascinating where we're going. I think the tool and technology that you guys have at HumanBeam is remarkable.

[00:54:40] AJ Nash: I've seen it. I'll add some links to this so people can check it out. Um, you know, you can download the app. I've downloaded the app, uh, to play with it a little bit because I was curious; just put it on the stack with the 72 other AI tools I'm playing with right now. But it is different and it is interesting, and the interaction is there, and the concept is fascinating.

[00:54:57] AJ Nash: I think some of the applications you're talking about for this, could be very, very, I hate using the word unique. 'cause unique, it's unique to me. I haven't seen anything else like it. Let's go with that.

[00:55:05] Galya Westler: It is, it is handcrafted. We call it handcraft, which means that we do AI by hand.

[00:55:10] AJ Nash: Yeah. And it's, it's very good. Yeah. And as a result, it's very good.

[00:55:13] AJ Nash: And I think, I think the use cases are good. I'll be interested to see where, you know, where this goes going forward, where you guys go. I'm

[00:55:19] Galya Westler: We have, we have, we have a few surprises coming up with you soon, so.

[00:55:22] AJ Nash: Well, I'm very excited about it. But as we're getting near the end of the show, I have one question I ask every guest. You know this, 'cause I do give guests a heads up on this one, for anyone wondering, so they're not gotcha questions, at least.

[00:55:31] AJ Nash: But the question, you know, the name of the show is Unspoken Security. So with that in mind, I ask every guest to tell me something they've never told anybody before. Something that, until now, has been unspoken. So what do you got?

[00:55:43] Galya Westler: Yes, I knew you were gonna ask me this one. Um, it's, it's kind of, it's kinda like a job interview, even though like you have to tell, like, something that's not good about you. And I'll be like, I'm

[00:55:50] AJ Nash: Not necessarily not good,

[00:55:52] Galya Westler: I'm a,

[00:55:52] AJ Nash: be not good. Be a great thing that you've to

[00:55:54] Galya Westler: have to tell you something that I don't tell until somebody describes, I, I think I told you this, right?

[00:55:58] Galya Westler: I really, really am a pain-in-the-ass person. Like, I will not let go until I understand exactly, from beginning to end, why a person said a certain thing or did different things. And that's because I really try to be a perfectionist, which is kind of funny, because a lot of the time I don't go into the details too much. But to be a perfectionist, in my eyes, is to try and do 80% really, really good.

[00:56:24] Galya Westler: And not try to do it a hundred percent. So I'm gonna give you an example, and that's a little secret when I give my speeches. Even when I did my TED talk, which was really big, like at the Queen Elizabeth in Victoria, they trained us for, like, nine months and we had to stick to a script.

[00:56:39] Galya Westler: And I never stick to a script, because, as a perfectionist, I don't see perfection in the absolutely amazing, perfectly polished script or words, right? Because, as you see even today, the way I'm able to be perfect on stage is because I allow myself the room to do improvisation. And the only way to do improvisation is if you give yourself the breath and the ability to not hold on, to not be chained to a certain script.

[00:57:08] Galya Westler: I mean, this is part of my ethos, why I'm an entrepreneur, because I can't be chained to a specific company, to a specific, you know, somebody that's gonna tell me how to run and how to do, because I'm a very creative person. I'm a free spirit, blah, blah, blah, blah, blah. And everybody that is like that, they need to be free.

[00:57:23] Galya Westler: And how do you become free? You unchain yourself from the different boxes, right? And the boxes, they told me with my TED talk, they said: practice your script, never go beyond 12 minutes, this or that. They approved every single word. And then what I did as a trick was, I arrived at the general rehearsal, a lot of people don't know this, and I changed everything.

[00:57:44] Galya Westler: They didn't even pay attention because they were, you know, so busy. But I came to the general rehearsal with my hair and my makeup and my clothes, and I changed the script, because I realized it wasn't good enough. And to be honest, I didn't want them to, you know, keep on telling me every word I should use.

[00:58:01] Galya Westler: And in the end, I didn't even memorize the entire thing, even though I had to memorize it. And it wasn't really a teleprompter, because they didn't allow for that, but they allowed us to have, like, a screen with just a few points. So I had, like, nine points. And I have to tell you, if I had to do the speech all over again, it would not be the same, because I improvised in between to make it look so natural the way it was on stage.

[00:58:24] Galya Westler: So this is my secret.

[00:58:26] AJ Nash: I love that, and I can relate to it. When I started doing the podcast, I worked for a different company at the time, originally, and, you know, the original thought was they wanted it to be scripted. And, you know, 'cause you've been on the show now, for anybody who has been, the prep isn't in the show; it's pretty minor.

[00:58:39] AJ Nash: There's a little bit, you don't come in with nothing, but it's very, very minor prep. I don't like scripting. I don't believe in it. I think it just comes across as stilted. Uh, and originally that was the original, you know, request, mandate, however you wanna look at it, from the company. I was like, no, I'm not gonna do it.

[00:58:50] AJ Nash: We just won't do the podcast and go, you know, find somebody else because I don't wanna do that. And, and, you know, we worked through a good conversation and solved it, but I said it just doesn't work for me. And, and part of the reason scripting doesn't work for me is, uh, is the same thing you're

[00:59:01] Galya Westler: thing you're talking about,

[00:59:03] AJ Nash: It's stiff.

[00:59:04] AJ Nash: you're forced to stick to it. And if, if you misalign, then it's not perfect anymore. Right. And so when I work and I work almost exclusively without scripting of any kind, it's what I would call imperfectly.

[00:59:13] AJ Nash: Perfect. You're natural and you're authentic, which people really appreciate. You'll get to all the points, or most of 'em; at least the important ones show up. And I don't walk away feeling like, oh, I wish I'd said this differently, or I wish I, because there was nothing to begin with. I didn't have expectations coming in.

[00:59:25] AJ Nash: So I, I totally agree with what you're talking about. I can relate to it. I think it would help a lot of people, you know, public speaking people talk about, oh, be very prepared and have your script and all that. And I feel like you get people very stressed sometimes. You know, I always tell people when they've asked me, here's what you need to do, know your stuff.

[00:59:41] AJ Nash: It's as simple as that. If you're gonna get on stage and speak, know your stuff, that's, then it doesn't matter. 'cause you know, things will happen, slides won't work that day. or they'll get out of order or, you know, there's always something. Right. As long as you know your stuff, that's all that matters.

[00:59:55] AJ Nash: And you'll get it out there, like you talked about being that expert, right? So it's great to find out from somebody who's spoken at TED, 'cause I certainly have not, that this actually works. That I'm not the only one that thinks this

[01:00:04] Galya Westler: Another trick that people can take and exercise to allow themselves this breath of ability to really, like, get your free spirit to shine: don't let people tell you what to wear and how to act and how to be. But you still need to be professional. You need to respect yourself, right?

[01:00:22] Galya Westler: Whether it's showing cleavage or not, whether it's wearing your hair up or down, or stripes or whatever, and being, like, a she or he or they, it doesn't matter, right? Because if you respect yourself, people will respect you. And if you want people to respect you on stage, then you can have a certain framework.

[01:00:38] Galya Westler: And the framework is something that I do because I used to do more mentorship. Now my AI does the mentorship for me. But when I did mentorship, I used to tell people: you pick your topic, right? Let's imagine it's the topic of an icebreaker that you need to, you know, give at a wedding or at a corporate event, or whichever speech you want to do, at a TED speech as well.

[01:00:57] Galya Westler: You have the topic, right, and you always will have a very good beginning. And at the beginning, as you know, one of our mentors from Toastmasters International, Roger, would say you need to have the "what's in it for me" at the beginning, which means that you don't talk about yourself at the beginning.

[01:01:14] Galya Westler: You talk about the audience. Why should the audience have the hook? Explain what you can do for them right at the beginning. So that is the beginning. Then you have the body of the speech. It usually needs to have three points. Usually it's three points, and around the three points,

[01:01:30] Galya Westler: you should have some sort of a weaved-in story, a personal story. So you start with facts, you add in the personal stories to get the people hooked in, right, and to listen to you. And then at the end, you're going to have a call to action and a reason why people have to sit and listen to you.

[01:01:52] Galya Westler: Because you gave them something that they can take further, right? Which is actually, you know, on one leg, this is kind of the training of TED. This is how you do this for TED, right?

[01:02:01] AJ Nash: Yeah. And,

[01:02:02] Galya Westler: that's easy to remember, right?

[01:02:04] AJ Nash: It is. And, and you just gave people a lot of great information, frankly, here at the end. I hope they stuck around for, because

[01:02:08] Galya Westler: Hire, hire my AI.

[01:02:09] AJ Nash: Yeah. For those who've been wondering how to do fantastic, uh, you know, public speaking. That's a great way to look at it.

[01:02:13] AJ Nash: And as you said, it's about being comfortable with who you are, you know. And, uh, like you, I don't wanna be told, you know, what to wear or how to act or what to say. I've been told, uh, you should, you know? No, no. If you go to any speech class or anything, they'll say you don't swear. I curse, I curse a fair amount.

[01:02:27] AJ Nash: And it depends on the environment. You know, for years I had suits and ties and I could wear them; I don't anymore. But at some point I got to a point where I was like, you know what? I know my stuff. I'm good at what I do. I'm here for that, and I'm going to be me. And I remember having conversations with people who were panicked about it, and we show up for an event and they're like, everybody's in a suit.

[01:02:44] AJ Nash: I'm like, well, I'm not. And then it turned out it was fine.

[01:02:47] Galya Westler: From a female perspective, right? The grilling that people, especially, I would say, especially women, get, right? Like when you look at the different comments on my TED talk because I was wearing a certain dress: it is too tight, it is too sexy, it is too... I'm like, really? I should have made it even more sexy.

[01:03:03] Galya Westler: Right? I wish I wore, like, even higher stilettos, you know. I wish it would've shown the curves of my body, because I'm so proud of my body, right. And I think that, you know, again, going back to the respect thing, I don't think they would talk about men like this, right? I don't think they would criticize as much if there was a guy wearing very tight pants versus a woman wearing, like, a tight dress or whatever it is, right?

[01:03:28] Galya Westler: I mean, you think about Sheryl Sandberg and her book Lean In, how is she dressed like this, right? Can you breathe, woman? Like, maybe you wanna breathe as you lean in.

[01:03:43] AJ Nash: Right

[01:03:44] Galya Westler: This is totally crazy, right? You need to stand behind your own values. And if we talk about women, I do support women, I do support women to celebrate their sexuality, and also for men as well.

[01:03:55] Galya Westler: Right? Of course. You don't have to totally, go beyond that because if you are in the professional world, you can still be professional and still be sexy. Right.

[01:04:05] AJ Nash: Well, and, and it all comes back to where we started. It's about the future as human. It's about being human, and that's what it comes down to is,

[01:04:11] Galya Westler: we act like robots actually, right? In the

[01:04:13] AJ Nash: A lot of people do, yeah. The corporate world, you know, trains some people to do it. It's like, being human, you know what, wear what you wanna wear, be who you wanna be, be authentic.

[01:04:20] AJ Nash: People care about that. And know your stuff, right. Again, if you know what you're

[01:04:23] Galya Westler: Absolutely. Know yourself.

[01:04:24] AJ Nash: and you dunno anything, then, then you better conform.

[01:04:41] Galya Westler: Get your education, you know, work for companies maybe before you become an entrepreneur. Experiment, try stuff, like, be bold, become an entrepreneur. That's another really, really strong message, right? You don't need that boss.

[01:04:41] Galya Westler: Right. You are very, very creative. Some people, they really come from a different species. They cannot be a follower. They have to be the leader. And if you have this inside you, then there are a lot of AI tools out there now that can, can make this a reality for you.

[01:04:58] AJ Nash: Yeah, a hundred percent. Alright, well, listen, this has been fantastic. I really appreciate it. Uh, Galya Westler from HumanBeam, uh, co-founder and CEO. For anybody who hasn't seen HumanBeam, I do recommend checking it out. If you don't know Galya, you should definitely look her up on LinkedIn. I'll have links and everything in all the notes and all that good stuff.

[01:05:14] AJ Nash: As far as the show, thanks everybody for taking the time again to listen, to watch, wherever you're catching us. You know, please subscribe, tell other people if you like the show. If you don't like the show, don't tell anybody. Just shut up and go away. But if you like the show, please tell other people. You know, I could use some more subscribers.

[01:05:27] AJ Nash: We all could. We wanna be able to continue to bring great content and amazing people like Galya to tell their story and to talk about, you know, whatever's going on in technology or life in general, right? So, again, thank you to everybody. Please, uh, you know, share and like and all those good things, and, uh, gimme feedback, right?

[01:05:43] AJ Nash: If you like the show, if you don't, I actually do wanna know that, if there's ways to improve it, if you have ideas on guests. You can reach out to Galya and say, hey, was it fun? You know, feel free to do that, if it's a show you wanna be on. And, and so,

[01:05:54] Galya Westler: Pleasure. Thank you so much, AJ. Thank you so much to everybody that's listening and watching.

[01:05:58] AJ Nash: Well, thank you. I'm honored you're able to be here today, and so with that, I will close it out.

[01:06:02] AJ Nash: This has been another episode of Unspoken Security.
