The Catalyst by Softchoice
A podcast about unleashing the full potential in people and technology.
When people and technology come together, the potential is limitless. But while everyone is used to hearing about the revolutionary impact of tech, it can be easy to forget about the people behind it all. This podcast shines a light on the human side of innovation, as co-hosts Aaron Brooks and Heather Haskin explore and reframe our relationship to technology.
Google’s human-centric AI
What if your company could leverage AI to not only increase productivity but also improve your mental well-being?
In this episode of The Catalyst, Heather Haskin speaks with Anna Baird, Chief Customer Officer for Generative AI Go-to-Market at Google, about the crucial human element in AI implementation.
They discuss how to leverage AI to enhance, not undermine, the employee experience, fostering psychological safety and well-being. Learn how to navigate the challenges of AI integration while prioritizing employee empowerment and creating a more supportive work environment.
This episode is brought to you by Google Cloud. Simplify document processing with Google Cloud's Document AI, capturing 99 percent of data from any format. Visit softchoice.com to get started.
Featuring: Anna Baird, Chief Customer Officer for Generative AI Go-to-Market at Google
The Catalyst by Softchoice is the podcast dedicated to exploring the intersection of humans and technology.
This episode is brought to you by Google Cloud. Simplify document processing with Google Cloud's Document AI, capturing 99 percent of data from any format. Visit softchoice.com to get started.

You're listening to The Catalyst by Softchoice, a podcast about unleashing the full potential in people and technology. I'm your host, Heather Haskin. By now, many organizations are relying on generative AI to help streamline tedious tasks, freeing you up to focus on what really matters: your creativity, problem solving, and your customers. But what does all this mean for employees? In an economic climate where we're asking people to do more with less, AI offers tremendous opportunities. It also presents significant challenges. As AI-powered tools take on more tasks, employees may fear being replaced or losing the human connection in their work. This can lead to feelings of insecurity or even burnout if not addressed properly. Organizations need to think strategically about how to implement AI in ways that enhance, not undermine, the employee experience. Today, I'm joined by Anna Baird, Chief Customer Officer for Generative AI Go-to-Market at Google. We'll explore how AI can not only improve productivity, but also foster psychological safety and well-being. Anna, thank you so much for being here with us today on The Catalyst.

Thank you for having me, Heather.

Absolutely. I'd love to start off by getting some of your background, and I'd love to just learn, briefly, how you got where you're at today.

Absolutely, absolutely. So I actually did a talk recently at one of the universities here, and they asked me to timeline some of my career, in what I'll call a squiggly-line career, from where I thought I'd end up, which was journalism, spoiler alert, to where I've ended up, which is in technology. It's always been an interest to be part of an education process. How do I intake a bunch of information that people are trying to work through and digest and understand?
And how do I make it maybe simpler for people to understand? Or how do I attribute storytelling to it so that it's stickier in their minds? The capabilities of public speaking, conveying, facilitating, and helping people interpret information have always been at the core. And this is sort of a design-thinking mentality for me now, but empathy, understanding the actual challenges and sort of rooting myself in the problems that exist, empathizing with the person or the people or the group or the company or whoever it is you're trying to solve for, that's always been a part of, I think, my world.

I love that you mentioned a squiggly line, because it's not a ladder getting where you got to, and people don't have things predefined and determined on how to get to being the Chief Customer Officer of Generative AI Go-to-Market at Google. I appreciate that you brought that up, because it shows that diverse background that you have that brought you to where you are today, to really see and have that empathy. With that information, I would really love to learn a little bit more about maybe what your purpose statement is as a leader.

I've been thinking a lot about this, and I think for me it's an evolving statement. My purpose, like the ice sculpture that a lot of my work happens around at the moment, is really understanding all of the various use cases and opportunity for the technology, so AI, more broadly speaking. And then, for how I've come up, which is a lot of data-based decision making, it's understanding what data can actually do now with where we are with generative AI. So what data sets matter, how data gets combined to make insights. What's really interesting, and what I find both challenging and also very empowering about being at a place like Google, is the culture of innovation that happens around employees. Google uses our own technology to serve our employees best, or better. So that could be anything from bringing a potential new employee in.
So having that initial conversation with talent outside, nurturing them through the funnel, having them participate in an interview process, getting data back about how those interviews are going and what that means for when we actually offer people roles, and then they onboard, and how do they do when they onboard. So you have that side of people operations, and then you also have the side, or other sides, that look like how employees give feedback, and it's always a question of what's working and what's not. And I think a data-driven decision culture comes back to how employees are feeling about the current state of their environment, how they could be creating more impact for the world, because we build technology for the world. If they could do something differently, if they could have access to different data, if they could collaborate differently with their colleagues. So I think the employee focus for me has become a lot more relevant and interesting, and it's really exciting.

You're bringing innovation to that employee experience. And what really speaks amazingly to me is the employee experience is so important, and it can often be overlooked. And when you spend time and energy bringing innovation into that, it's going to shine through in the products and services that come out of it, because those people are happy and they're well cared for. So it sounds like empathy and putting yourself in the shoes of the end user is what drives you.

Definitely. And I think back to when I made a leap from customer experience consulting, so working for a consulting house and serving lots of different clients, particularly around CRM systems. I feel like I learned a lot in that space. But when I made the leap from consulting into working for a technology company, I joined LinkedIn at quite an interesting time, where a lot of the belief, especially as an employee, was you walk the talk.
So if you're going to go and talk about the power of the platform and what it can mean for your network, you better have stories about what you're doing to use the springboard of a platform to really leverage its insights, so that you can do a better job, or you can be more insightful when you're speaking to your customers or consumers or clients. And I remember LinkedIn largely saying our employees are also the best example of how we innovate. So if employees are trying new things, adding new things to their profiles, how do we take that to scale, and is that something that affects a larger group of people? Do we have education sessions for ourselves, and do we bring in outside speakers, so that we can continue to push the technology and understand the value of the technology for different industries? So I feel like LinkedIn was the first place that I cut my teeth on this mindset around innovation and, as a technology company, what you can do not only with the tech that you have and how you take it to market, but also the employees that want to join that type of a company. How do you leverage them to be people that test internally? What does that mean to nurture talent, when you allow people to test something that hasn't hit the market and give feedback and be really a part of that mechanism? So all of this, I think, comes back to: is communication there, do employees feel listened to, do they feel like they can put their hand up and join the movement? Those are all of those kind of culture-of-innovation pieces that we tend to come back to.

AI really does change a lot. So the question is then, how are you getting your teams to engage in it? Are you setting aside time and learning sessions? And what is the current landscape right now with that engagement?

I think, my humble opinion, living in the Google world right now and then living in the broader technology world, is, I think we're showing people different ways that they can leverage the technology.
And then everyone's at a different part of the continuum on how comfortable they are to trial it and to put it to practice. Google had our earnings call yesterday for Q3 of 2024. And what is so fascinating about Gemini's capabilities for me is the ability to take a YouTube link, so something like our earnings call replay, and put it in and say, tell me what I need to know about this video, or give me five bullets about this video, and Gemini will produce that for you, and it'll be key highlights from the entire hour-long conversation around our earnings call. So that's fascinating on its own. And then more recently, talking about Google-on-Google examples, a couple of our finance teams shared with me that they've been using NotebookLM, which is another product offering that we have. We've been using it a lot internally to digest 10-Ks and 10-Qs. And what it actually does, which is amazing, is it creates a podcast, kind of like what we're doing right now, out of the technology. So it will read all of those earnings reports, and it'll have, you know, two hosts that talk just like we are right now about the earnings in those documents. And it's all generated through NotebookLM. Artificial intelligence coming to life in that way is so fascinating to me, and the voices sound incredibly real. We ran an example with a particular company's financial data that was all public, and we just ran the 10-Ks and 10-Qs through it. And then we actually shared with that company: hey, we used this tool, this is what we built, how accurate is the readout that we received? And we got some really interesting feedback from that company saying, actually, this is quite accurate. There was one place here where there was some nuance that it didn't pick up. And so we thought, well, that's really interesting. So if you had actually input additional data, because you obviously know the nuances we don't, we just took kind of the street documents.
It would be even more intelligent as a readout, because it would have that extra data to bring that nuance forward. So those are two low-hanging-fruit examples I would give, but I find fascinating, because they make my work a lot easier. When I'm scanning South by Southwest speakers and I'm listening to a speaker's hour-long session, you can now leverage the technology to give you those bullets and that advice. It's having an advisor in my pocket where I can have these conversations, and then you can really personalize it. And I think one of the examples that sticks out in my head over the last couple of days is, I actually had a conversation with a colleague who I deeply respect, and she and I were having sort of a frictionful conversation around some things where we just didn't see eye to eye. And when we got off the call, I asked Gemini how I might re-approach that individual to have a further conversation. That was all around checking my own communication style and the verbiage I was using and how that might land with someone. And this is the kind of advice, or advisory, that I think we jokingly cast aside when it comes from a manager or a parent or a teacher. Having the technology tell you the same information somehow made it easier, simpler, maybe less personal. There are some simple things that we can do as individuals to start advancing our knowledge of this technology and getting more comfortable with it, because it's safe to say it's here to stay. And so I think the more we think about the technology as a springboard into higher-impact work, the easier it becomes for us to play with it and not fear it.

Is your organization dealing with scattered, unstructured data across invoices, contracts, and records in multiple formats? For IT leaders, manually processing these documents is a time sink and prone to errors. Enter Google Cloud's Document AI, powered by Softchoice.
DocAI captures up to 99 percent of data from any document, no matter the format or location. It automates data extraction, classification, and validation, reducing operational costs and freeing up your team for higher-value tasks. With Softchoice and Google Cloud, you can streamline document workflows, improve accuracy, and unlock new insights from your data. Ready to get started? Visit softchoice.com or call 1-800-268-7638 to connect with an expert today. Softchoice and Google Cloud, helping IT leaders drive smarter, faster document processing.

With the rise of AI and other emerging technologies, thinking about the way that employee engagement and well-being is transformed, and that engagement and well-being for customers and for companies and organizations, how do you see that transforming beyond the small use cases that you mentioned for yourself?

There are a lot of pieces that we take for granted that happen with teams or colleagues around us that the technology can now assist. So, human in the loop: the value of artificial intelligence, and the fact that people will be able to do more high-impact work as a result of using it. So it's not about removing people altogether. It's about how it becomes a great baseline for a human to review or to leverage, and then to call out where it's accurate and maybe where it's not so accurate, because, like people, the systems are learning. And the more data we share with the systems, and the right data that we share with the systems, and who's determining what the right data is: those are all of those conversations, especially around ethics, that are happening. When you start to realize that writing a job description or a role description often has bias in it, it's no one's fault, we're human. We all have, whether it's proximity bias or whether it's bias from the community we're in or the work that we do, whatever it is, we all have that opinion-based decision making. And what's really interesting about technology.
And what we're seeing in Google is we have examples where we're now sort of looking at role descriptions and how we can use what we call Help me write. So how we can use the technology to write a role description. And the fact that, in using technology to write that baseline, maybe we're sanitizing the language a little bit more. Maybe we're not using genderized language that could put an individual off applying for a role because of the description that is in front of them. I think it's also really interesting to be able to look at something like replies. So a conversation I was having with our people operations team was, when our people operations humans are replying to specific challenges and outreach from employees, being able to look across those responses and figure out, is there a way we want to maybe streamline a response? It's important to have that personal touch, of course; no one wants kind of the same five sentences coming back no matter what the question is. But maybe there are key themed questions, like maybe there are questions around something like mental health and wellness. And so using the technology and being able to create baseline responses, and then see the data that's coming back. The data that's coming back is teaching those baseline responses how to be more empathetic, how to be more thoughtful, how to question the information it's receiving. People operations individuals can then do higher-impact work, because they're able to use a baseline or check their responses against a baseline. I think a lot of that kind of research and knowledge, that's what these technology capabilities are bringing to us, especially when it comes to the employee landscape. And those are some of the things that we're not only trialing, but really trying to make best practice in Google and with customers we work with through Google.
That's often why we open-source a lot of our research and findings, as we're trialing and erroring in some places.

It's incredible to think about AI making things more human-centered, because it can learn from us and then provide, like you said, a more sanitized view on that environment to help us create welcoming spaces for our peers. It's a really interesting way to think about it. I know that there have been those that are concerned that AI will take our jobs or reduce what we need to do. So how can organizations strike a balance between automation and maintaining that human-centered work environment?

Yeah, isn't that just the million-dollar question. I think we look at it a few ways. We've been through waves of technology change throughout our lives. I was sharing with a group the other day, do you remember when bring your own device to work was this big moment and challenge? Like, what do you mean people would get to choose their own phone or mobile? What do you mean they could choose what network it was on? How would you secure that information? What happens if that employee leaves your organization? There were all of these questions around that. And now bring your own device, for a lot of companies, is just mainstream. They've created applications and capabilities to securely monitor and give access to employees. Similarly, obviously, when the internet, you know, happened, and everyone was like, what do you mean information's accessible to everyone? That's nuts. Who's regulating that? I think the difference, maybe, with generative AI being such a huge space that everyone's trying to grapple with, is that where data exists and how people have shown up to share their data has really evolved quite quickly. So if I think about my own personal data journey, I'm not a huge social media person, maybe because I've worked in technology and I know that everything, you know, that is put out there has a footprint and really can never be deleted.
And so I'm maybe a little bit more cautious as a result. Whereas, talking to a student body the other day, they would love to share more data and information with companies that they purchase from and have a relationship with from a buyer perspective. They would love to give more information so they can get better deals, more accurate responses. And so maybe my question back to us all is, are there groups of us that are more comfortable with data sharing than others? Yes. And is it an education thing? Is it just a rooted belief? Is it a bad experience for those of us that have felt phished, or, you know, like we're getting marketing all of a sudden that we didn't sign up for, and we're curious as to how they got our profile or footprint? So I think there are more questions around comfort with data and comfort with sharing of data, more so than questions around, well, is generative AI going to be a problem or an issue? What will it do that will benefit us, versus maybe some of the knock-on effects of bad actors using it, and what would that look like? I think it's more baseline getting a comfort with data sharing and how data has been shared and where data lives, before we get into maybe some of the deeper questions around solving the world's problems.

I'd like to shift a little bit and ask you about something I've heard you speak about before: psychological safety. How do you define that, and why is this such a critical factor in today's workplace?

So I think psychological safety has changed a lot, and what it means for employees and how you can, what we say at Google, bring your best self to work and bring your whole self to work. I think for me, that has manifested in a few ways in my experience, particularly in tech. I think, number one, because the technology space moves so quickly, there's always a question around, in an innovation cycle or when you have an innovative community, there are always going to be ideas that fail.
There will always be frictionful conversations. And I think for employees, and especially with the customers that I tend to work with, the question comes back to: have you created a physical and sort of emotional space for people to bring great ideas forward, to celebrate failure, which I think is a foreign concept still for a lot of people? There are celebrations of successes, but then celebrating failures actually shows people that there's a way to fail, and when you do that, you can actually bring those learnings, and those become kind of the golden nuggets that everyone else can leverage, so that you're not reinventing the wheel every time. And so celebrating failures is really about showing your math. Like, maybe your end result, the answer, was wrong. But somewhere along the way, you probably had some right steps, and others can leverage those right steps. Because of my proximity to Google and also the way in which we develop our culture of innovation, I feel like there are a lot of acknowledgements around language that's used. And simple mission statements or strategies, and the kind of verbiage that's used, can perhaps determine whether an employee feels connected to it. I think something like performance reviews, and how messages are delivered and how suggestions are offered and how people are grown, and what skills are we honing and what skills do we already have. I think all of that positioning and psychological safety comes back to: is my voice heard? When it's heard, is it acted on? Do I feel like my peers are listening to me, support me, or at least, as Kim Scott would say, care deeply and challenge me directly? So these are some of the things that manifest at Google and places like LinkedIn, where I've worked. Startups are trying to get that culture piece right from the start, so they know that they'll get more out of their team and employees if psychological safety is built into the structure and the strategy and the empowerment and the environment.
And then I think, you know, we're not all created equal, and there are inequities in the world. And so looking at underrepresented groups and trying to understand how those groups may not feel as included or respected or even championed in the workplace. That's been another place where our employee resource groups have really tried to focus on underrepresented groups, to say, if we're going to build for the world as a company, which is how Google thinks about our mission at large, then we have to represent the world to our best ability in 180,000 people. And the best way to do that is to invite as many perspectives and experiences and backgrounds as possible to speak up and bring conversation and data to the table.

That's huge. Thinking of themselves and the organization as impacting the world is true; Google does impact the world. So hearing that they're utilizing that thought pattern from the inside out, I'd love to hear some more challenges that maybe not just Google, but organizations, face in maintaining a sense of psychological safety for their employees.

One thing that I find gets an interesting look when I bring it up is, we have this sort of mentality or approach called head, heart, and feet that we have put different leaders through at Google. And the head, heart, and feet discussion really comes down to: we all have a certain communication style, and we all have a certain learning style that we lead with. And there are obviously components of other styles that come in there, depending on circumstance, for some people. But you tend to defer to a certain communication style, as an example. And so what we find is, in, say, a C-suite or a board-level conversation, if you ask people to think about the way in which they communicate: how do they lead? Do they lead with their head, which is that strategy mentality? Do they lead with their heart, which is that kind of emotional connection to what you're trying to communicate?
Do they lead with their feet, which is that action-oriented piece? And when you ask leaders to bucketize themselves, they tend to lead with one of those three. And then, when you think about a community of employees, you're probably excluding two thirds of the organization by only considering leading with that one type of voice, or that one approach to communication. And so the head, heart, and feet challenge is really about looking at the team around you and figuring out, if you lead with your head, do you have heart and feet support around you that is going to help you poke holes in the type of communication you send, the way in which you set strategy, or build strategy and goals? And then I think, just at a human level, we all do what works for us. When we feel challenged, we defer to something that is comfortable. And what I think is interesting about psychological safety and the advancements in technology is, you can defer to that comfort level, absolutely, but then, like I was mentioning, you can use the technology to challenge how you would typically address a situation. And I feel like the technology, because it doesn't know you necessarily, but it knows of other people, other ways that other people are addressing a similar situation, it will give you tips and tricks that you haven't thought about. So it's almost like seeing around the corner for you, or seeing the blind spots for you. And Heather, I think you said it's often easier to hear it from, say, Gemini, than it is from maybe my parents or my manager or my direct leadership. I think sometimes, because a human is conveying it, there's an emotional friction in someone giving you maybe hard feedback. Whereas getting it from a technology, this is drawing on other people's experiences without having to deal with a person delivering that message.

So as we think about AI and leadership:
With AI playing a larger role in decision making and daily operations, how can companies ensure that they are still focusing on the human element of work and support employee engagement?

I think you made a good couple of comments earlier, which is some of the low-hanging opportunities or use cases for AI, and a lot of the generative AI use cases that are now being shared. I know, as Google, we just shared 185 generative AI use cases that different companies are leveraging, whether it's kind of customer-focused or employee-focused or more of a data focus, which are some of the agent capabilities. I think it comes back to whether it's repetitive tasks that can be made maybe more efficient, more streamlined, more effective, because you have a technology that can digest and intake so much more information than humans can. The technology doesn't really have a tired moment, but as people we have up and down days, we have up and down moments in our day, we have moments of a lot of energy and moments of tiredness. And so I think that's where the technology wins. There's more consistency, and there really aren't the down times, in quotations, that humans have. I think the other thing, when you think about a team: you might have one individual who's great at testing, or at building a report that's going to be used over and over again. But if that employee, or that intelligence, somehow leaves or changes, you're dependent on that individual. Whereas if you're sharing that capability through the technology with other team members, they become smarter for having access to that report, that data, and that information. But you're also essentially reassuring the business that someone won't walk away and something won't break, because you're sharing the intelligence across. I also think it breaks down silos. Employees often feed back that information is not being shared across, which inhibits them from doing their best job.
The more democratized, if I use that word, that it is for employees, the more they feel a part of the solution, a part of the movement. I'm not saying that artificial intelligence is going to change everyone's job to be the same job. I think we're all still going to show up with our skills and expertise, but I think that's where the human in the loop helps the technology to be better. I think those are some of the ways I think about it.

I really appreciate being able to speak with you today and really dive into that more human side of AI, and how AI can actually help us be more human-centric. So, Anna, I'd love to hear from you where listeners can find you to learn more about you and your work.

Well, as I mentioned earlier, I'm not the best on social media, but I do have a LinkedIn profile. So I would say that's one place, for sure. A lot of that is searchable. And then, as I was sharing, with the 185 generative AI use cases that Google Cloud shared last month, I would encourage everyone to definitely follow our cloud blog for Google Cloud. And you'll see some really, in my view, exceptional details around how some of these companies are starting to leverage this technology and see impact and results. So a lot of that is in that blog.

Wonderful. Thank you so much again, Anna. Today's conversation made clear that integrating AI should never be solely about optimizing workflows or making your workforce more efficient. It's also about empowering your people and supporting their well-being. And the key to ensuring a healthier, more engaged employee experience is through human-centered leadership that leverages AI to foster a supportive and productive work culture, especially in the evolving hybrid work landscape. I want to thank Anna so much for joining me, and thank you for listening. I'm Heather Haskin, and this is The Catalyst. See you again in two weeks. The Catalyst is brought to you by Softchoice, a leading North American technology solutions provider.
It is written and produced by Angela Cope, Philippe Dimas, and Brayden Banks, in partnership with Pilgrim Content Marketing. This episode is brought to you by Google Cloud. Simplify document processing with Google Cloud's Document AI, capturing 99 percent of data from any format. Visit softchoice.com to get started.