The Optimist Circuit
Building the Circuit Connecting AI, Tech, Nature, and People to Spark Optimism and Power Solutions for Society.
The Optimist Circuit Publication is your gateway to exploring how human ingenuity, combined with AI, technology, and nature, is solving society's most pressing challenges.
Through compelling interviews with AI, tech, and business leaders, real-world case studies, and stories of groundbreaking innovation, The Optimist Circuit delivers insights and inspiring narratives that highlight how human ingenuity, technology, and nature can work together to create a better future.
Join us as we spotlight people who are pioneering businesses, startups, and research, revealing the human ingenuity behind transformative ideas that connect communities and amplify human potential.
Our mission is to empower changemakers, innovators, and thought leaders with stories and strategies that prove optimism, collaboration, and innovation are the keys to solving global challenges.
Ellen Spooner, founder and host of The Optimist Circuit, brings over eight years of strategic communication experience with organizations like NOAA, the Smithsonian, and the Waitt Institute. Her expertise in making complex science accessible to millions and her passion for AI and tech are the foundation of this publication's commitment to impactful storytelling.
Subscribe now and be part of the conversation shaping the future of tech, AI, and sustainability.
Money Left on the Table: The ROI of Eliminating Bias
Ever wondered why some brilliant startups never get funded?
The answer might lie in unconscious biases that plague investment decisions. In this eye-opening conversation, we explore how Illumen Capital is revolutionizing access to capital through their innovative bias reduction program.
What you'll learn in this episode:
- The research behind bias in investing: How identical founder profiles, differing only in the race of the person pictured, changed investor decisions.
- Practical tools for reducing bias: From standardized first interview questions to multiple reviewers in hiring and investment processes.
- Real-world impact: Illumen's portfolio has achieved gender parity across teams, proving that reducing bias isn't just ethical, it's profitable.
- A success story: A rural Texas founder nearly overlooked, who secured funding thanks to a fresh perspective that recognized opportunity others had missed.
- The human + tech balance: Why the most promising companies combine technological innovation with deep human understanding.
- Life & investing advice: How slowing down decision-making processes can actually accelerate long-term success.
About our guest
Joanna Kuang is the Senior Vice President of Product and Impact at Illumen Capital, where she leads efforts to dismantle systemic bias in investing and expand access to capital for underrepresented founders.
Connect with Joanna on LinkedIn: Joanna Kuang
Learn more about Illumen Capital: Illumen Capital Website
This podcast was sponsored by EVCHARGE4U, your partner for fast, reliable EV charging solutions for homes, businesses, and fleets. Powering a cleaner tomorrow, today.
Speaker 1:Welcome back to the Optimist Circuit podcast, where we are connecting AI, tech, nature and people to spark optimism and power real solutions for society. Today I'm joined by Joanna Kuang, Senior Vice President of Product and Impact at Illumen Capital, a trailblazing firm that's using impact investing, data and innovation to dismantle systemic bias and reimagine who gets access to capital. Thank you so much for coming on.
Speaker 2:Thank you so much for having me. I'm so excited to be here.
Speaker 1:For our first question, could you tell our listeners a little bit more about what Illumen Capital is and what you do?
Speaker 2:Sure. So Illumen Capital, we're a fund of funds, and our work is grounded in research showing that when you keep everything the same and just change the race of a photo on a page, investors evaluate the same opportunity differently. That's racial bias at work. So we've created this proprietary bias reduction program for all of the fund managers we've invested in, to support them, as a key part of stewardship once we've made the investment, in reducing their own biases in different parts of their processes. And then we use rigorous impact measurement and management to understand whether or not the bias reduction efforts in our program are working. I lead our platform program and portfolio support, basically everything that touches on bias reduction or impact. My background is in impact investing; I've been working in impact investing in some capacity ever since I graduated from undergrad. I went to undergrad in the Deep South and experienced living and teaching in the Mississippi Delta, which is where I saw racial bias affect student outcomes and why I wanted to commit my career to working in impact investing.
Speaker 1:Yeah, and I feel like this is such a cool focus and something that, at least myself, I have not heard of very often. I wanted to understand a little bit more about your journey, how you got into impact investing and then specifically Illumen Capital, and how that's been for you so far.
Speaker 2:Definitely. Like I mentioned, I really saw the disparities in outcomes in a town in Mississippi that was basically still segregated, and I saw the power of financial but also social capital, that actual emotional support and those actual social tools, connections and relationships, in affecting student outcomes there. So I thought, okay, there's a lot of money out there. I grew up in New Jersey, near Wall Street and adjacent to New York, so I knew there's a lot of money out there. How can we get these dollars to be more impactful and more impact-oriented, specifically in a way that works for communities? So I spent my career looking for different forms of capital that could be effective. I started in public markets impact investing, so working with the stock market, and then spent a few years working on outcomes-based financing with the government, similar to value-based payments with Medicaid dollars. This was basically helping government pay for performance for social services beyond healthcare, and obviously there it's super important to be able to measure what matters and measure whether or not you're actually improving people's lives and working with the most systemically marginalized.

Speaker 2:I bring a lot of those tools to how we think about this work. We're not just making investments, right, investing in these funds. Once we've invested, we're not saying, okay, you've got our money, good luck. We're actually walking hand in hand with them over the entire investment period to support them in the bias reduction program. That's structured in some ways, but also ad hoc in other ways, where we're almost an in-house consultant, someone they can come to with any questions or needs they may have when they're facing bias or struggling with different parts of their investment process, their hiring process, or how they're supporting portfolio companies. And so I really like that.

Speaker 2:It's sort of the through line of my career, and what I get to do at Illumen is combine financial capital with social capital, and do so at scale, right. We're not just working with a few companies. Right now we have over 25 fund managers, and they have in total over 700 portfolio companies. So we're getting to influence all of those portfolio companies indirectly as well, and hopefully build this muscle where people are reducing bias for their whole careers, not just where they are right now, but carrying that muscle and that skill set forward into their futures.
Speaker 1:I think you hear a lot about that at a company level or at an employee level, but at least I haven't often heard of it at an investment level. So I'm curious if you could explain a little bit about what your process is like, how you conduct that training, how you even choose who you invest in, and all of that from the beginning.
Speaker 2:Sure, yeah. Like you mentioned, it's the full life cycle, right? So at the beginning, it's really: are people in a place where they want to engage in bias reduction efforts? Is their core thesis, the thesis of the funds we're investing in, focused on something impact-oriented? Are they able to articulate why biases may be relevant to seeing different outcomes at the customer level for their portfolio companies? So we're almost looking for that foundation of buy-in. We're not necessarily looking for funds to look a certain way or anything like that. It's more about their mindsets, and we'll talk more about this, but it's related to the data that we collect, which is not just quantitative but also qualitative. That sets us up for whether they're willing to go on this journey with us, right, the bias reduction journey.

Speaker 2:And so I've mentioned our bias reduction program. It has a lot of different parts. A core part of it is a bias reduction curriculum, or toolkit. The toolkit has basically evidence-based, one to two page tools. They're either how-tos or, not quite checklists, but things to keep in mind when you're making a job description or doing a certain part of your investment process. A lot of them are ideas for how you can improve processes to make them more equitable, with a real focus on creating stickiness in processes so they reinforce the kind of human behavior you want to see, and on having that standardization, rather than focusing only on the representation piece. Alongside these tools we also have animated video tools and case studies of examples from the field. So with these research-based tools and nudges, and we're constantly sharing new research or using new research to make new tools on this living platform, we're able to seed great ideas.

Speaker 2:But then, of course, an idea is just an idea until a person implements it or does something about it, and so there's a human component.

Speaker 2:We have quarterly coaching where we check in with each of our fund managers. We have point people: we try to keep someone in leadership involved, and then we usually have one or two people for whom DEI and bias reduction is seen as more a part of their jobs. We'll coach them for 30 minutes on average every quarter and check in on where the actual opportunities are to implement some of these processes, or where they're getting stuck as people try to change their behaviors, versus something that's just a good idea on a piece of paper, right? So I think we see this combination work. And then there's a cohort-building, community-building element to it as well, which creates stronger resilience and stronger sustainability of this work over time, because there will be attention to processes and community built, and both of those factors will allow this work to be a little bit stickier. So that's what we're doing throughout our investment period, as part of our program, to support our fund managers.
Speaker 1:Thanks so much for sharing all of that really great information. I was curious after you implement all of that training throughout the process, how do you measure your impact from all of that?
Speaker 2:Yeah, impact measurement is super important to us at Illumen and to me, because I used to work in outcomes-based financing. So when I joined, one of the first things I did was build out Illumen's theory of change, or our logic model. I started from: what are the North Star outcomes that we're really trying to generate in the world? Those were more quantitative: who's hired at fund managers, who's invested in, in terms of portfolio founders and portfolio CEOs, and how those companies are growing over time. But one big learning from my time in outcomes-based financing is that quantitative data requires context, and so quality metrics or process metrics, particularly qualitative ones, are really helpful for adding context to the quantitative metrics. I think this is really important within the work we do around bias reduction, because biases are about changing human behavior and attitudes. So it's not just about the D in DEI, diversity of representation, but also about understanding: are processes changing? Are they becoming more equitable? Do people feel that their firms are more inclusive? These are things that are a little bit more qualitative, right. They're still, I guess, quote unquote quantitative, in that you can ask them through a survey tool, but they're measuring things beyond just what the portfolio looks like. So those were more of our outputs: process metrics, quality metrics, you can call them whatever you want. We use an annual survey where we ask every single person in our portfolio, so you could be an operations associate, an executive assistant or a general partner, and you get equal say in filling out the survey with your qualitative opinions. There we're asking: why do you care about bias? What barriers have you experienced to reducing bias? Do you feel that you've changed your actions over the past year? Do you feel like your firm has changed its actions over the past year? For example, one takeaway we've seen is a discrepancy between how individuals felt they had acted and how they felt their firms had acted, and that allowed us to create more programming to support that specific opportunity, and I didn't want to say issue area, opportunity, that we identified through the qualitative data.

Speaker 2:So we're measuring qualitative data once a year and quantitative data once a year, because one other factor is reporting burden. We don't want our fund managers to see reporting as a burden. We want them to really feel that the data is useful, and not only useful to us, but also useful to them. So the final thing I'll share is that we always share the data back with our fund managers in dashboards. We have privacy agreements in place with all of the fund managers in our portfolio, so we're never going to share something at the individual level, or even at their fund level, but we do share it for our entire portfolio.

Speaker 2:So, across all the funds we've invested in, we share dashboards back with our funds, and they use that almost as a mini benchmarking tool, or a way to get ideas for where they could focus their time in future years. I think that creates a conversation around data as more of a feedback loop, and not data as impact measurement just for us, the investor, from a compliance perspective of, oh, we have to do our quarterly reporting, oh, it's time to report data. We really want the data to be useful to them, so we always make sure that that feedback loop is closed.
Speaker 1:Are there other interesting findings that you have seen from this reporting? What are some of the impacts that you've seen?
Speaker 2:Yeah, so one thing we measure, in addition to basically changes in demographics, is hiring, exits and promotions, so we're trying to understand more of the things that might relate to retention and inclusion. And, you know, general partnerships don't change that much year over year, so there's much more opportunity to create gender parity across full teams, for example, which we now have across our portfolio, and we saw that change happen over time. When we first started doing this impact reporting in 2020, that wasn't the case. We're always about six months behind, because we don't get our reporting from our fund managers until the end of Q1, but for 2023 we saw gender parity, and then we continued to see that in 2024. That was really exciting to see: fund managers are, again, not just hiring a specific type of person, but making sure their hiring processes are delivered in an equitable, standardized way across everyone who's doing the hiring, and making sure there are processes in place that create the stickiness that is an antidote to bias. That kind of work then drives those longer-term, North Star outcomes that are more about what your firm looks like, and whether those people are also staying, versus leaving and going somewhere else where they might feel more included.

Speaker 2:And then we continue to see bias itself, and actually this year it spiked even more, as the primary factor people identify as a barrier to reducing bias. I recognize I'm saying bias a lot, but when we first started, a lot of people said it was a pipeline problem, that there aren't enough qualified women or people of color, right. And now we're seeing that people are at least identifying their own biases. So it's not the fault of the individual; it's more, not fault, but the responsibility of the people who already work at the funds to reduce their own biases in order to make sure there's increased equity and inclusion at their firms. I think we've seen that shift from almost blaming individuals to taking more ownership over time, as we've gone through our bias reduction program and delivered it year over year.
Speaker 1:It's kind of, I don't know, ironic. It's funny to hear that at first they were saying, okay, there's not enough, but now they're recognizing their own biases are what's getting in the way of reducing bias, which, you know, makes sense.
Speaker 2:But yeah, biases are really just shortcuts our brains take to make life easier, like noticing whether a light is red or green.

Speaker 2:Often you're doing that almost in the back of your mind, while thinking about other things, while driving, right? It's not like 100% of your attention is being paid to it.

Speaker 2:And so we're all doing things to make our lives easier, and that's not a good or bad thing; it doesn't make you a good or bad person.

Speaker 2:So I think it's first identifying that you are biased, then identifying the situations where you might be bringing bias and might want to change, and then identifying how you're going to change those processes. And I keep talking about processes because, yes, it's about behavior change, but it's also about those processes and the data creating more stickiness, so that, you know, when you're tired you still brush your teeth, right, because it's a habit you've created from youth.

Speaker 2:So we're trying to create more habits, or more processes, that you just rinse and repeat, even if you are tired or under market pressure and want to revert back to what your biases would naturally do, which is pick someone who's more similar to you, from your hometown, looks like you, went to your alma mater, et cetera. We want those processes in place so that, even if you are tired, or wanting to default to your biases, you have these standards of, this is how we do things here. And so, yeah, there are just so many steps to it, and a lot of it has been about helping people identify even the first step of, okay, I want to change.
Speaker 1:Yeah, and behavior change, I think, is one of the hardest things that human beings can ever do, even when you do acknowledge that you want to make that change yourself. You know, even myself, in my day to day I want to make sure I'm aware of my own biases and changing that. But yeah, like you said, the first step is to have awareness, because it's hard to make change when you're not even aware of it. And it's interesting also to hear you talk about how you want to kind of slide backwards sometimes when you're feeling market pressure. So I was wondering if you could describe that sort of scenario a little bit more, and what are some of the processes that maybe some of your fund managers have used to get through that, so that others can learn.
Speaker 2:Yeah, so we know from research, not research we've done ourselves, of course, that biases are triggered by certain things: a sense of urgency, for one, and in investing everyone acts like everything is urgent, and some things really are urgent, because maybe a deal is about to close and you have to decide, am I in or am I not, and at what price. Market pressure creates that urgency, and then there's the fear of the unknown. These are all triggers of bias, and so we need antidotes, which often include stickiness and slowing down, to create processes that counter those triggers. Some processes that we've worked on, for example: one, when you're hiring, having two people rather than one on a hiring panel, so two people using the same scorecard to evaluate the same candidate. We recognize that isn't realistic for every single step, but maybe it's in the final step, or maybe, if you're hiring for the operations team, you have someone from a different team sit in, so they're bringing a different perspective, and they also use that rubric to grade the candidate. That's a way you're slowing down, because you're adding more perspectives, and that's eliminating the fear of the unknown, or the fear of difference, as a trigger of bias.

Speaker 2:Another example in investing is we worked with one of our funds to help create a more standardized process for their first initial interview or conversation with a founder. Obviously, when you're in an investment pipeline, founders need to build a relationship with funds; you don't just get a check from meeting with them once, right? And that first meeting is a ripe opportunity for bias if there's not a standardized way to ask those questions or introduce a fund to a founder. We've seen from research that women get asked more questions about downside and risk, which sets a negative tone for their business, whereas men get asked more questions about upside and opportunity, which sets an optimistic tone for the conversation. So, of course, as an investor you're going to leave and be like, oh, that guy's company, it could be a unicorn, it's going to be amazing. With the woman's company, you maybe don't even realize it, but you're like, I feel like there are a lot of risks. But that's because you asked about the risks, rather than asking what they could be in five years if everything goes well, right?

Speaker 2:And so we worked with our fund on introductory questions, so that everyone uses five of the same introductory questions in their first chat. And again, we recognize you need to move with speed and you need to get the information, so we're not saying use the same questions forever. It's more that everyone incorporates a few of the same questions in that first chat, to at least give people an equal playing field, an equal opportunity to share their wins and their progress, regardless of who the investor is. So you could get somebody who is more similar to you or someone less similar to you, and they're giving you a similar set of questions and the same opportunity to talk about your business. I think that's a really interesting process example that we've helped our funds implement. And going back to what our program is, right, we have our bias reduction toolkit, which is where the ideas get seeded, and then we have our coaching, which is where we help you actually implement them in a way that works for your specific fund. So we used both of those mechanisms to support this fund in creating those questions, then actually launching them and getting their whole firm of, you know, say, 15 investors to change their processes and change their behavior.
Speaker 1:That's so interesting. I've never thought about the fact that even the types of questions you ask, because you would think, oh, what are the risks in your business, or what are the opportunities, wouldn't have gender biases. But inherently, by asking those questions, there's a presumption that there is risk or there is opportunity.
Speaker 2:Exactly, and I think we also see from research that, like, women are more likely to answer the question that they're asked, whereas men are more likely to answer the question they want to be asked, and so that contributes to it even more right. In the same way that we see like women are less likely to apply to a job if they don't check every box, whereas men are much more likely to apply to a job where they only check a few boxes.
Speaker 1:I'm aware of that last one from personal experience.
Speaker 2:Yeah, and I've tried to combat it, right, but it's really hard to do as an individual. It's so hard because you're like, I don't want to be untruthful, you know. And I actually think this is a good point too, around another process or idea we've helped funds with. When we're applying for jobs, right, if we see that a company has posted that they have that awareness, like, oh, we know women are less likely to apply if they don't check every box, but we still welcome you to apply, it makes me more likely to apply, right? And so we've worked with our funds to do the same thing on their investment pages.

Speaker 2:So a lot of our funds, in order to enable cold outreach and not only warm outreach, meaning friend-to-friend introductions to potentially investable companies, but also to give a company in rural Texas that doesn't know anyone in Silicon Valley the opportunity, have a form on their websites that asks, you know, what round are you raising, what's a two-sentence description, what impact or issue area are you focused on, et cetera, all these questions that they use for a scorecard. But one thing we'll work with funds to do that's super simple is literally putting something like their DEI or ESG commitment on that page, or just a sentence or two saying, we know it's harder to get access to capital if you don't have the networks to be connected to coastal VCs, so we welcome all to apply, and we promise we'll look at every single application. Even that allows people to feel more willing to put their time into it, and so it's a similar small signaling thing that funds can do to invite more different perspectives and different kinds of founders to the door.
Speaker 1:That's so interesting, because I've never thought of that as a bias reduction tactic. I have seen that on job applications, even if you don't qualify for all of the things, we still welcome you to apply, and it does make you way more like, okay, they know that maybe I don't have everything, so it's okay. Yeah, exactly.
Speaker 2:And I think it shows that they're willing to be, not progressive, but open minded, and take a risk on learning and teaching. And I think that's similar to investors, where they're looking for good businesses. Being a good investor, exercising your fiduciary duty, is almost being contrarian to the market, right? It doesn't really make sense to me that a lot of people want to just invest in the companies that the big VC funds or their friends already invested in or already know. You want to find that contrarian kind of business opportunity that others aren't seeing, and so it allows you to also tie bias reduction work to financial returns.
Speaker 1:And actually that was leading perfectly to my next question. Okay, so they're taking a little bit more risk, and it's opening up new, kind of unseen opportunities. Have you seen that in the fund managers and the companies they're investing in? Are they investing in innovative products and services because they're willing to change their biases? Because I think, for example, there were so many products needed for women that just weren't made or weren't seen, and then it led to this huge opportunity, like Spanx, you know, or something like that. I don't know if that is something that you guys have seen in your work.
Speaker 2:Yeah.

Speaker 2:So I think that, at the end of the day, all of our funds are also trying to beat the benchmark, like we are as a fund, and all of our portfolio funds are not concessionary. We're not the kind of impact investors who are okay conceding returns in order to drive a greater impact. It's more that there's a fundamental belief, kind of to what you're saying around Spanx, that there's more opportunity being missed by the market. And this goes back to the research paper we did, right, where the track record is the same, everything's the same, just the race of the picture on the one-pager is different, and even that created bias. That showed that people are leaving money on the table. So our thinking is that people are leaving money on the table by not seeing opportunities like Spanx, or like a bunch of the women's health companies we're seeing rise up. And the thinking is that, one, investors with different perspectives, maybe investors who didn't come up through just investing but came up through education or as a doctor or whatnot, can better see some of those opportunities, and that if your processes are less biased, you're enabling founders to talk about the magic of their company, what makes them special, rather than the biases around what the founder looks like or where they went to college clouding your ability to see that true financial value.

Speaker 2:So one of our funds has an interesting example, where a founder that's now in their portfolio, from Texas, not Dallas but somewhere more rural, DM'd them on LinkedIn, just one investor.

Speaker 2:And the investor was kind of like, I mean, you can email me your stuff, but I've never heard of these people, you know. So they got them into the CRM and the initial process, but then kind of said no. And then, a few months later, they did this interesting thing where they have somebody go into the rejected pile every few months and look through all of it. And this was a different person. This happened to be a woman, and it wasn't a women's health company or anything, but just somebody with a different worldview. And that person was like, oh, this company looks interesting. So they brought it out of the reject pile and were like, let's take a closer look at it, had a bunch of team members meet with them, et cetera, went through the diligence process, and now they've ended up investing in them.
Speaker 1:Oh, wow, that's so cool. That is really interesting. In doing this work, how is this reshaping the tech industry, the businesses that are growing, the results that you're seeing? Or maybe just more in general, have you guys noticed anything about that?
Speaker 2:So it's interesting when we talk about what's tech-enabled, right?

Speaker 2:Because I think there's a lot of overlap between what's tech enabled and SaaS and what's venture backable, because, at the end of the day, venture investors particularly are making a lot of bets and hoping one or two of them grow enough to be the one that returns the entire fund, or is the driver of that entire fund's success, and they might be ready to write a lot of other companies off to zero, right? So I think it requires almost this rapid growth mentality, and that often relates to what's tech enabled. But I think we're seeing more companies where there is a tech-enabled component but there still is a reliance on a human component, and I think that's what a lot of this bias reduction work around human behavior allows people to see, in a world that's increasingly moving towards AI being at your fingertips and everyone using AI. I used AI to help me prep for this interview, I'm not gonna lie about that, right?
Speaker 1:I use AI to help me prep for this interview.
Speaker 2:Both of us are using it. Yeah, we're also tech enabled. But I think, at the end of the day, you and I are the ones having this conversation, right? And so the kinds of companies we're seeing emerge are ones where there's still a human component. There's one where, yes, it's tech enabled, but it's providing healthcare in the context of schools, and so it's being funded by a bunch of ed tech VCs, but it is still ultimately a service that's happening physically at a school, with students who aren't otherwise maybe getting the healthcare access they deserve, even a basic checkup or something. And so I think we're seeing a lot more of this melding across sectors as well, where tech is genuinely enabling you to understand, at your fingertips, how to change behavior day to day. Another company that's interesting is one that provides customers with a food score for understanding the healthiness level, so if you're looking at peanut butters, you can understand which ones are actually healthy, not just which ones say they're healthy.

Speaker 2:They're going under the hood and looking at the ingredients, but you can use that as a customer. If you have SNAP benefits and you're at the grocery store, you can use it really easily to understand what gets me healthier food. And they're working on a potential initiative where maybe you could actually get more benefits from the government if you make healthier choices, because we've seen that historically in WIC, and so there's the opportunity to do the same with people who are on SNAP as well.

Speaker 2:So I think all these things are interesting, right, because they are tech enabled. You're using tech to maybe bring in doctors who aren't in that community to provide more specialized healthcare; you're using technology to help customers make decisions. At the end of the day, it's a student who's getting healthcare, it's a customer who's ultimately walking through the checkout line and deciding which peanut butter do I want, which yogurt do I want, et cetera. So it still requires that human behavior element to understand the value in that work, and also to understand which of these five companies doing something similar should I fund, right? And that's, I think, where the tech and human sides come together.
Speaker 1:And that's really interesting, because obviously right now, in this AI revolution, everybody's scared that tech is going to take over everything. But it's interesting to hear you say that the companies that have that human element are, correct me if I'm wrong, the ones that you're more likely to invest in, because you see those human benefits. Even though they're still tech enabled, it's still, at the end of the day, people interacting with people in some way or another.
Speaker 2:Yeah, yeah, and I should say, we don't make the ultimate investment decisions, right, we invest in funds. But for me as an individual, that is where I have the optimism, to the name of this podcast, optimism around the use of technology and how technology can genuinely improve people's lives rather than harm or replace them. Because I think, at the end of the day, there are a lot of institutions that are more historically entrenched, for example public schools, that aren't going away yet, right? You could add as much technology as you want in those schools, and that's not really changing the lives of people like my students in Mississippi, who live in a rural community and have a single parent working three jobs. It's maybe changing their schooling experience, but it's not changing their lives outside of school. And so I think there are a lot of companies being more innovative around how to bring tech to think about the person as a whole person, and that is where, I think, there's going to be a lot of value.

Speaker 2:And in healthcare, for example, the care coordination space is very interesting. So working with another historically entrenched, almost bureaucratic set of institutions, hospital systems: how do you help them bridge the gap between the hospital system and the moment when the customer, the individual who was maybe in the hospital, now needs to go back to their primary care doctor, or back to navigating this complex system on their own as an individual? Because once you leave a hospital, it's kind of like, okay, hopefully you're healthy, good luck, right? So I think there's a lot of interesting work being done around the care coordination space.
Speaker 1:Cool, and that definitely also gives me optimism, which, as you noted, is the whole purpose of this podcast: to share stories of how people and technology and nature are all coming together to make our future better, and I feel like these are great examples of that. So is there anything else you would want to share around what gives you the most optimism in the work that you're doing for our future?
Speaker 2:I think the only other thing I'll say is that we've seen from research published recently, I think it was an HBR article, that looked historically at organizing movements and at what's helped with resilience in the past. And this is a time where, again, our approach to bias reduction isn't that you're a good or bad person; it's that processes can better enable the outcomes you want to see, close outcome disparity gaps, and also, hopefully, not as a promise, but hopefully, make people more money, because they're seeing value where they're not currently seeing value. And a lot of that research has shown that community is a core element of this work. So for us, we're tech enabled in that we have a tech-enabled community. We actually use a vendor that's one of our underlying portfolio companies; we did a whole vendor diligence, and that was just the one that scored the best. But we're building this community where we want all of our fund managers to come together and support one another, so that we're not the only central node. We don't want to be above everyone, or be the only reason they're focusing on it. We want people to support one another and build the muscle to really have that community and ask one another questions, rather than having to go only through us.

Speaker 2:And I think that community piece is really interesting, and it's work I'm seeing done at a lot of VC firms. So I'm in a community of people who are head of platform or head of community at other VC firms, or at other interesting firms working on AI. It's called community of community, which I feel is such a term, but it's a community of people for whom building community is part of their job. And one recent friend through it is the head of AI at Fast Forward, and they work with tech nonprofits to support responsible AI. So I think there's a lot of interesting work being done around that.

Speaker 2:We all recognize that AI is going to be a part of everyone's day-to-day lives and work. How do you do that in a way that's responsible? How do you do it in a way that acknowledges that a lot of the data and behaviors it's being trained on are what has historically worked, gotten to the top and become entrenched, which isn't necessarily the world we want to see? But I think it's really interesting that a lot of these people have community building as part of their jobs, because that will be an important human element no matter how much technology there is.

Speaker 2:You see that there are communities being formed, even on Reddit or more informal spaces, around what are the best ways you can prompt Claude, what are the best ways you can prompt ChatGPT to get what you want, right? And I've seen some really creative uses: here's how to prompt it to get a good chief of staff description based on the five challenges your company is facing as a Series A company, right? And so I think there's a lot of really interesting work being done out there by others, not just me, and I want to lift that up, that is focused on community building in tandem with AI.
Speaker 1:That really honestly makes sense to me, because anytime I'm going through anything, if I hear that somebody else is going through the same struggle, then I'm like, okay, I'm not alone. And if you expand that to, you know, company scale, network scale, it definitely makes sense that that builds resilience. And I think, especially right now, we're in a time where we all need strength and community and resilience to help achieve our goals. And I just love hearing that building community in this way and doing this bias reduction can be done while at the same time also wanting to achieve your bottom line and earn money, and they're not, you know, opposites, you can do both at the same time. So I think all of that, and hearing all of the great work that you guys are doing, is definitely part of what gives me optimism and why I love doing this podcast. If there's any last note or piece of information that you would like to share with our listeners, I welcome you to share anything.

Speaker 2:Basically, continuing to pay attention to where there are opportunities to slow down, and that slowing down can actually potentially drive greater value in the future. I mean, I sound so philosophical, but it's almost more of a mindset beyond just financial returns. It's a mindset I'm trying to adopt in all of my life, using processes and using habits to help me slow down and really think about, do I want to do this, or am I doing it because it's what I've always done? And I just turned 30, so I feel like that's a classic, oh, you're in your 30s, you're thinking about what life is going to be like. But I think that's relevant to investing, and it's also relevant to all of life.
Speaker 1:I was just thinking, I can't help but think: recently, you know, I've gotten into meditation, and I use that sort of thinking in my work, because I do my best problem solving when I slow down, and that's what helps me solve those problems better and achieve our goals. So yeah, the same processes apply across the board, even when you're a fund manager trying to decide which founders to invest in, or somebody who's interested in founding a company on your own and thinking, can I do it? Am I the right fit for this sort of thing? You know, at all levels. So that's really cool.

Speaker 2:Exactly.

Speaker 1:Well, thank you so much for coming on. I have really enjoyed this conversation. I definitely feel more optimistic, and I appreciate your time.
Speaker 2:Yeah, thank you so much for having me. This was so fun and, yeah, I really appreciate it.

Speaker 1:Thank you so much for listening. Don't forget to subscribe to The Optimist Circuit on Spotify or Apple Podcasts so you never miss an episode, and let's keep the conversation going.
Speaker 1:Follow us on LinkedIn, YouTube and Instagram at The Optimist Circuit for more insights and inspiration.
Speaker 1:Until next time, stay optimistic, stay curious and stay inspired.