Nonprofit Nation with Julia Campbell

Nonprofit Program Evaluation Made Simple with Chari Smith

Julia Campbell Episode 80

Are you overwhelmed by the thought of conducting a program evaluation for your nonprofit organization? Don’t know how to structure it, or even where to start?  

There are many ways to do program evaluation, making it difficult to know which model is best or which format to follow. 

Help is here. My guest this week is Chari Smith, founder of Evaluation into Action, which helps nonprofit professionals create realistic and meaningful program evaluation processes. Chari believes evaluation should be accessible, practical, and usable, and she's on the podcast to discuss her book, Nonprofit Program Evaluation Made Simple: Get Your Data. Show Your Impact. Improve Your Programs. The book outlines a clear approach, filled with real-world stories as well as examples of evaluation plans, surveys, and reports.

Some of the topics we discuss include creating a logic model, measuring impact, building a culture of evaluation, survey design basics, clear reporting techniques, and more.


Upcoming event: Survey Design Made Simple

Tuesday, April 18, 9–11 a.m. Pacific Time

Registration information is here. Use the discount code 'getyourdata' (all one word, all lowercase).


Connect with Chari: evaluationintoaction.com


About Julia Campbell, the host of the Nonprofit Nation podcast:
 
Named as a top thought leader by Forbes and BizTech Magazine, Julia Campbell (she/hers) is an author, coach, and speaker on a mission to make the digital world a better place.

She wrote her book, Storytelling in the Digital Age: A Guide for Nonprofits, as a roadmap for social change agents who want to build movements using engaging digital storytelling techniques. Her second book, How to Build and Mobilize a Social Media Community for Your Nonprofit, was published in 2020 as a call-to-arms for mission-driven organizations to use the power of social media to build movements.

Julia’s online courses, webinars, and keynote talks have helped hundreds of nonprofits make the shift to digital thinking and do effective marketing in the digital age.

Take Julia’s free nonprofit masterclass, 3 Must-Have Elements of Social Media Content That Converts

Julia Campbell  0:00   

It's March Madness, people. No, not the sportsball March Madness. It's my 30th anniversary of being in business, and I have a goal of hitting 100,000 podcast downloads. Here are three ways to help. One, download one or more of your favorite episodes, including Setting Your Fundraising Mindset with Rhea Wong, Ethical Storytelling with Caliopy Glaros, What the Best Fundraisers Do Differently with Sabrina Walker Hernandez, and my special series on What's Next in Social Media for Nonprofits, just to name a few. Number two, share an episode with a friend or a colleague; you can go to pod.link/nonprofitnation to find descriptions and links to all the episodes. Number three, take a screenshot of the podcast and share it with your network. Be sure to tag me so I can find it and share it out. Also, I truly appreciate all of you, your time, your attention, and your passion to make the world a better place. Now, let's get to today's episode. Hello, and welcome to Nonprofit Nation. I'm your host, Julia Campbell, and I'm going to sit down with nonprofit industry experts, fundraisers, marketers, and everyone in between to get real and discuss what it takes to build that movement that you've been dreaming of. I created the Nonprofit Nation podcast to share practical wisdom and strategies to help you confidently find your voice, definitively grow your audience, and effectively build your movement. If you're a nonprofit newbie or an experienced professional who's looking to get more visibility, reach more people, and create even more impact, then you're in the right place. Let's get started.

  

Hi, everyone, and welcome back to another episode of Nonprofit Nation. I'm just really excited that you've joined me today, along with my guest. And a question for you: are you overwhelmed by the thought of conducting a program evaluation for your nonprofit? You don't know how to structure it, or maybe you don't even know where to start. And there are many ways to do program evaluation, which makes it difficult to know which model is best or which format to follow. But help is here. My guest this week is Chari Smith, founder of Evaluation into Action. And Evaluation into Action helps nonprofit professionals create realistic and meaningful program evaluation processes. And Chari believes evaluation should be accessible, practical, and usable. And she's on the podcast today to discuss her book, Nonprofit Program Evaluation Made Simple: Get Your Data. Show Your Impact. Improve Your Programs. I love that title. And the book outlines a clear approach, like a blueprint, filled with real-world stories as well as examples of evaluation plans, surveys, and reports. So welcome, Chari.

  

Chari Smith  3:18   

Thank you for having me here today, Julia. 

  

Julia Campbell  3:20   

Yes, I'm excited. And I want to mention that this is being released on March 15. There's an upcoming event that you're coordinating, Survey Design Made Simple. It's going to be on Tuesday, April 18th, nine to 11 a.m. Pacific Time. Registration information is in the show notes, so make sure you click on that link. And you can use the discount code getyourdata, that's all one word, no spaces, get your data, for some money off that registration. So thank you for that; that's for Nonprofit Nation listeners. So I appreciate that, Chari.

  

Chari Smith  3:58   

Of course happy to. 

  

Julia Campbell  4:00   

Yes. And where did you get your start? We usually start with your journey, your journey into nonprofit work, and then how you started to focus on program evaluation.

  

Chari Smith  4:09   

Yeah, that's a great question. So my journey into program evaluation started here in Portland, Oregon. In 2001, I landed a job at the Northwest Regional Educational Laboratory. That is a mouthful. A few years ago, they rebranded, and now they're called Education Northwest. And there I learned about gathering data through a specific framework to help these programs understand what is going well and where improvements might be needed, so that they are doing data-driven decision making. And I just fell in love with it. I just love the whole process. And then in 2005, I broke out on my own and started Evaluation into Action. And that gives me the opportunity to work with a range of organizations; at Education Northwest, as the name implies, it was all education programs. So I have the great honor of working with so many different organizations today.

  

Julia Campbell  5:03   

And you wrote this book, which I understand, from writing a book myself, can be incredibly challenging but also incredibly rewarding. So yeah, talk about your book. Who is it for? And why did you write it?

  

Chari Smith  5:18   

Well, my book, Nonprofit Program Evaluation Made Simple... I wasn't planning to write a book. I don't know how many people out there are like, yeah, I'm gonna write a book. Okay, I wasn't planning on it. I do a lot of different workshops at different kinds of conferences and for organizations. And every time I did a workshop, I would have someone come up to me and say, hey, do you have a book? And I was always like, no, I do not. And so one time in 2017, I was presenting on building a culture of evaluation at the Oregon Program Evaluators Network, also referred to as OPEN because it's easier to say. Anyway, after I presented there, someone came up to me and said, do you have a book? And without a thought, what tumbled out of my mouth was, not yet. And I thought, wait, wait a minute, universe, what just happened? And this little spark started of, like, yes, you should just write a book. And it's really primarily for nonprofit organizations, nonprofit professionals. But it can also be for evaluators who are looking for a different way to approach program evaluation, and to understand how to do program evaluation collaboratively with nonprofit organizations. So that was kind of the start of it. And what I get really excited about with the book is there is a companion website. And in the book, there's the web address as well as a passcode. And on that companion website are real-world evaluation plans, reports, templates, and tools, and I update it on a regular basis with generous permission from the organizations that I work with.

  

Julia Campbell  6:53   

That's incredible. So you not only have the book, which is going to give people all sorts of fantastic ideas and how-tos, but you also actually have the templates and the cheat sheets and those kinds of documents available to them. I think that's wonderful that you did that. That's fantastic.

  

Chari Smith  7:10   

I learn by example. Like, you can teach something to me all you want, but actually show me, like, a real-world example. And it would be too long, too cumbersome, to put an entire evaluation plan in a physical book. So on the companion website, there are full evaluation plans. And you can kind of see the structure that I talk about in the book, and how it's applied and shows up differently and is customized for different organizations.

  

Julia Campbell  7:36   

And I'm sure in the research for this book, and also just in your work every day, you come across some maybe common pitfalls, some common challenges. Can you talk about some of these commonalities when nonprofits are planning to do a program evaluation?

  

Chari Smith  7:55   

Yes, I can. So I have a big grin, because I get really excited about talking about some of these common pitfalls and the easy fixes to avoid them. And there are a number of them, but I'm going to talk about one in particular that I probably see most commonly, and that is to be realistic about what you can collect. I cannot tell you how many organizations I've met with, and they hang their head in shame and admit to me, well, we did a survey, and now we found no one has the time or the skills to analyze the data. So the surveys are just in the filing cabinet, or they're on the network, or they're in our SurveyMonkey account; no one has the time or the skills to analyze the data, as well as synthesize it into a report so that the data can be used. So the data are just sitting there. Participants spent their time completing the survey, and no one's using the data, which breaks my heart. And this is such an easy fix. This is where you bring in that really organized person to be at the helm of the program evaluation planning process, someone with that project management kind of gene in them. Because in the beginning, when you're doing a survey, one of the questions needs to be: if we're doing this survey, at the end of it, who's going to analyze the data? Who's going to synthesize it into a report? How are we going to share out the findings with the participants and the community at large? How are we going to use the data? Answering those questions in the beginning, when you're also creating the survey itself, will ensure that not only do you gather the data, but you actually are able to analyze it, use it, and communicate it.

  

Julia Campbell  9:32   

Having the end in mind is so important. Especially, I mean, I teach marketing plans and digital marketing, and it's exactly like you said, there's just so much that happens. Oh, we have to set up all these platforms, and oh, we have to create all this content, and we have to register all these URLs. And there's so much franticness around the platforms and the tools, but very little thinking about the why. Why are we doing this? What do we hope to accomplish? How are we going to use these channels? So thinking all of those questions through, I would imagine, is sort of the key pillar of any program evaluation.

  

Chari Smith  10:10   

Absolutely. Project management and planning are key, right? You have to think about it with the end in mind. I love that, because I think that's just really important to avoid that really common pitfall.

  

Julia Campbell  10:22   

And also, I love that you mentioned the people component, you know, who's actually going to do this work? This sounds great, or maybe a funder requires it, or maybe a board member asked for it, but who's actually going to do it? So that leads me to my second question, because it's a team effort. You can't just be the development director or the grant writer or the marketing person or the executive director and just sort of create this in a vacuum. So how can nonprofits build this buy-in with their teams and address kind of any staff resistance or fears that they might have?

  

Chari Smith  10:59   

Oh, this is a big one. This is what got me so excited about writing the book to begin with, because I felt like this was a missing link; people were jumping into creating a plan or doing surveys, they were jumping into those pieces. It's like building a house before you've created the foundation. And building that buy-in and addressing staff resistance or fears helps to build that foundation so that you have a solid house you're building on top of it, right? Program evaluation, at its core, is a learning opportunity. And oftentimes, people are doing program evaluation because funders require it, right? I'm sure that you've seen this too. Like, we have to get these data because the funder is requiring a report in two months, or whatever it is. So there's, you know, there's panic with it, there's a resistance, because, oh my gosh, I'm already so busy doing all of these different things, I don't have time to gather the data. Now, when people feel mandated, because it's required by funders, they show up differently, right? They may also have a fear of, what if the data show that we're not meeting our goals? You know, will we lose the funding? Will I lose my job? Will the program be discontinued? All of this fear, whether it's conscious or not, I think feeds into how people show up and do program evaluation. If we flip that script, if people change the narrative to, don't we need data to know if we are making the difference we intend to make? Then they put themselves, as the nonprofit, in the driver's seat of determining what kinds of data they need to collect, rather than it being mandated by a funder. I will say you still have to gather the data required by funders to, you know, satisfy those report requirements. But there's no reason why you can't have a conversation with a funder once you have a plan in place that works for you as the nonprofit and say, this is what we're already doing. Can we provide you these data? Will that fulfill the requirements that you have? Julia, I'm not going to guarantee that'll work 100% of the time. But my experience has been, when I work with organizations and they have that conversation with a funder, the funder is just super happy that there's a program evaluation plan in place, and they're like, that looks great. Not gonna guarantee it, but that's been my professional experience so far: having that dialogue is really important. Right? About why are we doing program evaluation work? Like, what is it all about? If it feels like it's mandated, there's going to be resistance. If it feels like it will help us understand whether we are doing the best possible job at serving the people we want to be serving, people show up differently. It's a mindset. So once that mindset shifts for people, then we make sure everybody is on the program evaluation train before it leaves the station. Because what I saw happen early in my career is that some people would be on the train, or maybe someone higher up would require everybody to be on the train. That doesn't work, don't do that. But once everybody is on that train and it leaves the station, everything that follows, gathering data, those pitfalls that we were talking about, those don't occur, because there's buy-in to doing it. So then you're successful at measuring what matters.

  

Julia Campbell  14:15   

The mindset shift is incredibly important. And I'm thinking about the mindset shift that often has to happen in fundraising, where fundraisers, to be effective, really need to move from this mindset of, oh, I'm bothering donors, or donors are fatigued, or donors don't want to hear from me, to: I'm providing them with an opportunity to make a meaningful gift and to have agency and to be involved in something that they're passionate about. So this mindset shift you're talking about around evaluation, rather than something like, oh, it's on the to-do list, or it's on our plate, or we have to do it and it's a slog, to: let's see how this can actually improve our services, or maybe shed light on some things that we knew were happening that we are excited about. And then we can help implement things that are working and maybe get rid of some things that aren't working. So that mindset piece is just so important, because I know people, when they see this episode, are going to want to jump right into the tactics. We're going to do a little bit more of the tactics, you know, what tools should we use for surveys, and how do we get more people to respond to surveys. But I want to talk about measurable outcomes. Now, I remember when I was a development director, and I worked at a domestic violence shelter, and we got a grant from the United Way. And it was when United Way was shifting over to impact and logic models, and we could no longer just rely on outputs, we had to create measurable outcomes, and it was so challenging. So for nonprofits struggling with this, how can we start to create these measurable outcomes?

  

Chari Smith  15:59   

Well, that is a great question, because the key is that it has to be a collaborative process. And I joke around when I do, you know, workshops and whatnot that collaboration is probably the word I say the most frequently, because collaboration is key for a successful program evaluation process. And it starts with making sure everybody who either collects, reports, or touches the data in any way is in the room to define those measurable outcomes together. And this is where you can make sure the data being collected are realistic to collect, because program staff will raise their hand and say, no, no, I know we can't collect that, let's maybe compromise and talk about collecting these other things, or whatever they might be. So creating measurable outcomes collaboratively really ensures that when grant writers need to put those measurable outcomes into a grant proposal, they know with confidence program staff are gathering those data, right? Because too often, I've heard the story that grant writers do that in a silo, and they put in outcomes to secure the funds, and the program staff may or may not be already gathering those data. So that can create some tension and some report writing at 2am. Easy fix, easy fix: you have to collaborate. I want to share an example with you. I work with Project Lemonade, and they have an internship program for foster youth aging out of the system. So we brought together staff, participants, and partners to define measurable outcomes, that is, what exactly do you expect to have change as a result of the program activities? So together, talking it through, we created seven measurable outcomes. And I just want to read one of them for the listeners today, and that is to improve life skills, such as communication, social interaction, and working in a team. By having that very specific, measurable outcome, development staff now know what to put into grant proposals, and program staff are gathering data around that particular outcome that can be reported out. So everybody's literally on the same page about what they expect to have change. I will also say, like you were saying earlier about impact models and logic models, measurable outcomes are a cornerstone. Whichever model you choose to do, you have to have those measurable outcomes in there, right? Because an impact model or a logic model basically is a visual summary of what your program does and the change it's expected to make. Measurable outcomes are a part of that story; they need to be in there. But just to drive the point home, it has to be done collaboratively. Do not assign it to one person to go off into a cubicle and create it by themselves. It has to be a collaborative process.

  

Julia Campbell  18:46   

What about specifics in these outcomes? Because I found that when I was writing grant applications and, you know, working with some funders, they might require you to put numbers in, you know, we will improve these job skills by 50%, something like that. So how do you calculate those numbers? Or, if you don't need to put them in, do you recommend leaving them out?

  

Chari Smith  19:13   

Well, when you put those in, I just refer to those as targets, right? And yes, you're right, some funders do require that. And I feel like it's somewhat subjective: what do you feel is realistic to expect to have occur within the program that you're doing, right? Are you doing this program three times? If you're doing it three times, you know, over two months in, say, classrooms, how much change can you truly expect, right? So I think this is an opportunity sometimes to have that conversation, potentially have that conversation with the funder, you know, on expectations. But in terms of how to measure it, then yeah, you need to gather some baseline data to understand where people feel they are initially, perhaps in, you know, improving life skills, for example, and then measure that over time so you could report out a percentage that have changed to see if that target has been met.

  

Julia Campbell  20:05   

Okay, that is helpful, because I imagine with these evaluations, and I know you talk about this in the book, there are quantitative and qualitative kinds of outcomes that you can measure and talk about. And I usually recommend putting in a story, if there's a narrative place in the report where you can actually add characters beyond the character count. Putting that kind of context, that story, in amongst all the numbers, I think, helps tell a better, fuller story of what the organization's impact is.

  

Chari Smith  20:34   

Yeah, 100% agree. I think you have to have the numbers and the stories to be able to talk about progress towards a particular outcome, right? So when you report it out, you restate the outcome, and then you share the numbers and the stories that align to that outcome to show to what degree progress was made. And that's how the alignment works, right? So you create the measurable outcome, you gather data that align to the measurable outcome, and then you report out the data, aligning back to or pointing to that measurable outcome. So there's consistency throughout.

  

Julia Campbell  21:08   

That's important. And one of the key components of a program evaluation is the survey or surveys. I know there are a lot of myths and misconceptions and pitfalls and challenges in creating a survey. So what goes into a useful survey, so that we can measure what matters?

  

Chari Smith  21:31   

Well, that's where you build on those measurable outcomes, right? I think that is an excellent point, because it leads perfectly into this idea of alignment: you have to have alignment between your measurable outcome and your survey items. Julia, one of the most common questions I get is, Chari, we need to do a survey, but we're not sure what to ask. And I say, well, what do you expect to have change? You have to define the impact you expect to see occur; then you can measure it. If you just create a survey, just kind of like, you know, let's just brainstorm what we want to ask, then it's not going to align to what you expect to have change, so it's harder then to tell your story, right? Once you have an outcome, it provides kind of a plotline for your story. So I'm going to talk through another example. I work with Northwest Real Estate Capital Corporation, and they manage a number of affordable housing communities. In 14 of those communities, they have a resident services program. So we set up a program evaluation to understand what is going well and where improvements might be needed, right? So collaboratively, and I don't know if anybody's keeping track, that might be the fifth time I say collaboration, but collaboratively, we defined five measurable outcomes to meet everybody's data needs, right, so that all staff are getting the data that they need. So one of those outcomes is around building community, and I'm just going to read it to you, and then I'm going to read off one of the survey items, because I want everybody to hear the alignment. When you create your outcome and you're creating your surveys, you should be creating your survey items based on what's in your outcomes, right? So one of their outcomes is to improve socialization within the community, which will lead to reduced feelings of stress and isolation. So again, all about alignment. We did a lot of different survey items to measure that particular outcome, but one of them, that residents completed on the survey, is: because of the Resident Services Program, my overall stress level is... and they have four choices: better, the same, worse, I don't know. So do you hear that alignment? I'm hoping everybody listening is nodding their head: yes, I hear the alignment. So, right, you hear the alignment?

  

Julia Campbell  23:46   

Yes. It's not a throwaway survey question; it actually is going to lead into the measurable outcome.

  

Chari Smith  23:52   

So once you have the data, right, you have the number, in this case a quantitative piece of data. So then you can report back out: X number of people, or X percentage of residents, said their stress level has gotten better, the same, or worse because of the Resident Services Program, and whatever they had to say about that. You're asking them to comment on, did this change because of the program, right? And then you're reporting out those data. So again, we asked more survey items than just that, but we were able to then speak about building community and what that looks like. And it's really empowering to have those data to understand what is really going well and where improvements may be needed. Like, you really uncover things you would have otherwise not known. Another thing with survey design, so that you're measuring what matters, is to go through your survey, and if there are any questions you're asking just because you feel like you should, take them off your survey. Only ask questions where you will use the data in some capacity, either to report out to funders, for program planning, or for some other purpose. A lot of times, I feel people ask for data because they feel like they should. They really should ask: really, would you use it? Go through it with the lens of, will you use it? If the answer is no, don't ask it. Right? That's how we measure what matters.

  

Julia Campbell  25:14   

So how can we get our survey response rates up? I know this is something that people do struggle with. But first of all, I want to comment on that piece of, will you use this? Because I work with clients on donor surveys or, you know, constituent and stakeholder surveys around, you know, what would you like to hear from us? What kind of stories, marketing content? Where are you on social media, that kind of thing? And I always say, do not ask a question unless you plan on using that data, because it takes away time, and it probably decreases the survey completion rate if you just have filler questions. So short and sweet, I always agree with that. So what is your secret to increasing survey response rates?

  

Chari Smith  26:00   

Communication before, during, and after. So ideally, before, when someone first signs on to your program, if you do an agreement or there's a registration form, you can include a little paragraph of: by participating in our program, you agree that you will be asked to complete surveys, participate in focus groups, whatever your data collection methods are. Basically, you're setting an expectation when they first become a part of your program. So you're communicating beforehand: hey, we are going to be asking you to participate in data collection activities, and your input is really valuable to us, so please plan on participating when asked, right? So you're setting an expectation. Then, when you actually have a survey that you're doing, maybe a month before you do the survey, it goes out in the newsletter, maybe on social, wherever your audience is. You want to let them know: hey, in one month, we're going to be sending out a link to a survey. The purpose of the survey is to understand what you think about our program; we can only improve if we get feedback from you. So then you're setting it up, right? Then, once it goes out, I typically will leave it open for at least 10 days. So while it's open, there are reminders, right? Maybe every three or four days. And that's a good benchmark, 10 days, okay, that's generally what I do. It can vary. Sometimes it's longer, sometimes it can be as much as a month; I work with organizations on what's going to make the most sense for their participants, right. But 10 days is generally what I do. So you send it out, and then you send a reminder: thanks if you've already completed the survey; just a reminder, it's due by... and remind them of the date, remind them why their voice is important, and also let them know you're going to be sharing back the results. So for everybody listening right now, raise your hand if you've ever completed a survey and never heard about the results of that survey. Almost every time, right? Absolutely, I bet you are all raising your hands right now.

  

Julia Campbell  28:03   

Right? Or what is going to be done with the survey responses.  

  

Chari Smith  28:07   

So just like before, when we talked about some common pitfalls and being realistic, the same is true here: you want to have an intentional plan. It doesn't have to be a robust plan, just an intentional plan on how you're going to share out the survey results. So during the survey, you're reminding them about it. And then after the survey closes, one last email to everybody: thanks to everybody that completed the survey, our response rate was whatever it was, and look for a summary of survey results in the next month, or whenever it's going to be. And then do that, share that survey summary, right? The survey summary can be very high level, very simple: here are the five things we learned, here are the five things we found were going well, here are the action steps we're going to take based on the data, and we look forward to you continuing to provide your feedback in the future.

  

Julia Campbell  29:01   

Wow. And that really leads into my next question, which is reporting out this data. So we've spent a lot of time, and we've been really intentional, creating these measurable outcomes, crafting a survey or some kind of data collection mechanism where we're collecting the data and collecting the stories. So in your blog and in your book, you talk about visual communication being important, and I totally get that, no one wants to see, you know, a bunch of numbers on a chart, and about using data visualization to create compelling reports. So what are some ways that, you know, small nonprofits can do this kind of on a shoestring?

  

Chari Smith  29:41   

Well, you said it right. Gone are the days of just text and tables, thank goodness, because those were such long reports. We want interesting graphics, and it doesn't have to be really fancy, to be honest with you. If you are on a shoestring budget, you can just change the font size, change the color, put it in your brand colors. So perhaps when you're sharing out, you know, 87% said this, your 87% is larger, maybe it's a 16-point font, and it's green if that's part of your brand colors, right, with a statement that also supports that 87%. So you have the numbers and the stories and the outcomes. I do have, on my website under case studies, some examples on there; you don't have to have the book to access my website, you can just go to my website, go to the case studies page, and there are examples you can look at. Because again, I think learning happens by example. And what you'll see there is the usage, like I was saying, of different font sizes, different colors, photographs, different graphics, so you're visually sharing the results with people. It's so true that a picture is worth 1,000 words. So I don't think you have to... I'm not a graphic designer. You don't have to be a graphic designer to do this. You just have to think about, visually, how can you communicate the results? I mean, you're still going to have words on the page, but there are opportunities. And if you need support, yes, I talk about it in my book, but one of the great data visualization gurus is Ann Emery, and she has a blog. And she has so much great information; I think she even has classes and academies and so on. And she does a wonderful job of breaking down, like, here's how you maybe have shared demographics in the past, and here's a more interesting way to share your demographic data in the report. So she does a lot of wonderful work of teaching by example. So again, I do talk about it in the book, but if someone's really looking for some more information, I do recommend her as a resource.

  

Julia Campbell  31:39   

That's fantastic. That's really incredible. And I know, you know, just to kind of shift gears here a little bit, that you are incredibly creative. And we've talked about writing a book, which is one way to be creative. But you also write songs, musicals, and plays. So how do you keep up this creativity? And tell me about your creative side and some of the things that you do.

  

Chari Smith  32:06   

Yes, I have been creating, and particularly writing songs on the piano most of my life. 

  

Julia Campbell  32:12   

Oh, so my son just started taking piano lessons.  

  

Chari Smith  32:15   

Ah, then you'll love this book. Okay, so I was a piano teacher a million years ago, like a couple of careers ago, because I started out wanting to play piano, wanting to be a rock star. I went for it and started out at Berklee College of Music. That's not where I ended up, but yes, my plan was to be a rock star; you can all see how that went. But I do love writing. My most recent creative adventure was in March of 2020, when COVID first hit. My daughter is an artist; she was in high school at the time. And we did a children's picture book called The Piano, and it's about the friendship that evolves between a musician and her piano, and how it evolves over time as she gets older, and it's told from the perspective of the piano. I wrote the original story, actually, in '98, before my daughter was born, but my daughter has always been an artist. So we just busted it out. We got very focused, since no one knew what was going to be happening in spring of 2020; it was such an uncertain, scary time, and we just channeled that energy into a creative project. We got very lucky and found a publisher; Black Rose Writing published it almost a year ago, on February 3, 2022. And the best part of that creative project was creating it with my daughter, because, you know, she's a freshman in college now, and, you know, when our babies leave the nest, it's a little bit hard. But yes, we have this book forever and ever that we created together, and just our Sunday meetings and talking through the process brought joy into a time that was extremely uncertain. So yeah, that is something that we did. So thanks for asking about that.

  

Julia Campbell  34:04   

Do you want to talk about what your dream is? Because you said it to me, you wrote it down, right? Your dream musical? I would love this. I love musicals, and I think this is amazing.

  

Chari Smith  34:17   

Okay, so anybody listening that maybe has some funding dollars around, let's collaborate on this. My dream is to get a gig writing a musical about a day in the life of a nonprofit, because I love writing musicals. It's just so much fun. And I think that, being a part of the nonprofit sector for, you know, a couple of decades now, I could in my head, like, already start it, and I'm sure you could too, right? Like, what would that look like? I also think it could be really fun at a nonprofit conference: instead of a keynote, let's have a mini musical about a day in the life of a nonprofit.

  

Julia Campbell  34:53   

What a fantastic idea. I love that idea. Do you follow Vu Le? Because, you know, as we're recording this right around Valentine's Day, his latest thing on Instagram is to do reels about, like, romantic, those like bodice-ripping novels, around nonprofit professionals.

  

Chari Smith  35:16   

Oh, I haven't seen that. I have to check that out. Yeah, they're hilarious. 

  

Julia Campbell  35:18   

And ASMR for nonprofit professionals. So I think Vu might be a great collaborator for you.

  

Chari Smith  35:24   

Okay, so if you're listening, let's do it. Let's create a musical. 

  

Julia Campbell  35:28   

Yes. Okay. Awesome. And also, I wanted to ask you about your upcoming event, I talked about it a little bit. But if you could just give us sort of the who, what, when, where, and how people can register? 

  

Chari Smith  35:41   

You bet. So it's Survey Design Made Simple, on Tuesday, April 18, from nine to 11 Pacific Time. I am so excited about this, because it's going to be very interactive, and we're going to dive into what to ask and how to ask it. I know we already talked today about increasing your response rate, but we're gonna get deeper into all of these different topics. So at the end of the two hours, people will have concrete ways to design their surveys with confidence, so they're truly measuring what matters. And I know you have it in the show notes; you're gonna have the link.

  

Julia Campbell  36:12   

I have it in the show notes. And yeah, just make sure to use the code, getyourdata, all one word.

  

Chari Smith  36:20   

All one word and in all lowercase. Okay.

  

Julia Campbell  36:23   

And where can people find out more about you, the book, and some of your services?

  

Chari Smith  36:30   

You bet. So my website's the best place: evaluationintoaction.com. There's a lot of information on there about my services and about different resources. There's a web page dedicated to my book, so I invite you to check that out. And I hope you enjoy your program evaluation learning journey.

  

Julia Campbell  36:48   

Yes. Well, thanks, Chari. And I will be sure to update all my listeners on the evolution of this nonprofit day-in-the-life musical.

  

Chari Smith  36:57   

Oh, my gosh, I'm telling you, if someone listening coordinates conferences, reach out; we're going to make it happen. All right.

  

Julia Campbell  37:04   

Thank you so much for being here. 

  

Chari Smith  37:06   

Well, thanks for having me, Julia. It was really wonderful.

  

Julia Campbell  37:15   

Well, hey there, I wanted to say thank you for tuning into my show and for listening all the way to the end. If you really enjoyed today's conversation, make sure to subscribe to the show in your favorite podcast app, and you'll get new episodes downloaded as soon as they come out. I would love it if you left me a rating or review, because this tells other people that my podcast is worth listening to, and then my guests and I can reach even more earbuds and create even more impact. So that's pretty much it. I'll be back soon with a brand new episode. But until then, you can find me on Instagram at Julia Campbell seven seven. Keep changing the world, you nonprofit unicorn!