Community IT Innovators Nonprofit Technology Topics
Community IT offers free webinars monthly to promote learning within our nonprofit technology community. Our podcast is appropriate for varied levels of technology expertise. Community IT is vendor-agnostic, and our webinars cover a range of topics and discussions. Something on your mind you don’t see covered here? Contact us to suggest a topic! http://www.communityit.com
Nonprofit AI: Human Rights and Community Rights Resources
Nonprofit AI Check-in: Reclaiming Human Rights and Protecting Local Communities
In this midweek check-in, Carolyn Woodard steps back from the technical "how-to" of AI to look at the "why" behind the ethical concerns many nonprofit leaders are feeling. This episode focuses on two significant areas where AI intersects with our core mission of serving others: the protection of basic human rights and the physical impact of the infrastructure that makes AI possible.
First, we explore insights from human rights lawyer Malika Saada Saar on how we can reclaim AI to serve dignity and democracy rather than ceding it to companies that make a profit by exploiting sensitive and marginalized groups.
Second, we look at the reality of the data centers powering these tools—specifically the environmental and economic pressures they place on local residents. Carolyn shares resources on creating Community Benefit Agreements (CBAs) with data centers as a path forward for local communities.
Whether it's protecting a survivor’s right to privacy or ensuring a local community has a say in how a data center is built, these are the human issues that will define how we use technology in the years to come.
Featured Resources
- Webinar: Slow Violence, Fast Tech | Watch on YouTube. In this 30-minute session from All Tech Is Human, human rights lawyer Malika Saada Saar discusses the "slow violence" of AI and how nonprofits can advocate for technology designed with consent and safety in mind, particularly for women, children, and marginalized communities.
- Report: Why Community Benefit Agreements are Necessary for Data Centers | Read at Brookings. As data centers expand rapidly across the U.S., this Brookings Institution paper explains how Community Benefit Agreements (CBAs) can help local leaders and residents negotiate for transparency, environmental protections, and shared economic benefits.
- Community IT Resource: AI Ethics and Policy Webinar | View the Framework. If you are ready to start moving from learning to doing, this webinar provides a practical framework for nonprofits to begin drafting their own AI use policies and start conversations around ethics.
Next Step for Your Organization
Does your nonprofit have an AI ethics policy yet? If not, now is the perfect time to start the conversation with your leadership and board. You don't need to be a technical expert to advocate for your organization's values. We encourage you to use these resources to continue your education and ensure that your use of AI remains mission-aligned.
_______________________________
Start a conversation :)
- Register to attend a webinar in real time, and find all past transcripts at https://communityit.com/webinars/
- email Carolyn at cwoodard@communityit.com
- on LinkedIn
Thanks for listening.
Hello and welcome to the Community IT Innovators Technology Topics podcast, midweek nonprofit AI check-in. I'm your host, Carolyn Woodard. I want to start with my usual disclaimer: I am not an AI expert. I am going on this journey with all of you and sharing what I discover, and hopefully we will all get smarter together about this new technology that is transforming everything.

I wanted to share a couple of resources with you this week. A couple of weeks ago, I attended a fascinating webinar held by All Tech Is Human with the human rights lawyer Malika Saada Saar, called Slow Violence, Fast Tech: Reclaiming AI for Civil and Human Rights. I will share the link with you. It's on YouTube and only half an hour long, so it's worth checking out.

I want to share a couple of quotes from it, and some of the things that stood out to me, particularly the way she grounded the discussion of AI tools and the concentration of power in the human rights framework that emerged in reaction to World War II. The transformative idea at the time was that there are what in the United States we would call inalienable rights that humans have, whether or not their governments grant, allow, or recognize those rights. What happened to civilians in World War II, and to Jews in the areas Germany controlled, was a violation not of legal rights they held in those countries, but of basic human rights. Out of that grew the UN's Universal Declaration of Human Rights, which we have been drawing on in the years since. It resonates; it seems like a natural idea now. But in this webinar, Saada Saar walked through how it evolved and where it came from, and as a human rights lawyer she gave us a quick history that I had not thought about in that way before.
She then tied that discussion of basic human rights to the many, many ways that technology does not respect those rights, particularly for women and girls, marginalized people, and children. From the start of these technologies, the safety, security, and privacy of those groups has not only not been built in; the tools have been built on an economic model that finds profit in exploiting those groups, through violence, through sexualized violence, through anything that goes against the consent of the person whose image is used or who is doxxed or tracked down through these tools. People no longer have the right to privacy, or at least the privacy through obscurity, that they used to have. Now that these tools connect us all, there are very few guidelines and guardrails protecting marginalized, vulnerable, and sensitive populations. If your organization works with those kinds of communities, I urge you to watch this webinar. As I said, it's pretty short, just 30 minutes.

Two of the quotations they pull out: "If we don't claim this technology as ours as well, then we only cede it to the abusers. We allow it to be weaponized against us." And: "The question now is how we shape it, design it, and deploy it in a way that serves our dignity, equality, and democracy." A very thought-provoking webinar, and I hope you'll have a look at it.

This relates back to our conversation a couple of weeks ago about the several buckets of concerns that nonprofits have with AI, not "is this tool working for me or is there a glitch," but conceptual concerns with how AI works.
One of those buckets was the imbalance of power between the small group of, for the most part, men who control these tools and the rest of us: how invasive the tools are in our lives, and how much of our own privacy, safety, and security we give up in order to use them. Just a fascinating discussion.

Related to that, another bucket I talked about a couple of weeks ago was the effect of data centers on their communities. I've been seeing a lot of stories on both sides of the environmental impacts. Some statistics came out early on about water use and electricity use, and now there are counter-statistics that may be part of competing marketing campaigns on either side, designed to make us excited or upset about the amount of water one query takes. I can tell you that living in Northern Virginia, which has the highest concentration of data centers on the planet right now, our electricity costs are going up for all of us who simply live in a house and pay an electricity bill. That is being driven by this frenzy to build data centers.

So I wanted to share another resource with you, from the Brookings Institution, about community benefit agreements. It's a long research paper on the relationship of communities to data centers, to AI tools, and to the profits of the companies building those data centers. It lays out the groundwork: data centers are crucial for AI to work, and protests have arisen throughout the US over financial, energy, and environmental concerns.
I would add the community concerns. The places where these data centers are built are often on "marginalized" land, which is often a way of saying that the people who have had the least for centuries are the ones who own that land, or who live right next to it, and they are being excluded from the discussion about where data centers are sited and how they affect the environment. AI companies could be working closely with local leaders to establish community benefit agreements, which help address public concerns and provide greater benefits to communities impacted by data center construction. That impact can be environmental or cultural, or simply that your house is suddenly next to a huge, loud, noisy block of a building. Transparency and cooperation between firms, local institutions, and residents, which are not occurring at the moment, or are occurring only infrequently, are essential to facilitate community input into where data centers are sited and to support residents' digital access, well-being, employment, and other concerns.

I will share the link to this report; it is fascinating. What I liked about this resource is that it does a great job of laying out the issue, the problem, and the challenges, but it doesn't just stop there. It proposes the community benefit agreement model as one way forward that can help communities, counties, cities, and residents interact with the companies building data centers in more mutually beneficial ways. So I urge you to have a look at that.

Those are my two resources to share with you this week, harkening back to our conversation about the more general issues and ethics of using AI tools. I would love to answer questions, so please get your questions in. We're on Reddit, r slash nonprofit IT management.
You can also contact me through our website, communityit.com, and I really look forward to hearing from you and keeping this conversation going. You will hear me again on Friday with our regularly scheduled podcast on many different topics, and back here again on Tuesdays for nonprofit AI-specific check-ins. Take care.