Edtech Insiders
How EdTech Leaders Earn Trust Through Responsible AI and Data-Privacy Best Practices
In this special episode, we speak with Daphne Li, CEO of Common Sense Privacy, alongside leaders from Prodigy Education, AI for Equity, MagicSchool AI, and ClassDojo—recipients of the Privacy Seal. Together, we explore how the edtech sector is tackling one of its biggest challenges: earning trust through responsible AI and data privacy practices.
💡 5 Things You’ll Learn in This Episode:
- The purpose of the Privacy Seal
- How Prodigy builds trust through privacy-by-design
- How AI for Equity supports schools balancing innovation and safety
- How MagicSchool AI integrates privacy into teacher tools
- How ClassDojo makes privacy core to its values
✨ Episode Highlights:
[00:02:18] Daphne Li on why the Privacy Seal matters
🎙️ Recipients' Interviews:
[00:10:51] Ben Johnson, CTO of Prodigy Education on trust as edtech’s ultimate currency
[00:20:52] Aaron Cuny, Founder and CEO of AI for Equity on small district challenges
[00:38:48] Adeel Khan, Founder and CEO of MagicSchool AI on building trust through friction
[00:49:22] Tina Hwang, Head of Legal at ClassDojo on privacy as core values
😎 Stay updated with Edtech Insiders!
Innovation in preK to gray learning is powered by exceptional people. For over 15 years, EdTech companies of all sizes and stages have trusted HireEducation to find the talent that drives impact. When specific skills and experiences are mission-critical, HireEducation is a partner that delivers. Offering permanent, fractional, and executive recruitment, HireEducation knows the go-to-market talent you need. Learn more at HireEdu.com.
Every year, K-12 districts and higher ed institutions spend over half a trillion dollars—but most sales teams miss the signals. Starbridge tracks early signs like board minutes, budget drafts, and strategic plans, then helps you turn them into personalized outreach—fast. Win the deal before it hits the RFP stage. That’s how top edtech teams stay ahead.
As a tech-first company, Tuck Advisors has developed a suite of proprietary tools to serve its clients better. Tuck was the first firm in the world to launch a custom GPT around M&A.
If you haven’t already, try our proprietary M&A Analyzer, which assesses fit between your company and a specific buyer.
To explore this free tool and the rest of our technology, visit tuckadvisors.com.
[00:00:00] Aaron Cuny: I think the most innovative school systems are gonna find ways to support innovation that involves lots of small-scale pilots, and we see a lot of systems that are thinking about, like, how do we let a thousand flowers bloom, if you will. The challenge, though, is that you can't do expeditious data privacy evaluation of a thousand flowers.
[00:00:27] Alex Sarlin: Welcome to EdTech Insiders, the top podcast covering the education technology industry, from funding rounds to impact to AI developments across early childhood, K-12, higher ed, and work. You'll find it all here at EdTech Insiders. Remember to subscribe to the pod, check out our newsletter, and also our event calendar.
And to go deeper, check out EdTech Insiders Plus where you can get premium content access to our WhatsApp channel, early access to events and back channel insights from Alex and Ben. Hope you enjoyed today's pod.
We have an incredibly cool and very unique episode today. We are talking to Daphne Li, the CEO of Common Sense Privacy. And the reason is privacy is an incredibly important issue right now. With the rise of AI, privacy in EdTech has never been more important and it's never been more complicated. And we have a panel today.
She's gonna join me in talking to four recipients of the Common Sense Privacy Seal: Aaron Cuny from AI for Equity, Adeel Khan from MagicSchool AI, Ben Johnson, the CTO of Prodigy Education, whom we've never talked to on the show before, and Tina Hwang, the head of legal at ClassDojo. It's gonna be really, really cool.
So let's start with Daphne. Daphne, welcome to the podcast. Let me introduce you first. Daphne Li is the CEO of Common Sense Privacy, a spin out from Common Sense Media. Building on Common Sense Media's mission to create a safer digital world for kids and families. Common Sense Privacy helps identify privacy risks, implement best practices, and protect student and consumer data in an increasingly connected digital world.
Daphne, welcome to EdTech Insiders.
[00:02:13] Daphne Li: Thank you Alex. Really excited to be here. Thank you for hosting this really important conversation.
[00:02:18] Alex Sarlin: Privacy is such a big deal, and frankly, listeners of the podcast will know that I try to sort of skirt around it whenever I can because I am so excited about AI and education.
I just want it to happen. But at the same time, I know that privacy is so important, and I think you guys have really taken the lead as sort of the arbiters of what privacy looks like in the AI era. So tell us about the origin of Common Sense Privacy and of this privacy seal.
[00:02:43] Daphne Li: It all started with a really simple question, Alex, which was, who is protecting our kids' data, both inside schools and outside at the home?
Right? And there's no clear answer. There's no yardstick, no scorecard, and no way to know what's actually safe and who's actually trying to do the right thing to protect kids. We actually started, oh gosh, 10 years ago, when eight of the nation's largest school districts came forward and asked us to help them make smarter, more informed choices.
So we did, we built a rubric and a rating, and recently we introduced a seal that you can trust.
[00:03:22] Alex Sarlin: And so tell us about the seal, because the seal is really increasingly becoming sort of the certification, the stamp that an AI-based education company, whether it's an incumbent or a new company, is actually truly thinking about privacy.
[00:03:37] Daphne Li: Absolutely. So here's the big challenge for teachers, for school administrators, for parents like you and me: the only public thing that you and I can see is a company's or product's privacy policy. And researchers have studied this for years, right? If the average person read every single privacy policy of every website they visited in a typical month, they would spend over 46 hours doing that.
Wow. 46 hours on top of feeding your kids, getting your kids ready for school.
[00:04:10] Alex Sarlin: That's over a week of work. Yeah,
[00:04:12] Daphne Li: right. Just to make sure they're safe and their data's safe, and so we wanted to create the seal to actually make it easy. Easy for parents, teachers, and school administrators to make informed choices.
And by the way, we're talking a lot about parents and teachers and schools here, but companies actually have the exact same problems, right? Yes. Because they face a complete patchwork of regulations. They have technology that they're incorporating that is also innovating and changing, and there's not a lot of transparency about how they treat data sometimes. And so again, we wanted to make it easy.
We wanted to take that burden off people who are trying to make those choices and take the burden off the companies.
[00:04:59] Alex Sarlin: Yeah, I mean, you make such a good point, because in this AI era, privacy just has a totally different meaning than it did even a few years ago. AI edtech companies are inheriting a lot; they're using APIs to build on these frontier models.
The frontier models have their own privacy situations, policies and supports, but also risk factors. Then they're adding additional choices on top of that. And then, as you say, the end customer is being hit by all of these different local policies, school policies, state policies, potentially federal policies, although we haven't seen a huge amount of that yet. You mentioned the word patchwork, but I would imagine for the companies, having a Common Sense Privacy Seal sort of ends the conversation.
It's like, we have gone through the gauntlet on this. We've thought about it really carefully, and if the seal matches your policy needs, then you're done, and neither side has to get as deep into the weeds of all of the legalese and all of the contracts and all of the privacy policies in the tech.
I would imagine it's a benefit for the companies as well, because they just get to be able to say, we take privacy seriously, with one symbol, and they do. It's so interesting because privacy of consumer data, and especially of student data, has become such a big topic. I feel like 10 years ago, maybe 15 at this point, but let's say 10 years ago, people really didn't realize or think about data privacy all that much.
Schools did, but I think individuals didn't. And I think we're at a period now, because of all of the social media scandals, because of AI, because of all of these leaks and breaches and cybersecurity issues, where data privacy is on everybody's mind, and that's why they're worried about it, to your point.
But that doesn't mean people are trained on it. Just 'cause they're worried about it doesn't mean they actually understand the nuances. And it's very complicated, it really is. So I think you're doing a great service by taking the complexities here and putting them into something where companies can actually assess their own practices. And what I've heard through the grapevine is it's not an easy rubric. I mean, this is very serious, and when companies get their results, sometimes they really have to make some serious changes to their privacy policies. I'm curious about that dynamic. How do companies react if they go through the rubric and they say, oh boy, we are not quite doing this the way we should be?
[00:07:12] Daphne Li: Absolutely. So the rubric reflects not only current regulation, which we update as it evolves, but also best practices. Part of the best practices are informed by extensive research we've done with parents and other consumers, so we understand what issues they care about the most. So we've incorporated those as well. Now, as we work with a company, they often do make some significant changes to actually comply with best practices, and it's really interesting, right? We meet companies where they are. Our philosophy is everyone has the opportunity to improve. No one is perfect, in part because the laws are changing, in part because technology is always evolving, right?
Another really important feature of the seal is that we keep track. It's not just laws that are changing, it's not just technology that's changing, but the products that we use every day, right? Those products themselves, the rules that they set for themselves, are changing every day.
In fact, you know, I'll just be upfront with this: three of our seal winners updated their practices and policies after we awarded them the seal, right? Sure. And we monitor. So we are actually in touch with those companies. They have to continue, every single day, to earn that seal, and so we monitor and we make sure that that is the case.
[00:08:37] Alex Sarlin: Yeah, that's so important because I imagine the lifespan of the seal and then the speed at which things change are not always aligned.
[00:08:45] Daphne Li: I think the other really important thing is, you know, unless your lawyer, and lawyers are amazing, right, is super, super proactive in terms of business development, oftentimes you are not getting proactively contacted when there's a new requirement. And candidly, a lot of people we talk to kind of plug their ears and don't wanna hear about it, because they're worried about having to write another check, right? And for the people who do contact companies, often you get a newsletter and then you are left to fend for yourself to say, okay, what does this new law mean for me?
What do I need to do differently? What changes do I need to make? And so, you know, part of what we want to do is make privacy easy for everyone, right? Especially for those companies, those products, that are the ones actually holding our kids' data.
[00:09:42] Alex Sarlin: So we should talk to some of our amazing guests.
You're in great company in Common Sense Privacy. You have incredible companies working with Common Sense Privacy for the Common Sense Privacy seal. And our first one is going to be Prodigy Education. So for the first company that is a recipient of the Common Sense Privacy seal, we are talking to none other than Prodigy Education.
It is an absolute mainstay of the EdTech space. And we're talking to Ben Johnson, the CTO of Prodigy Education. Prodigy, as you probably know if you're listening to this podcast, is the leading math platform for teachers in the US, used by more than 800,000 educators and 20 million students a year. They deliver unrivaled free access to a game-based learning platform with standards-aligned math for first to eighth grade and English content for first to sixth grade, which adapts to individual student needs.
It's free for educators, with revenue driven by optional parent memberships and in-game purchases. It's a lot of fun. Ben, welcome to the podcast. Thanks for having me. Appreciate it. Yeah. So I gave a little bit of an overview of Prodigy, but give us a little more depth. You've been there since 2016. Tell us what's so amazing and special.
[00:10:51] Ben Johnson: I have been here for a really long time, and Prodigy has been in business for about 14 years. Yeah. It started as a tiny, terrible math game and slowly got better and better and better over time. And you said the 20 million students; beyond the 20 million students, the thing that's really rewarding for me is hearing from teachers that we've helped turn their most reluctant math kids into kids who are actually really excited and confident about math. The fact that it's a game really does give us an amazing environment in which to teach.
[00:11:25] Alex Sarlin: Yes, I did my master's thesis about gamification of education. I am a huge fan of game-based learning and gamified apps, all of it.
And Prodigy has been way out in front, doing this for a long time. Really, really amazing. I mean, the games are really fun as well as really educational, and can change mindsets on math. And so let's talk about privacy. We are here to talk about privacy, and Prodigy, with 20 million students, has a lot of responsibility to keep data private, and you're working with both schools and parents.
So how important is privacy when parents or teachers are selecting educational tools?
[00:11:59] Ben Johnson: And it's not just because I'm in charge of the privacy department, but I think it's one of the most important things. Parents and teachers are looking for high quality educational tools, but they wanna be able to trust them.
Yeah. Our users are primarily kids. It places a lot of responsibility on our shoulders. We want parents and teachers to be as comfortable as possible that we prioritize kids' security and privacy. I would say school districts especially are increasing their level of scrutiny around cybersecurity and data practices of EdTech vendors.
They're spending a lot more time training teachers on what they should look for when they're choosing tools so that they can keep kids safe. But at the end of the day, I think in EdTech, trust is the ultimate currency. We build trust with teachers through the free model and our privacy practices, and then they become our biggest advocates, and that is the engine of our growth.
It really is foundational.
[00:12:54] Alex Sarlin: Yeah. What do you feel like sets Prodigy games apart from other EdTech companies? I love that phrase, trust is the currency in EdTech. It totally is. And how do you build that currency at Prodigy?
[00:13:05] Ben Johnson: So the weird one for us, what always comes up, is that because we're free to use for educators in schools, there's often some suspicion about how we make money. Right? They think we've gotta be monetizing user data, we've gotta be monetizing student data. But I actually think that because we're funded by optional parent memberships, it's actually a key strategic advantage.
It allows us to build everything with a privacy-by-design philosophy. We don't need to be creepy, right? We don't trade any student data for value, and we don't allow any third-party advertising on our platforms. We don't have to choose between revenue and doing what's right for our users. And I think our philosophy on this is really just to be as transparent as we can.
We have two different experiences. We have the game for students and apps for teachers. We built out a specific privacy policy for each experience to try to outline exactly what's collected and exactly how it is used.
[00:14:07] Alex Sarlin: So tell us about what Common Sense Privacy has been doing to help build that currency.
[00:14:12] Ben Johnson: So the biggest thing about being a recipient of the Common Sense Privacy Seal is that it's additional third party verification or validation that we're doing the right things when it comes to online safety and security. We've been successful in earning a whole bunch of badges to validate our approach, but I think the common sense name really carries important weight among educators and parents.
And you know, I was just saying that transparency and trust are really important, but I think one of the other things I've learned over the couple of years of doing this is that teachers aren't lawyers. It can be really hard for them to look at a product at a glance and know that they can trust it, and they're also really busy. Yes, they are. They would likely prefer to spend their time actually educating kids rather than becoming experts in privacy practices, and they've been given only as much training as a district is able to give them. But the Common Sense Privacy Seal, I think, gives teachers and parents a way, at a glance, to know that it's not just that we are saying we're doing the right thing.
Right. Common Sense Privacy checked it. It gives everybody some peace of mind.
[00:15:18] Alex Sarlin: Yeah. Daphne and I were just talking about how teachers are not lawyers and the idea of having to make our way through these complex and ever-changing privacy policies, both from the companies and from the regulations themselves.
The idea of a teacher being stuck in between that and having to make sense of it is just nightmarish. You really can
[00:15:35] Ben Johnson: feel it actually, when you go to conferences and talk to teachers about this stuff. There is a thing of, like, they're scared in a way, right? They're being told that they have to make sure they keep everything copacetic, but do they know how to do that? And so I think giving them easy ways to be able to look at a tool and understand whether or not it's gonna keep their kids' data safe is so important.
[00:15:55] Alex Sarlin: It makes a lot of sense. And then, you know, I mentioned AI, the second piece of this, which I think is far more complicated, so you could rabbit hole on this for hours, but I'm curious about it.
You know, one thing that everybody is very excited about in the AI era for education is the idea of more personalized learning. And you know, we mentioned how at Prodigy everything is adapted to individual student needs, differentiated, personalized. I like the word precise, right? But at the same time, the more data you're collecting about a student, the more you start touching on privacy issues.
So I'm curious how you balance that need for differentiation and personalization with privacy. Or do you see them in conflict at all, or am I just projecting that? I think our principles
[00:16:35] Ben Johnson: are the same. I think we're really committed to like exploring that new tech in a way that is transparent and keeps student safety at the forefront.
We have actually been pretty conservative on the AI front, at least for a little bit, partially because we wanted to see how this stuff was gonna develop. And in fact, there's way better structure for how you deal with AI privacy this year than there was last year. Mm-hmm. I think last year, especially going to conferences,
People were sort of a little panicked about like, how are we gonna maintain student privacy in this world? We're starting to see better structures in place to be able to sort of do this, but I think we're just going to have to do both innovation and privacy at the same time. Yeah. In terms of like the actual technical, how are we gonna do it?
I mean, for students we collect first name, last initial, their state, so that we know what curriculum they are in, and their grade. Past that, we're not able to identify a particular kid. And so we'll just hold the same guardrails. And then the other thing we do a lot internally is things like data protection impact assessments, right?
Or data privacy impact assessments, where the idea is that if there's any new feature we're gonna build, we're gonna look at it through a data privacy lens to make sure we're only collecting the things that we need to collect. Yeah.
[00:17:47] Daphne Li: I just would add on to that. Right. That was one of the things that really impressed us with Prodigy and one reason why we are really happy to award Prodigy the Seal, right?
Prodigy is one of those companies where, when a parent signs up their child or a teacher signs up a student, the student or the child is immediately directed to a special web address where they log in. Right? And once they log in, the data that's collected is very, very, very limited. And if a kid forgets that they're supposed to go to this place to log in and wanders onto the main Prodigy website, Prodigy does an excellent job having the student immediately raise their hand and say, oh, I'm a student. Mm-hmm. And they make sure they get to the right place, right? Mm-hmm. And I love that, because there's a lot around age verification these days, and when you ask a student or a child their age, you know, sometimes they wanna be a little older, right?
And I can't think of a single student who's gonna raise their hands and say, I'm actually a teacher, or I'm a parent. Right? And so I actually think that the way you do that age verification or finding out who that person is, is very elegant.
[00:18:54] Alex Sarlin: Yeah, I appreciate that. So that's in terms of, you know, PII, right?
It's like Justin O from Missouri, and that's sort of all you're gonna know. But then there's all this academic data: what games have they played, what concepts have they struggled with, and all of that. And I imagine that becomes the real fuel for the personalization and differentiation piece.
Yeah, so I think
[00:19:15] Ben Johnson: part of it is also that we have an adaptive algorithm today. It's not crazy AI under the hood, it's a much more traditional algorithm in terms of how it operates. And basically the way it works is it just grabs the kid's previous academic experience and the curriculum.
And it maps those two things together to see what the most logical next question is for that kid. And so part of it is keeping that data specific to the kid. It's only shared with, you know, a teacher or a parent who we've vetted can be connected to that kid. In terms of the academic data we store, we only store the core data that we would actually need to determine how to send them in the right direction. And I think a lot of the time when you do these impact assessments, one of the things you find is that you don't need as much data as you think you do, right? To be able to get the outcomes that you want.
And so I think that just making sure that it's top of mind in terms of like what is the most minimal set is the way that we really look at this.
[00:20:18] Alex Sarlin: Ben, this has been so interesting. I'd love to talk more about Prodigy Education in a future episode, especially with the gamified pieces and just you've been doing this for so many years with so many students, so I wanna just thank you so much for being here with us on EdTech Insiders.
We are here with the founder and CEO of AI for Equity, Aaron Cuny. Aaron supports and convenes leaders at more than 50 innovative school networks around the country, positioning them to be smart decision makers on all things AI, in service of a more just world for all students. Aaron, welcome to the podcast.
[00:20:51] Aaron Cuny: Thanks Alex. Great to be here.
[00:20:52] Alex Sarlin: So tell us about AI for Equity, what you do, and sort of how you came across the Common Sense Privacy seal and the Common Sense Privacy work.
[00:21:01] Aaron Cuny: Yeah, sure. So over the last six to nine months, we've seen several studies come out that show an emerging equity gap in AI, and that is where we come in.
That's why we exist as a nonprofit: to make sure that AI, as the technology gets leveraged, reduces rather than exacerbates the equity gap. We see school system leaders as the key levers in that mission. So we work with innovative school systems around the country that are serving kids furthest from opportunity, and we build the capacity of the leaders in those systems to be smart decision makers on all things AI.
Everything from the implications for adult work streams, to student-facing tools, to the huge implications for curriculum and instruction. We do research on behalf of our partners, develop a bunch of implementation resources, and then build community, largely through virtual, role-based communities of practice.
[00:21:59] Alex Sarlin: So tell us about some of the perspective on the ground from people who are thinking about using AI and how they think about privacy.
[00:22:06] Aaron Cuny: Yeah, so I would say my read is that in most school systems around the country right now, especially in small to midsize systems, data privacy practices probably fall far short of what we would consider to be ideal.
I think we have a small number of busy leaders with limited technical insight as it relates to data privacy. But because we get to work with a set of the innovators, I can share some of the things that some of our most innovative systems are doing on this front right now. In those places, privacy is increasingly less of an afterthought than it might have been historically. It is no longer just a box to check. And in those places, I think we are seeing systems doing more to both build internal capacity and tap into external resources and capacity. So internally, we're seeing folks do more to develop engagement from others across the organization, cross-functional teams, in ways that weren't present before.
They're doing more to make sure that everyone across the organization knows when they need to reach out to the tech team or the procurement team. But even with that, many systems still don't have the expertise internally, and so our leaders have found third-party signals to be a real value add.
And so for us, this is where Common Sense Privacy comes in. Earlier this year, we partnered with them to get the evaluations they've done consolidated across all the tools that our leaders are using and exploring, and we've supported our leaders in putting all this information into a conditionally formatted Google Sheet, essentially trying to give them a really efficient, easy-to-track interface with the data they need to be paying attention to. And then we've also subsequently gone out to the vendors and said, hey, our leaders are looking at this data. And you know, effectively we are trying to leverage the collective purchasing power of all of our partners
to push the market towards stronger practice. Another thing worth noting here is that some of the systems that are getting this right, right now, are adopting or exploring software to support strong decision making around data privacy. Sometimes that happens on the front end, in the process of engaging vendors initially.
This might look like contract lifecycle management platforms that set school systems up to evaluate privacy clauses in a systematic way, ensure consistency of expectations across contracts, and automate approval workflows. Rocketship, you know, a big national CMO, is one example of this.
They're using Ironclad for that, and I think finding that helpful. On the other end, you have some folks that are using software to evaluate data privacy for tools that are already in place: what are the tools that are being used across our district or system right now, how often, by whom, and are the evolving privacy policies of current vendors meeting the security requirements?
Innovate, another of our partners in Southern California, I think is an example of a school system that's doing that. So I would say, as we look at what emerging best practice looks like for our partners right now, it's internal capacity building, tapping into third-party resources and evaluation information like we see from Common Sense Privacy, and then leveraging software to create some systematic efficiencies with both potential new vendors and existing vendors.
[00:26:01] Alex Sarlin: Yeah, I think it's a really well put description of how it's sort of working, that there's the need for third party signals. I'm glad to hear that the level of understanding is moving up here because the risks are high. I am curious among your consortium members, as the sophistication around AI gets deeper, do they understand more and more what the sort of stakes are for some of the privacy breaches that are possible?
[00:26:24] Aaron Cuny: Yes. Our leaders are increasingly, I think, developing appreciation of the stakes as I think we all are as a society right now. But I think a few of the things that they are attentive to right now, one, like they're cognizant of the fact that the data that is being collected is just deeper than ever before.
Tools are capturing more unstructured data: personal narratives, essays, chats, voice data. In a way that wasn't true before, the market is just hungrier for data than it's ever been. And in many ways that can be justified as vendors trying to improve their service to provide more personalization.
But I think that that creates some challenges. You also have this data getting leveraged across the AI stack. You have vendors of vendors who are getting access to data, and I think with each move of the data from a vendor to another vendor or to another part of the AI stack, there is more exposure for that data.
More places that it could be leaked or just retained in a way that it's not supposed to be. So we see those things as challenges. The other thing that I would note that's top of mind for our leaders is this idea that all this data will likely eventually be used to create profiles of students and of staff.
And I think the implications are far more consequential than has ever been the case. We're gonna have AI that's gonna be able to take disparate data points, create a profile of a student or staff member, and likely influence future opportunities for those individuals. And that's gonna happen probably in an opaque system, where folks aren't necessarily going to get transparency into what data is being used and what algorithm is driving the decision.
And so I think just given the potential consequences of that, our leaders are sensitive to the data privacy implications of the tools that they're using right now.
[00:28:30] Alex Sarlin: You are making such great points, and really looking to the future of where this is all going. I think your point about the vendors of vendors is really well taken. In AI, everything happens in a stack, and like the voice data you mentioned, EdTech vendors are collecting voice data, but to process that voice data they may need to be using third-party systems, which means there has to be privacy all the way down, because that data could be passed, in one format or another, even to other systems, and everybody means well here. One of the things that I think is so tricky about this is when I hear you talk about using disparate data points to put together a learner profile or an educator profile for various reasons. When I talk to edtech founders, they are so excited about the possibilities there.
They say, look, imagine a world in which you could say: the student is doing this in this class and that in that class, they were absent this many times this year, this is what they're doing, this is what they said to their friend in this chat at this point, and you could use all of that to improve educational outcomes. Founders are thrilled about that.
And personally, I am too, but at the same time, it's a privacy nightmare, right? The idea of pulling information from all these systems, putting it together, and having all the policy and all the really deep thinking in place to be able to do that in a way that doesn't feel like a violation, that doesn't feel like spying on students or keeping data for longer than is appropriate or legal.
It's such an interesting balance between all the possibilities that AI offers and all the risks that come with them. When you think about compliance and the complexity, obviously EdTech founders wanna personalize. They want differentiation. They wanna pull together disparate data points. At the same time, you're mentioning your consortium sees the risk of that. How do we balance this as an ecosystem? I think this is one of the most interesting aspects of EdTech right now.
[00:30:13] Aaron Cuny: I would say that our leaders are not like tech naysayers. They're bought into wanting to leverage the opportunities that AI presents to serve kids, to reduce the equity gap.
And so I think they're aware of these opportunities, and the idea of data being used to create a learner profile that helps a kid do better is something that they're excited about. I think the path forward is figuring out how we do that in a way that reliably provides the right protections and the right security around the data.
And I think, you know, that's no small task, but if we can find ways to provide that assurance and that security, it will allow the decision makers in our school systems to be even more bought into the upside of the tech.
[00:31:06] Daphne Li: So how do you make good decisions? How do you think about not just great educational outcomes?
But also, how do I protect my students? How do I protect their data? How do I protect my community's data? You guys are on the ground, so what are the limitations you're seeing, and how can you overcome some of those limitations?
[00:31:25] Aaron Cuny: Yeah, thank you for that question. There are so many challenges. One of the things that comes up for our leaders is just the variance in state laws and the fact that there is no officially agreed-upon yardstick, and we realize that that is also a huge challenge for vendors. But on the school system side, when you don't have an agreed-upon yardstick, schools end up with conflicting information. So we've had multiple scenarios this year where one data privacy evaluation has said one thing, maybe suggesting that a school system might need to pump the brakes with a vendor, but then the vendor says, look, we've got this validation over here, or we've been approved by this school district, which has a very rigorous procurement process. And the position that puts our school systems in, especially our small to mid-size systems, asking them to evaluate all this conflicting information, is just bonkers. And so I think that is a huge challenge.
Another one that is of particular interest to me, I think, sits at the heart of a real cultural challenge for school systems right now. I will say this is a generalization, and it's not true of all school systems, but in general, most school systems have not evolved to be bastions of innovation or bastions of R&D.
You know, they're fairly risk-averse and slow to adopt technological change, and there are lots of understandable reasons for that. But as a result, our school systems haven't developed great muscle for the kind of frontline innovation you might see in private industry. And I would say that the most innovative systems, including a lot of folks we partner with, are recognizing that that is gonna have to shift in the age of AI. As AI democratizes the ability to innovate, in order to stay competitive, in order to serve kids well, in order to keep top talent, school systems are gonna have to get better at fostering innovation. And part of what that will look like is shifting from these big, slow, high-stakes procurement decisions, where you're kind of going zero to 60 and adopting a tool for the entire district. I think the most innovative school systems are gonna find ways to support innovation that involve lots of small-scale pilots, and we see a lot of systems that are thinking about, like, how do we let a thousand flowers bloom, if you will. The challenge, though, is that you can't do expeditious data privacy evaluation of a thousand flowers.
And so we are seeing that in so many places: teacher experimentation is outpacing the vetting capacity of the tech team or of the procurement team. School systems are dealing with this challenge of, how do we support frontline innovation and position the organization to learn quickly and move quickly, while still having a very intentional procurement process that doesn't put student and staff data privacy at risk?
How do we foster agility without compromising safety? That is a real tension, and I think a challenge for our school systems right now. Another one that we are seeing has to do with the pace of product development. Software is evolving faster than ever. A school system could have invested all this time in doing a thorough review of version 1.0 of a product.
But by the time it is widely deployed, you have version 1.5 of the product out, and that version could be collecting data in a different way or processing that data differently. And so school systems are left with this moving target. Yep. Which requires constant monitoring and evaluation, and especially if you don't have software in place that's systematizing that, then you're in trouble.
And the last thing that we're hearing from our leaders, and maybe the biggest challenge of all when it comes to evaluating vendors for data privacy, is that so much of this comes down to the self-attestation dynamic. With so much of this, we are essentially just having a vendor make a general legal promise that they're complying with FERPA.
But school systems are rarely given the technical evidence to support this, and it's just not auditable in a reliable way. And so school systems are forced to rely on the good faith of the vendors to update the privacy policies when the product evolves. And I think for the ecosystem, the real question or challenge here is, what do we do if the verify part of trust-but-verify is not actually an option for school systems? And so, to the question, Daphne, you asked, what do we do about this, I would say I don't have answers for all of this, but the more we get investment in building capacity, and I would say I'm speaking to folks in the philanthropic community right now,
the more we get investments in building the capacity of the decision makers in our school systems, the more we get investments in, let's say, subsidizing the software that can support our school systems in executing these processes consistently, and the more we get investments in third parties like Common Sense Privacy, who can essentially provide technical insights and do evaluations at a scale that we should not be asking school systems to do, I think the better off we'll be.
[00:37:14] Daphne Li: Yeah, I think we're at the beginning of a long, ongoing journey. Right. And it's so interesting. One of the things you talked about is how quickly products change. Products change, the laws change, and what companies do with data changes a lot. And so one of the things we've introduced with the seal is continuous monitoring.
Mm-hmm. Right? And so it's really interesting. You're absolutely right, Aaron. Three of our seal winners updated their policies within a month. Right. And of course we absolutely were monitoring that and we were working with them on those changes. They have to continuously requalify for that seal every day.
[00:37:53] Alex Sarlin: So interesting.
I really appreciate your insights on this and the insights of your consortium. This is Aaron Cuny, the founder and CEO of AI for Equity, talking about all the different challenges, but opportunities as well, when it comes to privacy and AI in education. Thanks for being here with us on EdTech Insiders.
All right. We are here with Adeel Khan, the founder and CEO of MagicSchool AI, the leading generative AI platform for schools. In just a year and a half, 6 million educators, as of yesterday, have signed up, making it the fastest growing technology platform for schools ever. Amazing. Adeel Khan, welcome back to EdTech Insiders.
Thanks so much for having me, Alex. Good to be here with you and Daphne. It's great to see you again. So let's talk about MagicSchool AI and privacy. What has MagicSchool done differently from some of the other competitors when it comes to privacy? And how do you think about privacy when you think about this fast growth trajectory you're on?
[00:38:48] Adeel Khan: First of all, we are a product that is primarily made for K-12 schools. So if we're gonna be a viable company and product, we have to take privacy really, really seriously by pure concept. And we do take privacy really, really seriously, because we work with teachers, schools, and sometimes students. So first off, when you use our tools, we don't collect any of your personal information to make the service work.
We don't use things like your name, email, role, or IP address when you visit to do anything but authenticate you; we rather just use it to keep things secure and communicate with you. But we don't sell your personal information. We don't allow targeted ads. We don't let our partner AI providers train on your data, right?
So those are the kinds of things that we do from the outset to make sure that schools and districts feel comfortable. I'm here in my hometown area, the DMV, as we call it here. I grew up in Northern Virginia, and in every conversation I'm having with district leaders in the region, the first thing they ask about is privacy and safety, and what we are doing to make sure that their students' PII is protected, as well as the district's PII that they're trusted to handle.
[00:39:52] Alex Sarlin: And you know, you mentioned that it's the first thing that people are asking about, and I think that is such an interesting dynamic right now, especially in the AI space. How do you assuage some of that fear and uncertainty and help them understand the power and excitement of AI and not get caught in the what ifs?
[00:40:06] Adeel Khan: Well, fundamentally, we have open-sourced all of our privacy documentation and made it very easily accessible: both, you know, the longer policy that maybe a CIO would wanna scan and understand, and also a simpler-terms policy on our website, so people can understand exactly the way that we're keeping the information our schools entrust us with safe. So we make it really simple to understand and allow the user and the customer to make their own judgment about that data and that information. Clarity, not hiding that or making it really hard to find, I think is really important, and it builds trust. The second thing is we have flags all over the site and things that we call intentional friction. In certain tools in the platform, and when any educator logs into the platform, the first thing they see is an acknowledgement flag about how to use this appropriately and safely. I actually think that was one of the reasons that we grew so fast so early, because schools and districts were nervous about using this technology.
I think a typical consumer product that was growing virally might tell you this is like an added piece of friction that might stifle your growth, right? But before a teacher could use the platform, they had to accept these terms that said, Hey, do not share private information before you can get to our tools.
Here's all the ways that you should be thinking about using this safely. And while maybe that added some friction from a user perspective, it ingratiated us so strongly with school and district leaders that we found early in the platform's life. And you know, AI has become more accepted, especially to be used by teachers, today. But early in the product's lifecycle, when there was a lot more uncertainty and fear, that flag made us the only AI product that was even let through filters sometimes, right? In districts. So the fact is that we thought about this from the very start and continue to iterate and improve on this, and we don't hide when we make a mistake either.
We listen, we acknowledge, we are humble about our approach to privacy, and we're looking to learn from partners like Common Sense, our school district partners, and users who tell us when we need to improve. And that posture has been an asset to us since the very start.
[00:42:19] Daphne Li: It's so interesting when we were evaluating MagicSchool, that was one of the things we really loved, and I love that you actually think about it as we're intentionally adding friction because it's the right thing to do, right?
In so many places we see companies give school districts choices, right? And sometimes those choices can be really, really overwhelming. How you actually implement and set up a product in your district is incredibly important, because sometimes there's a more privacy-preserving way to do it and a less privacy-preserving way to do it.
[00:42:51] Alex Sarlin: I mean, we talk in the edtech space about how different AI tools have different end users, and MagicSchool AI is designed primarily for educators. You mentioned you have some student users and some student features, but you have over 80, I don't even know how many, tools for educators. So the idea of in-platform privacy education, that's really what is happening there, right? Saying, hey, you should not be putting names of students in here, you should not be putting anything that could be X, Y, Z, that has any privacy risk. It really serves as almost embedded professional development.
I'm curious how you would describe that.
[00:43:25] Adeel Khan: That's actually a really great way to describe it. When you choose MagicSchool, you're choosing a platform that wants you to use this technology with the responsibility that schools and districts require and deserve. So we have aligned intentions, aligned responsibility positions. That makes a big difference for the district, to know that, hey, when a teacher is using it for this novel use case, they're gonna be reminded of how to use it appropriately at every single step of usage. And we do have a very substantial amount of student usage as well.
And in that, we are even more thoughtful about all the ways the tool interacts with students, including another responsibility flag. If you're a student and you join the platform, first, your teacher has to determine whether or not they want you to use it. You can't just log in by yourself without the teacher's permission.
And then two, whereas teachers get that flag once and can acknowledge it so it's not required to be there anymore, students always get that flag when they log into the platform, every single time, to remind them of it. And in addition to all of the in-platform, what you could call professional development or exemplary guidance, we provide tons of free resources to teachers, from asynchronous certification courses that teach you how to use the technology responsibly, to professional development that we provide to partners as a company, as well as decks that you can find and customize for yourself, where we reiterate a lot of these ways to use this technology safely.
[00:44:54] Alex Sarlin: I think you're the fastest growing AI tool that I know, and I think you're the fastest growing AI-native tool out there in education. So you're quickly becoming this huge institution in the space, with 6 million educators. You are in every state. You're in so many different areas. Daphne and I were talking earlier about how state laws and policies also continue to evolve, so you can't just sort of say we're on top of privacy and be done with it. Everything keeps changing. The whole landscape keeps changing. I'm curious how that affects you, especially because MagicSchool is so widely distributed.
[00:45:26] Adeel Khan: There are a couple things that we've done from the very start that have allowed us to be nimble around different policy changes. Some of them are ethical decisions we made pretty early in the platform that we think are gonna be really aligned with any regulation that comes out. One of which is, for example, we don't grade for teachers, and we don't tell teachers this is the grade of the student. We think of that as a high-stakes decision. And while, you know, you can use the platform to suggest a grade for the student, that's the teacher's prerogative.
And the teacher's the one who signs their name to the grade book and says, this is the student's grade. So while there are platforms out there that are a little bit more forward, like, we're gonna do this for you, we remind you that you are the teacher of record in the classroom. When I was a teacher, I remember my grade book.
That was my grade book, right? So high-stakes decision making purely by machines, without human verification, is something that we've thought about from the very start. We think teachers, and the adults in the building, are still responsible for that. You know, the whole platform's built around the idea that it is an augmentation for the teacher. It's to amplify their impact. But the teacher is still in control. They're the leader of their classroom and we're just here to help.
[00:46:32] Alex Sarlin: It's an important approach, and I think, obviously, you can bake some things in, and then you also have to be agile and reactive to other things as they change over time.
What has the Common Sense Privacy seal sort of meant to MagicSchool, and how has it changed the conversation when you go talk to educators or prospective customers?
[00:46:49] Adeel Khan: You know, I'll talk about it from two perspectives. One is that it is great to have an independent third party evaluate your tool and give you feedback, because, you know, we have incentives as a platform to get people to use the product and love it, right? And our product managers working on a specific feature in the product might not be as expert as somebody who's a third party, purely looking at it from a risk-adjusted lens and thinking about how teachers may or may not use this. So we think of them as an external party that can hold us accountable, and we welcome that accountability because we are here for the same purpose.
We think that we're a very values-aligned organization with Common Sense. So we think of them as an asset in the development process. Our development process is ongoing; like I mentioned, we're not perfect today. We still need to improve in certain places and think about even adding certain friction in different places to make the platform even stronger.
And we're proud of where it is today, but we think of this as an ongoing process. Yeah. It's not like a checkbox, done. And then two, you know, the Common Sense brand has been something that has been trusted for decades. Right. It's been around, and people know that there's a high standard that Common Sense holds the technology platforms it endorses, or says have a privacy seal, to. So that is credibility walking in the door: hey, an independent third party has looked at this. And while, you know, as a district we have our own internal body that's gonna review this, we also know that an external body that might have even more resources than we have has also looked at it.
That is definitely a comforting thing in a world of brand new technology. When anything's new, it does welcome or invite an additional level of scrutiny, for the right reasons. We wanna make sure that it's implemented appropriately. So anything that we can do for that assurance is really valuable to us.
[00:48:43] Alex Sarlin: Yeah, and Adeel Khan is the founder and CEO of MagicSchool AI, the leading generative AI platform, with over 6 million educators signed up. It's like we can't keep the bio up to speed in time. We really appreciate you being here with us to talk about the Common Sense Privacy Seal and your privacy philosophy.
For our next Common Sense Privacy Seal recipient, we're talking to ClassDojo. Tina Hwang is the head of legal at ClassDojo. Of course we know ClassDojo, but as a reminder, its mission is to give every child an education they love, and it reaches more than 45 million students and parents around the world.
Tina, welcome to EdTech Insiders.
[00:49:20] Tina Hwang: Thank you. Thank you, Alex. Great to be here. Yeah.
[00:49:22] Alex Sarlin: So let's talk privacy. I have heard from a few sources that ClassDojo has one of the most comprehensive and thoughtful privacy policies in all of EdTech. Tell us more about how you think about privacy and why you've decided to focus on privacy as such a core aspect of your business.
[00:49:39] Tina Hwang: Yeah, well, privacy for us is not just a feature or a compliance function, it's really a reflection of our values. We know that families, teachers, and school leaders rely on us not just to share information, but to build meaningful relationships and a stronger sense of community. That's a lot of trust placed in us in day-to-day interactions, and we are very, very committed, and we take that responsibility very, very seriously.
And that means that we focus on privacy and we commit to privacy day in and day out, because we know that trust can't exist without a deep commitment to privacy and security.
[00:50:22] Alex Sarlin: Yeah. One of our other guests said, trust is the currency of EdTech. And that stuck with me. And I think you're saying something similar there.
If you lose trust, you're in real trouble in education. People, parents, teachers, it's all about trust.
[00:50:34] Tina Hwang: It's totally true.
[00:50:35] Alex Sarlin: I'm curious how your privacy philosophy and policies have evolved over time. We obviously are in this era where privacy is getting more and more visible. It's feeling like a bigger and bigger part of the story.
And of course, where AI is now part of the picture, and where laws and policies keep changing too. Of course. How have you thought about evolving your privacy philosophy over time? ClassDojo has been around for quite a while.
[00:51:00] Tina Hwang: It's a great question. When I first started out in privacy and in law in general, there was no GDPR.
There was no. State privacy laws. It was very, very rudimentary at best in terms of what the body of law was. And now it's not just the laws in terms of international laws like GDPR, it's also about the state laws. Right? It's also about the sector specific laws, so you're absolutely right. Compliance has definitely gotten.
harder and more complicated, which again, organizations like Common Sense are beautiful and wonderful to help us navigate. But I would also say that privacy has become more than just compliance. It is actually very much intertwined with what users expect from a company. And I think that your comment about currency is just so spot on.
Users are now very, very savvy about the fact that their data can be used in nefarious ways, and so they have higher expectations for how a company communicates with them, notifies them about their choices, and helps them understand what's happening with their data in a simple way. I would say that it's evolved, but I think it's evolved in such a wonderful way: privacy is no longer a backend, check-the-box compliance issue; it's right at the forefront of how you communicate with your users and how you keep their trust.
[00:52:45] Daphne Li: It does seem, Tina, that the very, very best companies when it comes to privacy, and ClassDojo is certainly a leader among them, right, have privacy fully integrated into every aspect of their organization, right? From product development to infrastructure to, as you said, communications, marketing, et cetera. And I think ClassDojo does that really, really well.
Tina Hwang: Wow. Well, thank you, Daphne. We definitely try.
Daphne Li: One of the things we noticed, and I think Alex mentioned this at the beginning, is that ClassDojo has one of the most comprehensive privacy policies we've ever seen. And it's interesting, right, because that is perfect for the CIO of a school district who's going to really drill down and wanna understand all of the details. So how does ClassDojo make it easier for its users to understand how their data's treated?
[00:53:37] Tina Hwang: It's a great question, and it's one of those things where we actually relied on Common Sense a lot in terms of how do we communicate these very complicated privacy topics in a simple and digestible way. When you're in it day in and day out, like me and my team and others across the company, sometimes you're in your own head, and sometimes you don't know what you don't know in terms of how it sounds. And again, Daphne, your team, I think, was instrumental in helping us this time around.
I think you have to commit to transparency in this space, not just because of compliance, and sometimes even beyond compliance, because trust demands it. And you nailed it on the head that we're talking to different audiences here. So I think a key to transparency is really understanding which audience you are talking to and tailoring that communication to that audience.
So we have a set of documents on our site that is very, very transparent because, as you mentioned, districts and CIOs are very technical and they wanna understand the nitty-gritty of what we are doing. We have another set of documents that are more geared towards the common user. We have FAQs. We have highlights of our privacy policy and digestible snippets so that users can read just what they need in order to understand what's happening without getting into the nitty-gritty, but with the ability to get into the nitty-gritty if they have to. And so we have what we call highlights, or TL;DRs if you want, colloquially, of what our privacy policy says.
And then, because our space is the children's space, we also make a really concerted effort to make sure that our topics are digestible even by children. So we have a children's version of our privacy policy. We have videos to communicate in an accessible way to children what's happening when they use our products and services. And again, not that we think any of this is necessarily mandated by compliance, but because we think this is the right thing to do to keep that trust. And so, Daphne, to your question, that's a long-winded way of saying I think you have to really tailor your communication to your audience and really understand what's most important for that particular audience. So getting that outside feedback, whether through Common Sense, through user studies, or however a company does it, is so critical to this process as well.
[00:56:26] Alex Sarlin: I love the point you're making about compliance. Daphne, you've said compliance is table stakes, right? It's not about checking the box. It's truly about building trust beyond compliance, beyond COPPA and FERPA, and going to: no, we are really going out of our way to explain to you, even to the student users, how we handle privacy in really every different aspect of it.
That's a way to build a really deep, trusting relationship with your end users, and with procurement and CIOs, but, you know, everybody needs different levels of detail. It strikes me, as you talk about the ability to adapt that language to students and to end users, including educators, that we're in this AI era where educators and students have so much at stake. These conversations are really opening my mind about privacy, turning it from a checkbox, from a sort of, if you're gonna ask about privacy, we've got an answer in our back pocket in the FAQs, to privacy almost being a lead. You're leading with privacy. It's part of your strategy, it's part of the communications.
We just talked to Adeel Khan, and he said that one of the things he attributes to the growth of MagicSchool AI was that they put all these privacy warnings into the product. And he's like, yeah, some people would consider that friction.
But the teachers found it incredibly comforting that we were saying, don't do this, don't do that, be careful here, be careful there, because it showed that they were thinking about privacy first, just as ClassDojo has, you know, for years. It's really interesting to turn that around and see privacy as a sort of differentiator.
You've mentioned user expectations many times. As user expectations rise and rise, that we need these edtech tools to be thinking about this for us, to be ahead of it, and to be watching these laws and policies, you know, it's a really interesting moment. AI is sort of forcing this moment, I think, where everybody has to get their privacy and data story straight, not just from a compliance perspective, but to actually really think about it and lead with it and use it to open doors. That's what I'm hearing you say.
[00:58:26] Tina Hwang: It's really true. I do think you've hit the nail on the head. The companies that get it really do understand that this element of their program, building privacy and trust into their DNA as well as into their product development and growth, could be a big differentiator between them and the next company. And investing in that upfront could be the reason why your users stick around longer than another company's users. It could be the reason why they choose your product over the other, especially in this day and age.
[00:59:06] Alex Sarlin: Tina Hwang is the Head of Legal at ClassDojo, which is reaching more than 45 million students and parents to give every child the education they love. Thanks so much for being here with us on EdTech Insiders. Thanks for listening to this episode of EdTech Insiders. If you like the podcast, remember to rate it and share it with others in the EdTech community.
For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders newsletter on Substack.