SEND Parenting Podcast
Welcome to the Send Parenting Podcast. I'm your neurodiverse host, Dr Olivia Kessel, and, more importantly, I am a mother to my wonderfully neurodivergent daughter, Alexandra, who really inspired this podcast.
As a veteran in navigating the world of neurodiversity, I have uncovered a wealth of misinformation, alongside many answers and solutions that were never taught to me in medical school or in any of the parenting handbooks.
Each week on this podcast, I will be bringing the experts to your ears to empower you on your parenting crusade.
EP 129: Protecting your child online with Emily Keaney of the ICO (Information Commissioner's Office)
The digital world offers a unique sanctuary for neurodiverse children—a place where they can connect, explore, and engage in ways that might feel challenging in person. But this digital refuge comes with hidden costs as platforms silently harvest our children's personal data, using it in ways most parents don't fully understand.
Deputy Commissioner Emily Keaney from the Information Commissioner's Office pulls back the curtain on what's really happening when your child scrolls through social media or plays online games. From recommender systems that create detailed profiles of your child's interests to geolocation settings that can expose their physical whereabouts, the conversation reveals the mechanics of data collection that operate just beneath the surface of your child's favourite apps.
What makes this discussion particularly valuable is its practical approach. Rather than inducing panic, Keaney shares concrete successes where regulatory pressure has forced major platforms like TikTok, Instagram, and X (formerly Twitter) to implement stronger privacy protections specifically for young users. She explains how sophisticated age verification technologies are becoming increasingly common, ensuring children receive appropriate safeguards even when they've managed to circumvent parental controls.
For parents of neurodiverse children who may be especially vulnerable online, the episode offers a perfect balance of technical insight and practical guidance. Learn why simply checking privacy settings isn't enough and how ongoing conversations about digital safety can be naturally integrated into family life. Discover the power of showing genuine curiosity about your child's online experiences rather than approaching them with suspicion.
What emerges is a heartening message: parents aren't alone in this fight. Through significant fines and regulatory action, the ICO is steadily shifting the burden of protection away from overwhelmed parents and onto the powerful platforms profiting from our children's attention. Join us to understand how you can better protect your child while still allowing them to benefit from the connection and community the digital world provides.
Got questions about protecting your child online? Join our private WhatsApp community where parents share experiences and support each other through the challenges of raising neurodiverse children in a digital age.
Emily Keaney’s Blog – Protecting Children’s Data
Press Release on TikTok and Investigations
www.sendparenting.com
Welcome to the Send Parenting Podcast. I'm your neurodiverse host, Dr Olivia Kessel, and, more importantly, I'm a mother to my wonderfully neurodivergent daughter, Alexandra, who really inspired this podcast. As a veteran in navigating the world of neurodiversity in the UK education system, I've uncovered a wealth of misinformation, alongside many answers and solutions that were never taught to me in medical school or in any of the parenting handbooks. Each week on this podcast, I will be bringing the experts to your ears to empower you on your parenting crusade. If you're looking for a safe space to connect with other parents navigating their neurodiverse journey, our private WhatsApp community offers support, insights and real conversations with like-minded parents who truly understand. Join the conversation today. You can find the link in the show notes.
Speaker 1:As parents of neurodiverse children, many of us are acutely aware of how much time our kids spend online, sometimes because it's their safe space, their way to connect with the world. But while the digital world offers opportunities, it also raises real concerns about how our children's personal information is being collected, shared and used. In this episode, I am joined by Emily Keaney, Deputy Commissioner at the Information Commissioner's Office, known as the ICO, the UK's data protection regulator. The ICO has been leading investigations into major platforms like TikTok and Reddit and holding them accountable for how they handle children's data. We are going to unpack what the ICO's work means for you as a parent, how the Children's Code is reshaping online safety and, most importantly, what steps you can take to protect your child's privacy in a world that is always connected.
Speaker 1:This is an episode that every parent should listen to. So welcome, Emily. It is such a pleasure to have you today on the Send Parenting Podcast. You know, online safety for our children is so important and so difficult for parents to manage, especially if you're a parent of a neurodiverse child, because oftentimes the online world is where neurodiverse children feel safest. It's where they actually can socialise and interact; it's almost an easier place than the real world, which actually makes it a really vulnerable place for them. And as parents, you try to do everything you can to protect them, but it isn't always easy, which is why I am so excited to have you on the podcast today to really educate us, because you don't know what you don't know. So I'd like to start, Emily, with you taking us through what the ICO does and why this matters. We're going to use the acronym ICO for the podcast, but it stands for the Information Commissioner's Office. So I will hand it over to you, Emily.
Speaker 2:Great, thank you very much, and a real pleasure to be here. Thank you very much for having us. So the ICO is the independent regulator. We're here to look after people's data, the personal information, all of that information about who we are, what we do, what we like, what we think, our kind of biometric information as well, so things like what our faces look like, our fingerprints and all the information that companies might collect about our behaviour, our health, etc. We make sure that organisations are using that information responsibly and we make sure that people know about and can use their information rights. So you have certain rights about how your data is being used, about being able to see what's held, about you being able to access that. So we make sure people know about and are able to exercise those.
Speaker 2:One of our biggest priorities is children's privacy. We obviously cover personal data, the information about everybody of all ages, but we are particularly focused on making sure children's personal data is protected, both because children are less able to understand, make judgments about and take action to protect their data, and because the way it's used can create harms that are particularly acute for children. So that's a particular focus for us. We've got our Children's Code, which sets out our expectations of how information society services, so online organisations, should use children's data, and that is all focused on ensuring that the best interests of the child are always at the forefront.
Speaker 1:Yeah. As an adult, you think very much about your own data. You're taught about it at work, you know, GDPR, and it's very much at the forefront of your mind. But there's a whole other layer to it when you get to children, and maybe one that I wasn't cognisant of, or my listeners, in terms of how they are actually being tracked on the things that they're on all the time. I would imagine that this is quite a priority because of the accessibility to that kind of data, with kids who are often left on their devices by themselves, you know.
Speaker 2:Yeah, absolutely, and I mean, I'm a parent myself, so I absolutely understand these challenges. These devices can be fabulous. They can give you very much-needed peace and quiet to get things done, and they also provide some really fantastic learning material, the ability to connect with other children, the ability to explore an online world. And actually we've seen in some of the research that we've done that this can be particularly beneficial for certain children who might learn in different ways or who might find online communities that they connect with more. So there are great opportunities. But all of that use and all of those opportunities come with a huge amount of data that's collected about our children, and they come with potential risks, potential harms.
Speaker 2:So, first of all, we're making sure that it's not all the responsibility of the individual parent. Parenting is a tough enough job already, and parenting SEND children can be particularly challenging. We know from our research that about half of British parents feel that they've got little or no control over the information that social media and video-sharing platforms are collecting about their children, and actually people don't necessarily feel that they've got the knowledge or the skills to help their children in this area. So it's also really important that we focus on ensuring the platforms, the organisations using that data, are doing it responsibly and are taking the measures that they need, to try and take some of that burden off parents' shoulders.
Speaker 1:Because it is almost like a black box. I've erred on the side of caution. I've got probably a couple more years that I can do this, where I just don't allow social media, because I don't know how to manage it properly. So I will let her watch TikTok videos on YouTube, because she can't interact with it there; she can watch, but that's it. But there's only so long that you can do that, and you don't know what you don't know with these platforms. I hear on programmes that you see on TV about how a platform gets your child's information and then feeds them information, and that can sometimes be negative or harmful information. So it's quite scary for me as a parent, actually.
Speaker 2:Yeah, I mean, that's absolutely right. One of the things that we've been looking at as part of our work is the use of children's personal information in recommender systems, and that's exactly as you've described. So when anybody, a child or an adult, uses a social media site, that site collects lots and lots of information about the user. It collects information about what we watch, what we like, how long we watch it for, you know, what's of interest to us. And it uses all of that information about us, potentially also what other sites we've been visiting and what else we've been doing, to create a profile, and that profile is then used to decide what content is fed to us, based on what it thinks we will like.
Speaker 2:But I think the concern is that, if that goes wrong, children are not just sort of occasionally accidentally stumbling on negative content; they may actually be being fed that content, being recommended that content, and that content may be harmful for them. So we have been doing lots of work in this area. We've launched an investigation into TikTok and how it is using information in this context. It's important to say that that investigation is still ongoing; we haven't come to any conclusions yet. But it's exactly as you say: the ability to understand how that's working and ensure that it's not causing harm is really at the top of our agenda.
Speaker 1:Well, it fascinates me that in China, children are not allowed to access TikTok. Yes, that says it all to me. But it's wonderful that you are looking into this on behalf of all the parents, because it almost seems like you have no power or no control. So you're looking into TikTok; what other areas are you looking into in terms of this online world?
Speaker 2:Yeah, so we have also been doing a lot of work on what's called age assurance, which is how platforms understand the ages of their users. And that's obviously really important, because if a platform doesn't know how old somebody is, then how can they put the right protections in place?
Speaker 2:We are taking action against two platforms because we're concerned that they don't have any age assurance in place: that's Reddit and Imgur. We've also been talking to lots of other platforms about ensuring that they have good processes in place, and we'll be continuing to work closely on that with Ofcom, who are the online harms regulator, so we're taking a coordinated approach. We've also been doing lots of work on ensuring that children's profiles are private by default. What that means is that either, when you create an account, all of the settings are set to private so that strangers can't find children's profiles and get in touch with them, or there are other measures in place around things like search to make sure that people can't find those children's profiles, and we've seen lots of platforms make changes in that area to properly protect children's privacy. We've also seen platforms, as a result of our work, improve the default geolocation settings on their service. That's slightly technical language, but what that means is...

Speaker 1:So that's where they are, so they can locate where kids are.
Speaker 2:Exactly and make sure that that's not showing to everybody. So you might want it to show if you're a parent and you've got a geolocation setting to help you keep track of your child, but you certainly wouldn't want strangers to be able to see your child's physical location. And we're also doing lots of work with organisations to make sure that they are not using children's data to give them personalised advertising. So that's important, because children often find it harder to understand the difference between content versus advertising, and when that advertising is personalised based on children's profiles and their data, that's even more difficult.
Speaker 1:Wow, I mean, you would think that this would have been how they would set it up in the beginning, do you know what I mean? But they're having to go back and do this now. Because it's interesting to me: on a lot of these games, like even Minecraft or Roblox, strangers can come in and play with the child unless you, as a parent, have set up that kind of security setting, and I didn't even know that. When my daughter started playing, some people came into her Minecraft world that she thought were her friends, and then they destroyed everything and she was shattered. That's when I, as a parent, was like, oh, there are settings that you can put on there. But kids also figure out how to get around those settings; that's the problem.
Speaker 2:Yeah, absolutely. I mean, there definitely are settings and it's really worth having a look at the settings on the services that your children use.
Speaker 1:And you can Google it to try and figure out how to do it, which is what I had to do.

Speaker 2:Yeah, and there's lots of great advice out there from reputable sources about specific settings on specific platforms. But I would also say, don't panic if you haven't done that and your child does encounter some kind of problem like you've just described. Actually, the most important thing is being able to have those conversations with your child, being able to talk to them about what they are doing online. It can often be as simple as showing curiosity about their online world. And one of the things that the research we've done with children and with parents really brings home is that we parents probably still see there being an online world and an offline, real world. Kids don't see that at all; it's just their world. So showing curiosity about what they're doing, what they're up to, having that conversation and opening up that dialogue, is probably the most critical thing that a parent can do.
Speaker 1:And starting to understand their world, you know, because you do kind of feel like a dinosaur.
Speaker 2:My kids are at the age where they're very keen to point out what dinosaur I am on a regular basis.
Speaker 1:I also try with my daughter. Like, if I have someone on my social media trying to become a friend with me who looks suspicious, I've taken my phone and said, look at this, Alexandra. Look, this person wants to be my friend, but if we click on his profile, there's nothing in this profile here. And I just start pointing out, so this is not a real person. There's also that line for them, when it's not just online world and real world; online is very, very real to them. You know what I mean? They don't see that, and they think people are going to be friendly or nice, and that's not the case.

Speaker 2:No, absolutely not. And I think having those conversations, showing curiosity about their world, being there if they do have a negative experience so that they can talk to you about it, they're all really important. The technical side of it, putting the settings on, is great, but the conversational, human side of it is, I think, equally if not more important.
Speaker 1:And I think also there's a false sense of security when you've put those settings on, that they're not going to get access to it, which, again, is a path I have gone down myself. My daughter took my phone, went into my TikTok and started posting on my TikTok, and I don't even use TikTok except for my business; I didn't actually even know I had TikTok on the phone. Okay, that's how bad I am. I didn't find out until other mothers started contacting me that she was posting online, you know. So you really have to be on your toes, and even with putting those barriers up, you can't necessarily protect them from everything. So I really like what you're saying about that open communication and understanding, because we kind of do hand our devices over, and then it's our time to get stuff done, and we don't necessarily go into their world and see what they're doing and what they're listening to online. Are there any other tips that you would give to parents, in terms of things they can do to protect their children and also to educate their children?
Speaker 2:I think the kinds of things that you've been describing yourself are also really helpful. So, looking at your own use of social media, looking at your own experiences, and using that to help your children understand what the risks might be. Having conversations with your children in an age-appropriate way about those risks, and also helping them to think regularly about who can see their information and whether they're comfortable with that. We find, when we talk to children about this, that they find it very hard to understand what might happen to their data beyond that immediate transaction, and this is understandable, because actually this is the case for adults as well. So, you know, you're using it for this particular service, but have you thought about the fact that that means maybe all of these other people can see your location, and maybe they'll then tell others about it? So it's about helping them think further down the line about the chain of what happens with their data. Having conversations as well about what measures they can take themselves: making sure they know how to report things if they're using social media, making sure that they know about their settings.

Speaker 2:So, as children get older, it's not just your responsibility as the parent; you're having that conversation with them as well. Obviously, some of these conversations will be harder or different or more unique when you're talking about SEND children, but I think generally parents are good at tailoring those conversations in ways that their own children will understand, because that's what you've got to do all the time if you've got a SEND child. For us at the ICO, we will really continue to focus on making sure that the organisations are making changes, because obviously there's lots that parents can do, but it can't just be the parent's responsibility. It's not appropriate that all of that burden sits on parents' shoulders, because the systems are complicated and they're designed in ways that are difficult to navigate. So we focus on making sure protections are in place and the defaults are privacy-preserving and are done in ways that have the best interests of children at their heart.
Speaker 1:Yeah, and I think that's music to parents' ears, because it is so overwhelming, and you hear these horror stories of children taking their lives, horrible things that happen because these platforms have been unregulated and have been allowed to treat children like adults when they shouldn't be treated like adults. And we should protect our children. You guys have had some big successes. I know you touched on it, but you hold TikTok and X to account, I believe. Is that correct?

Speaker 2:Absolutely. So we've seen lots of changes, actually, and we've set them all out, for people who are really interested, in our most recent update on our children's strategy. Some examples include that X, formerly Twitter, stopped serving adverts to under-18s and removed geolocation sharing. Instagram defaulted to safer settings for teens following our engagement with them. We've seen BeReal and SendIt improve their privacy protections: SendIt stopped automatically putting geolocation information in children's profiles, and BeReal stopped allowing children to post their precise location online. So that's just a few examples; there are lots and lots of changes that we've seen. We're continuing to work on it, though; we're not resting on our laurels. And one of the reasons why we're particularly keen to focus on making sure the right approach to age assurance is in place is because all of those protections only work if an organisation knows that somebody is a teen or a child, so age assurance is a critical element of this.
Speaker 1:Can I ask you how they do that? Because that again is something that kids can get quite savvy to. If it's just, you know, what's your date of birth, what year, I mean, you can easily, as a child, figure that one out. Are there any, you know, fancier ways?

Speaker 2:There are fancier ways, yes. And actually that's an area where the technology is improving all the time. The most common fancy way of doing it is using a picture of the face; the system will then make an estimate of how old the person is based on their face. Obviously that's not 100% accurate, but it's pretty good now. And there are methods in place where, if it's denied you access, if you're a particularly youthful-looking adult, for instance, then you can go on and show what's called a hard identifier, so something like a passport or a driving licence, to say no, no, come on, I really do need access.
Speaker 1:I'd love to have that problem. I know, it would be great.
Speaker 2:So that's the most common. There are also approaches, which tend to be more for proving someone is an adult, as opposed to proving somebody is old enough to use a social media site, which are based more around using a proper ID: you can get a kind of verified online identity, which will then be shared with the site in question. It might be using credit cards as well, that kind of thing. Those tend to be for adult sites, because they are very robust in terms of not allowing children onto those sites. Again, the accuracy of those is improving all of the time. And one of the things that we're also interested in with those kinds of services, whether that's using the face or using hard ID, is making sure that they're not then sucking up lots of data and using it for other things, so that they're just using the least amount of data they need about you to perform that service and they're not using it for anything else.
Speaker 1:And then it's being deleted, so to speak. And this brings to mind also that different platforms have different minimum ages for use. I would presume, though, that if you are considered a child under 18, then no matter what platform you're on, even if some parents give access to platforms earlier than the recommended age guidelines, this kind of age identifier would then protect the child no matter what. Am I making sense there?
Speaker 2:Yeah, so-.
Speaker 1:In a convoluted way.
Speaker 2:Yes, definitely, it works in different ways. If you've got something like a social media site where you could use it at, say, 13, which is often the age that they say is the minimum age, then the age assurance will check whether it thinks you're 13. If it's confident that you're 13, you'll be allowed to set up an account, and all of the relevant protections that they have in place for child users will then kick in for that account until the point at which you reach adulthood, and then you'll normally be given a kind of reminder about whether you continue with those or whether you change your settings. So that will generally mean that things like the default settings are all at the most privacy-preserving for that group of under-18s who are old enough to use the service but still children.
Speaker 1:Okay, so when do you see that becoming a reality across these platforms, or is that too hard?
Speaker 2:No, some platforms are already starting to use it, and others are, I think, not far behind, so we expect to see it becoming increasingly common. We are encouraging it to become increasingly common, and we'll also be continuing to look in more detail at those organisations who aren't doing anything in this space.
Speaker 1:And how do you hold them to account? You know, these are huge, multi-billion-dollar companies. How do you make them do what you want them to do?

Speaker 2:So we have a whole range of enforcement powers, including the ability to levy quite large fines in the most severe cases. For instance, I mentioned TikTok: we fined them a significant amount, I think that was over £12 million. Organisations will have the ability to appeal against that, to contest it through the courts. But we are able to use a whole range of regulatory tools. We're also able to do things like information notices, where we compel them to give us information about how they're using data, and we can do audits to make sure that the right approach is being taken. So we have a range of tools in our box. We also work very closely with Ofcom, who are the online safety regulator; our remits obviously bump up quite closely together in this area, so we work hand in hand with them as well. And they have similar powers, including the ability to fine organisations who aren't adhering to their online safety requirements.
Speaker 1:That's great. So there is that accountability: you can look underneath the hood of the vehicle and see what's going on, and that's where change will then happen, and you are seeing it happen. If you were to look into the future, where do you see the focus moving forward? And one of the things, and I know you probably get asked this a lot too, is artificial intelligence, right? I've heard some horror stories of kids using artificial intelligence, having very inappropriate conversations. Where do you see your focus going in the future to keep kids safe in this area?
Speaker 2:Yeah, we're continuing to focus on recommender systems and on some of the areas that I've outlined. We're in the process at the moment of doing a bit of a stock-take to think about what will be next in terms of where we prioritise our resources. Obviously, we only have limited resources and there's always a lot going on, so we're thinking hard about where we can have the biggest impact. We will, I think, continue to look at age assurance because, as I've said, you have to know that you've got a child on your site to take the action to protect them. And then there will be some choices for us. There's lots of things that we're thinking about at the moment, including whether we look at the use of technology in schools and education, or whether we look at other sectors like gaming, and how we address the increasing use of AI in those settings.
Speaker 2:Yeah, well, exactly, and more broadly. So, there's lots on the horizon and we're currently working out exactly where we target our activity to have the biggest impact and take the biggest burden off of parents.
Speaker 1:I mean, they have a saying in Africa. You know, you can only eat an elephant one bite at a time, and it kind of sounds like you've got one of those tasks.
Speaker 2:A bit of a herd of elephants, I would say.
Speaker 1:And the minute you get one thing sorted, technology just seems to be moving at a rapid rate. Is there a way that parents can help? Is there a way that they can advocate, that they can support you guys, whether that is signing petitions, or writing to MPs, or just generally doing things as well, like using the reporting functions that services themselves have?
Speaker 2:All of those actions can help and we will keep engaging as well, and so, if you do have concerns, you can always come to the ICO and find out more from us too.
Speaker 1:Okay, that's great, and that's actually a good point with the reporting functions, because maybe you don't take the time to do that, but then actually, if you are able to audit them, then that creates a trail, doesn't it?
Speaker 2:Yeah, indeed, and those reporting functions are important for organizations too to understand where there are problems. And I think it is important to say that when we engage with organisations generally they are keen to make changes, to improve. So if they're getting information through those kinds of reporting functions that there are problems, that helps too.
Speaker 1:Okay, so there's generally goodwill, is what I'm hearing from you. It doesn't feel that way when you're sitting on this side of it, trying to manage it.
Speaker 2:No, it will always vary a bit from organization to organization, but many of the organizations we engage with are keen to take steps to improve.
Speaker 1:Yeah, which is good, because ultimately, you know, many of us have children, and I imagine many of the people working in those organisations have children as well. So, you know, who wouldn't want to keep children safe? Well, it has been an absolute pleasure speaking with you today, and I like to end my podcast with: what three top tips would you give parents on this topic that they can put in their back pocket and take home with them today?
Speaker 2:Oh, let me think. I think number one is have the conversations with your children, be interested in their online world, even if it means having to have very long conversations about Minecraft.
Speaker 1:Been there.
Speaker 2:Yeah, absolutely. Number two is check those default settings, whether you've got young children and you're doing it for them, or whether that's part of a conversation with older children and checking in. And number three is related to the first two: it's not seeing it as a one-and-done. Keep dipping back into that conversation, keep dipping back into those default settings, because it's very easy to assume that you've had that conversation once or you've checked those settings once and it's all fine. But, as you said yourself, children get around those, they change those, and they themselves change and evolve. So it needs to be a regular thing.
Speaker 1:It's such a good point. And I said to my daughter, I check her phone, right? I look at it, I look at what she's been doing, I look at her browsers, I look at stuff like that. And she said to me, Mommy, it's not fair that you're looking at my phone; I'm not allowed to look at your phone. And I said, no, it's because it's my duty to keep you safe, just like when you learned how to cross the road. And then later I overheard her with a friend who was upset that her mother had been looking at her phone, and I heard my daughter say, it's just because your mother loves you and she's trying to keep you safe.
Speaker 1:I was like, yay! But it's so true, because you don't know. And then sometimes I'll just be outside where I can hear what she's listening to and what she's watching, because then I can hear what it is without her, you know, input. And I think that's important too: what are they interested in? It's not spying; it's just like if they were playing with their friends and you might hear their conversation. You just check in every now and then to make sure that things are okay. It's not natural for us to do that, so you have to kind of make the effort at first, and then it becomes just normal to us as well.
Speaker 2:Yeah, I totally agree.
Speaker 1:The dinosaurs, yeah, exactly. I think those are three really great tips and a really helpful discussion today. We'll include in the show notes where people can contact you and where the Children's Code is, so people can read that as well. And, you know, keep fighting the fight. I'm so happy that you're out there, that you're protecting our children and holding these organisations to do the right thing, and that they have the will to do it. It makes it less scary, I think.
Speaker 2:Good and thanks very much for the opportunity to come along today. It's been a pleasure.
Speaker 1:Thank you for listening, Send Parenting Tribe. If you haven't already, please click on the link in the show notes to join us in the private Send Parenting WhatsApp community. It's been wonderful to be able to communicate with everyone in the community and for us to join together to help each other, to navigate challenges and to also celebrate successes. Wishing you and your family a really good week ahead. Thank you.