The Science Pawdcast

Episode 21 Season 7: The Environmental Price of A.I. and Touching Pets

Jason and Chris Zackowski Season 7 Episode 21

Send us a text

We dive into the environmental impact of AI technologies and explore the science behind how pet touch affects human well-being.

• Generative AI consumes significant energy with a ChatGPT query using approximately 10 times more energy than a Google search
• By 2028, data centers could account for 12% of US energy use, up from 4.4% currently
• Companies rarely disclose the true energy costs of training and running AI models
• Smaller AI models can achieve similar results with dramatically lower carbon footprints
• Practical solutions include choosing efficient models, using AI during off-peak hours, and minimizing unnecessary words in prompts

• Recent study of 443 pet owners shows both giving and receiving touch contribute to owner well-being
• Stroking pets shows stronger positive effects on well-being than hugging or holding
• Pet leaning behavior (like when dogs press against their owners) is particularly beneficial for emotional health
• The act of giving touch to pets may be more beneficial than receiving touch from them
• The physical mechanisms of touch involve specialized nerve fibers that respond differently in hairy versus non-hairy skin


Our links:

Our Website!  www.bunsenbernerbmd.com

Sign up for our Weekly Newsletter!

Bunsen and Beaker on Twitter:

Bunsen and Beaker on TikTok





Support the show

For Science, Empathy, and Cuteness!
Being Kind is a Superpower.
https://twitter.com/bunsenbernerbmd

Speaker 2:

Hello, science enthusiasts! I'm Jason Zackowski. And I'm Chris Zackowski. We're the pet parents of Bunsen, Beaker, Bernoulli and Ginger.

Speaker 1:

The science animals on social media.

Speaker 2:

If you love science.

Speaker 1:

And you love pets.

Speaker 2:

You've come to the right spot, so put on your safety glasses and hold on to your tail. This is the Science Podcast. Hi there, welcome back to the Science Podcast. We hope you're happy and healthy out there. This is episode 21 of season seven, and it's a minipod because we're talking to you from the past. This is scheduled for when we're on holidays, which is odd. Hello, future Chris.

Speaker 1:

Hello, future Jason.

Speaker 2:

We're only doing two articles this week just because we don't have a lot of time before we go. It's been a very busy week. Are you excited to head to the island?

Speaker 1:

I am, Jason. It's been a while since we've been on a nice holiday together. You definitely haven't gone anywhere since COVID. I, of course, got to spend time in the eastern part of Canada in the Maritimes with Adam and the Royals, and you missed out on that and you gave yourself food poisoning.

Speaker 2:

So it's okay. Last summer we didn't do anything because we had Bernoulli. I don't think there's any trip that would be as fun as having a puppy. He was just fun every single day.

Speaker 1:

That's right, and that was a decision that we made when I said, hey, what about getting a new dog? But that would mean that we would be spending the summer with the puppy.

Speaker 2:

Yeah, and that was fine. We were thinking about maybe going in the last bit of August, when the puppy was older, and then Bunsen, of course, got sick, and we're like, nope, that's the end of this. He survived, of course, so it all worked out. I'm gonna miss the dogs big time. I know Bunsen and Beaker will be fine. I don't know what it is, but I'm just feeling like I'm gonna miss Bernoulli so much because he's the baby.

Speaker 1:

I think he'll be fine, though. He is the baby, but I know our dogs will be fine. They're in good hands and we just love them so much.

Speaker 2:

All right, what's on the show this week? In science news, we're going to talk about the impact of AI on the environment, and in pet science, there's a study about being touched as a human, how it's beneficial, and perhaps the role that pets can play in those emotional benefits of touch. All right, let's get on with the show. There's no time like science time. This week in science news we're going to talk about generative AI and its environmental impact.

Speaker 2:

Now, we've posted some AI stuff in the last four months and, for the most part, people like it, like the generative Google Veo 3 videos. But we do get some very angry comments every single time we post it. Even back when we would occasionally post AI photos, it actually caused people to DM us, unfollow us and leave angry comments. So I thought we would look at the environmental impact of this generative AI stuff, as that is one of the things that some of the upset folks have raised in our comments on social media.

Speaker 1:

Because you think, oh, create this cute picture of a dog looking like it's a pop head. It actually does take a lot of energy to do that, more so than you think, because it generates the image quite quickly. It's, oh, let me generate that for you, and seconds later you have your dog dressed up as a Barbie in a box.

Speaker 2:

I think the one thing that's starting to become very clear to me, though, Chris, is that AI is now everywhere. My mail app on my phone has AI. Do you have that on your phone? If I go into my mail app, it summarizes some of the unread messages, and when people leave voicemails on my phone, it summarizes what the message was using AI. It's everywhere. I don't know if we can get away from it. A lot of it is extremely handy and time-saving, but what we're going to be looking at is the environmental cost.

Speaker 1:

So OpenAI is a company, and the CEO, Sam Altman, claimed that the average ChatGPT query uses energy similar to running an oven for just one second. Experts chimed in and argued that figure lacked context, because the definition of an average query is quite unclear. But we do know that each AI interaction consumes energy and emits carbon, which raises those environmental concerns.

Speaker 2:

So if we look at LLMs, that's large language models, they're described by the number of parameters, or internal variables tuned during training. The more parameters you have, the greater the learning capacity, but also the higher the energy demands. So the better the AI, the more energy it requires to both train and give you the answer. For example, I actually pay for ChatGPT, and GPT-4 has 1 trillion parameters. These models are hosted in huge data centers with very powerful GPUs to process queries quickly, because the last thing you'd want, if you're selling a product, is to ask a question and have it get back to you an hour from now. Early LLMs used to do that. Now, these data centers already account for about 4.4% of US energy use, and the shocking stat is that could rise to about 12% by 2028. So only three years from now.

Speaker 1:

Yeah, and think about what AI can do three weeks from now, never mind three years from now.

Speaker 2:

Yeah, that Google Veo stuff still blows me away, and we'll talk about the energy cost of it, but that's the prompt-to-video tool. I made Bunsen, Beaker and Bernoulli snowboard with Norbert and it was cool. I just typed it in and it made them do it, which is wild.

Speaker 1:

And it's getting more accurate as it gets more information. Now, I want to talk a little bit about the challenges in measuring the LLM carbon footprint. There is a training phase, which requires weeks of processing on thousands of GPUs. Adam worked in the tech department at Staples, and he sat me down and talked about GPUs and CPUs, and I listened to him and he was so passionate about it, and then I was like, are you mining Bitcoin in the basement? He was talking about GPUs and CPUs and, I'm not gonna lie, it went over my head.

Speaker 2:

GPU stands for graphics processing unit, for everybody who's listening.

Speaker 1:

Yeah, and it's different than the CPU, which is the…

Speaker 2:

Central processing unit.

Speaker 1:

During the training phase, and I did get off the rails there talking about the GPU, it does require weeks of processing on thousands of GPUs, and it has a high energy use. But companies rarely disclose the data, like how much computational time, what kind of energy, what kind of data is being used. And so the emissions from training are largely unknown. It's in a mystery box, right? You don't know what's there. And then, after the training phase, there's the inference phase.

Speaker 1:

The inference phase. So emissions will occur every time a model is used. You ask ChatGPT something, that's your query input, and people are gravitating towards ChatGPT as opposed to Google, even for simple questions that Google typically would have been able to answer. That's a challenge, because ChatGPT takes up more energy. The inference emissions are expected to surpass training over a model's lifetime, and that impact will vary: it varies by the data center where the model is located, it varies by the source of energy on the grid, renewable versus fossil fuel, and also by time of day. So whether you're at peak or off-peak power demand has an impact on the emissions in the inference phase.

Speaker 2:

So estimating how much energy large language models use is tricky, because these companies really don't want to tell everybody how much energy went into training. And if it's bad news, I can see why they're not going to tell everybody that. Without political or social pressure, we'll never know. If nobody's telling them to tell the public how much energy they're using, and it's a lot, they'll be like, why do we need to tell you this? This is bad news for us.

Speaker 2:

Okay, so we can make some inferences about how much energy is being used, but we have to look at it at scale, right? It's difficult to measure for one person; we have to look at usage at scale. So a 70 billion parameter reasoning model answering around 600,000 queries, like asking it a question to do something, produces about the same emissions as a round-trip flight from London to New York on a plane. And this is current models, because things change so fast in technology that they're constantly using better cards. Nvidia's new H100 chip is even more powerful than the A100, and that means it's going to use more energy. Another thing you have to take into account is that this is just the energy to answer the question, Chris. These systems also have to be cooled. They get so hot that an enormous amount of energy goes into keeping them from overheating and melting down. Like Adam's computer, he's got some weird cooling system, right? Like some crazy cooling system.
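To put that flight comparison in rough numbers, here is a minimal back-of-envelope sketch in Python. The 600,000-query figure is from the episode; the round-trip flight emissions value is an assumed ballpark of about one tonne of CO2 per passenger, not a number given in the episode, and the daily query volume is purely hypothetical.

```python
# Back-of-envelope sketch of the scale comparison above: a 70B-parameter
# reasoning model answering ~600,000 queries emits roughly as much CO2 as one
# round-trip London-New York flight. The flight figure is an assumption
# (~1,000 kg CO2 per passenger), not a number from the episode.

FLIGHT_EMISSIONS_KG = 1_000      # assumed round-trip London-NYC, per passenger
QUERIES_PER_FLIGHT = 600_000     # figure cited in the episode

per_query_g = FLIGHT_EMISSIONS_KG * 1_000 / QUERIES_PER_FLIGHT
print(f"~{per_query_g:.1f} g CO2 per reasoning query")  # ~1.7 g

# The per-query number looks tiny, but it multiplies fast at scale:
daily_queries = 10_000_000  # hypothetical daily volume, for illustration only
daily_tonnes = per_query_g * daily_queries / 1_000_000
print(f"~{daily_tonnes:.0f} tonnes CO2/day at {daily_queries:,} queries/day")
```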

Speaker 1:

Yeah, and you go into his room and he has Annalisa's computer in there, so that's just double the infrastructure to create heat, and I'm thinking we have air conditioning and we're cooling down the computers that are ramping up.

Speaker 2:

So determining how much energy is used in these large language models is difficult, and the estimations online are invariably off, and they're probably low. That's, I guess, their conclusion. So what can we do about this?

Speaker 1:

Well, there are definitely ways to reduce AI's environmental impact. For sure, choose a different model, use a smaller model. For example, a standard model, Qwen 2.5, matched the reasoning model Cogito 70B's accuracy while using less than one third of the carbon footprint. And there also are some available tools, which, you know, I liken to being in the car, where you think about how much fuel you're using and you watch the fuel gauge go down as you're driving around town.

Speaker 1:

If there was some kind of meter that you could look at and be like, whoa, my reasoning task query is using this much energy, that might be beneficial to help curb the habit of putting every question into AI. So, Hugging Face's AI Energy Score ranks models by efficiency across tasks, and ML.Energy provides similar comparisons, which helps users choose greener models. There are also behavioral changes. Instead of going at the most power-hungry time, use AI at off-peak times, so the cooler periods have less grid strain. And here's another one: minimizing unnecessary words in prompts.

Speaker 2:

I love this one.

Speaker 1:

I know, I like to be very polite when I'm talking to AI, please and thank you, but that's needless because it just adds to the processing. So if you refrain from politeness, that will reduce the processing. And some people are actually quite friendly with AI and ChatGPT, they're asking it philosophical questions, which is another branch we could talk about on another podcast. It's like the loneliness situation that people are experiencing in some areas of the world, and so they're seeking other outlets. But that outlet, without a gauge on it, could definitely be ramping up the consumption. And there's always the opportunity for implementing policy, like energy rating standards for AI models, similar to how your appliances in the laundry room, your washer, dryer, your microwave, have energy labels written on them.
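As an aside on that prompt-trimming point, here is a minimal sketch showing how politeness padding inflates the number of tokens a model has to process. It assumes the open-source tiktoken tokenizer purely for illustration; exact token counts, and the energy cost per token, vary by model and provider.

```python
# Minimal sketch: how much politeness padding adds to a prompt's token count.
# Uses the open-source tiktoken tokenizer for illustration; exact counts (and
# the energy cost per token) depend on the model and provider.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

terse = "Summarize this article in three bullet points."
polite = ("Hello! I hope you're having a wonderful day. Could you please "
          "summarize this article in three bullet points? Thank you so much!")

for label, prompt in [("terse", terse), ("polite", polite)]:
    print(f"{label:>6}: {len(enc.encode(prompt))} tokens")
# Every extra token means extra processing on the data-center side,
# multiplied across millions of daily queries.
```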

Speaker 2:

Like a transparency, right? Like some kind of transparency.

Speaker 1:

Yeah, so if you're at the store and you're comparing models, they all have a sticker on them so you can make a more informed choice. But then also, I liken this to the e-scooters in town. The company that our city leases from has max speeds on the scooters, so you can only go a certain speed on them, versus buying a scooter from a different store that doesn't have that policy or restriction, and so you could be clocking 50 kilometers an hour and you could do some serious damage to your body if you got into an accident at that speed. Another policy suggestion is requiring high-usage models, so those with greater than 10 million daily users, to meet minimum efficiency grades, like being scored B plus or higher.

Speaker 2:

We literally had to get some new lights and they said the incandescent bulbs are being phased out. Like, you literally aren't even going to be able to get them anymore, Jason.

Speaker 1:

That legislation came in 2013. I taught about it in Social Studies 9, lights out for inefficient bulbs. It's taken a while; 2013 was the target year.

Speaker 2:

Yeah, but I guess we don't really have that long to wait, because the big challenge is, as AI grows, there may not be enough energy. You ask ChatGPT for help with your homework and it's the straw that breaks the camel's back in your area, and you lose power and there's a brownout. So if AI accounts for 12% of the entire energy demand, the rest of the energy use has to go down or societies have to make more power, one of the two.

Speaker 1:

Something's got to give, right? And something needs to improve. I'm just letting my students know, I'm letting people know, that ChatGPT does not do math very well.

Speaker 2:

Oh yeah, it's not great at everything, for sure. It's good at some things and it's terrible at other stuff.

Speaker 1:

You would think procedural, step-by-step math it would be able to handle, but it cannot distribute a negative into a set of brackets. It's so bizarre.

Speaker 2:

Who knows what we'll be able to do in three years, though, chris.

Speaker 1:

Who knows?

Speaker 2:

Okay. So I guess it's tough to make estimates about how much energy is being used, but as we close, I have this table that was put together by an engineering firm. A single Google search uses about 0.0003 kilowatt-hours. A ChatGPT prompt uses about 10 times that energy, and the early Google Veo 3 estimate is about twice that again. So those Google Veo 3 estimates may be about 20 times the energy of a single Google search. Now, it could be way worse than that, because we have all of these unknown variables, but that's what the data today says.
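Those ratios from the table work out like this in a short sketch. The numbers below are the episode's early estimates with lots of unknowns, not measured values.

```python
# Rough comparison from the table mentioned in the episode (early estimates):
# a Google search ~0.0003 kWh, a ChatGPT prompt ~10x that, and a Veo 3 video
# generation ~2x a ChatGPT prompt, i.e. ~20x a single search.

GOOGLE_SEARCH_KWH = 0.0003
CHATGPT_PROMPT_KWH = GOOGLE_SEARCH_KWH * 10   # ~0.003 kWh
VEO3_VIDEO_KWH = CHATGPT_PROMPT_KWH * 2       # ~0.006 kWh

for name, kwh in [("Google search", GOOGLE_SEARCH_KWH),
                  ("ChatGPT prompt", CHATGPT_PROMPT_KWH),
                  ("Veo 3 video", VEO3_VIDEO_KWH)]:
    print(f"{name:>14}: {kwh:.4f} kWh ({kwh / GOOGLE_SEARCH_KWH:.0f}x a search)")
```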

Speaker 1:

And that's only for one query or one video. How many people are making a video in an hour? What is our global population? 8 billion.

Speaker 2:

Yeah, those Yeti videos are extremely popular on social media, like they're everywhere. There are thousands of accounts making these Yeti videos. I don't know if you've even seen them yet, Chris, but they're everywhere.

Speaker 1:

I've only seen them because you've shown me.

Speaker 2:

Yeah, they're on my algorithm because I watched a few. All right, that's science news for this week. This week in pet science, we're going to talk about friendly touching. Now I think I should probably start by saying I do not like being touched by people I do not know, or even people I do know, or even my friends or my coworkers. I have a very small group of people that I enjoy being touched by.

Speaker 1:

Yeah, Jason. And so for me, I'm not necessarily a touchy-feely kind of person, but a gentle touch on the shoulder is part of my love language. And part of my love language is, hey, let's hold hands as we walk down the street, and you are not a hand holder.

Speaker 2:

I'll do it for you.

Speaker 1:

You do, which I appreciate, but that took a little while for me to reconcile, right? So you will hold my hand, but you are like a noble gas. You're like, oh, I'll bond with you, but I just want to be hanging out as my own element, right?

Speaker 2:

Yeah, I'm not blind to the science. I absolutely know, and I've read studies, that people do need touch in their day. Friendly human touch is known to enhance well-being, emotionally and through your senses, and I guess our modern society limits that for people who need it. Maybe it's taboo, you don't touch people, or maybe it's because we're more separated and more sequestered. This study we're going to look at is about how pets may fill this gap, and maybe that's where I get my fill from: I'm okay with my family touching me, and hugging and holding hands with you, but I touch the dogs all day long. All day long I touch them, and they seem to be okay with it. Ginger less so; she's sometimes, you can touch me.

Speaker 1:

Every once in a while. Now, what I really liked about this study is it basically incorporated a literature review, talked about a lot of the prior research and made connections to it. The prior research has focused mostly on human-to-human touch benefits, and lots of studies are out there about newborn babies and skin-to-skin contact with their mother. But prior research has shown mixed results regarding pet ownership and well-being, and has rarely explored the specific role of touch type, so whether you're giving pets to your dog or receiving a boop, in pet-owner interactions.

Speaker 2:

So it's pretty conclusive that touch is super important for humans and, as you said, especially for the little ones. But this one's looking at pets. So we have 443 pet owners, more cats in this study, which is interesting, 246 cats and about 197 dogs, and they range from Europe to Asia, from Austria to Hong Kong. That's the area they're drawing from. The mean age is about 30 years, and it's diverse in gender, occupation and health status. There was an online survey that assessed owner and pet touch types: how frequent, how long, what the context was, and what the emotional response to the touch was. And then there was a subjective well-being analysis on a validated scale, so they said how they felt. You do much better with the data analysis, being a math brain, but I think they did pretty well with their data analysis.

Speaker 1:

For sure. So they did go into a lot of detail about their controls. They used multiple regression models that controlled for mental health, and then they did separate models for owner touch and pet touch, which is the reciprocated touch, and they standardized the variables to interpret the effect sizes in their data analysis. And they had some pretty important key findings.
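For listeners who want to see what that kind of analysis looks like, here is an illustrative sketch only, not the study's actual code or data: a multiple regression predicting owner well-being from touch variables while controlling for mental health, with everything standardized so the coefficients read roughly as effect sizes. The variable names and the simulated data are hypothetical.

```python
# Illustrative sketch (hypothetical variables, simulated data): standardized
# multiple regression of owner well-being on touch variables, controlling for
# mental health, in the spirit of the models described in the episode.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 443  # sample size reported in the episode

df = pd.DataFrame({
    "stroking_freq": rng.normal(size=n),
    "stroking_duration": rng.normal(size=n),
    "pet_nudging_freq": rng.normal(size=n),
    "mental_health": rng.normal(size=n),      # control variable
})
# Simulated outcome, only so the example runs end to end
df["well_being"] = (0.2 * df["stroking_freq"] + 0.1 * df["pet_nudging_freq"]
                    + 0.5 * df["mental_health"] + rng.normal(size=n))

# Standardize all variables, then fit OLS with an intercept
z = (df - df.mean()) / df.std()
X = sm.add_constant(z.drop(columns="well_being"))
model = sm.OLS(z["well_being"], X).fit()
print(model.summary())  # standardized betas approximate the effect sizes
```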

Speaker 2:

So here are the somatosensory variables that the study looked at and, I guess, their findings. Somatosensory means a sensation which can occur anywhere on your body, okay, so that's what somatosensory is. With owner touch, stroking frequency and duration positively predicted owner well-being. Stroking effects were stronger than for other touch types, hugging and holding, and the area of physical contact during stroking also predicted better well-being. And with pet touch, pet nudging frequency had a small positive association with owner well-being. What's a pet nudge? Is that like when Bernoulli dolphin-punts you into petting him?

Speaker 1:

Or like he rubs up against you. Well, that's funny, because if we're sitting at the table and Bernoulli comes over, he's like, boop, and his head is so strong that your elbow flies off. I'm like, I would say that would be a negative association.

Speaker 2:

Yeah, because you throw your coffee everywhere if you're not careful.

Speaker 1:

Yeah, exactly, but with Ginger, when she comes close she's very gentle, and so maybe that came more from the cats.

Speaker 2:

Ah, and then it does say pet rubbing frequency negatively predicted well-being. I don't know what that means. Maybe they rub up against you if you're feeling sad. So I guess, if your pet's rubbing you all the time in this study, that negatively predicts how well you were feeling.

Speaker 1:

Oh, because that could reflect the pet's sensitivity to the owner's distress. So the owner might have been feeling negative in their well-being and the pet's like, oh wait, what's going on? Or they could have gotten some shocking news from their bank account and they're like, oh no, and then the last thing that they might want is a pet rubbing against them as they're fretting about something that might've happened.

Speaker 2:

Yeah, pet leaning was associated with a positive effect that predicted owner well-being, and that makes sense. When Bernoulli and Bunsen sit on your feet and lean into you, that is a good feeling. That is such a good feeling. Bernoulli was leaning into you the other day and he was so happy, and we just had to stop and be part of that moment because it was such a fun moment.

Speaker 1:

Yeah, yeah, I love it. I absolutely love it when the dogs come in and lean in, and that's a trait of the Berner, to come in and do the Berner lean, as it's called. But if they sit on your feet, you are absolutely chosen.

Speaker 2:

And negatively associated with well-being is if you're resting and your pet is jumping around. I thought that was a fun finding, and it makes sense. You're trying to sleep and the cat's jumping on your head, or I guess the dogs are jumping around. It would be frustrating. Or if they're unexpected.

Speaker 1:

You're sitting there and then, unexpected jump on the lap, ah, I didn't ask for this. There's not really consent.

Speaker 1:

Sometimes they're like, I'm going to jump on you, and there's no time to be like, okay, that sounds like a good idea. Another part of this study looked at touch giving versus touch receiving, and both giving, so stroking your animal, and receiving, which comes in with the nudging and the leaning touch, contributed to owner well-being. And if an owner is giving touch, like stroking, that was particularly significant, despite a more traditional focus on receiving touch in the social touch literature. So again, this study really went into the other research that has been done, and they talked about some potential mechanisms. This is where bodily science, biology, comes in. It talked about Aβ fibers and C-tactile afferents activated via non-palmar skin contact, and so they talked about your palms not having hair versus other areas of your body that have hair, and the hairy parts tend to respond more to that kind of touch than the non-hairy parts.

Speaker 1:

And there are emotional and relational benefits from engaging in caregiving touch.

Speaker 2:

So I think, as we wrap up and we look to the future, it'd be interesting, as they noted, to look at this long term. Long term, do people with pets that they can touch and be touched by have better well-being than people who don't have pets? And then also, how do the pets respond, and what is their response relative to the owner's well-being? Luckily we have three dogs, and Ginger too, and all of our animals like to be touched. Beaker does not like to be hugged. She does not, so if you hug her she'll tolerate it, but she doesn't like it. Whereas Bunsen and Bernoulli, they love a hug, they love being hugged. Bernoulli especially, he just loves a hug, especially from Adam, hey?

Speaker 1:

Yeah, he does a trust fall into Adam and he's like, hug me, please hug me.

Speaker 2:

Yeah, but where I'm going with this is, if your pet doesn't really respond well to being touched, perhaps that has an impact on your well-being. Like, you go to pet your dog and it runs away from you. That would be heartbreaking. And then also, they only looked at cats and dogs. It would be interesting to look at other animals that like to be touched. I see a lot of people on social media with their birds and their parrots, and the parrots like to get petted. I don't know if you'd want to get petted by a parrot, though. They've got scratchy feet. But I digress.

Speaker 1:

But I think a parrot will nudge you.

Speaker 2:

Yeah, that's true, yeah.

Speaker 1:

Which is part of the touch that was looked at in this study. So this study offered some conclusions. They talked about how pet-owner touch, especially stroking, shows a small but meaningful association with owner well-being. The somatosensory variables, which are bottom-up, are things like frequency and duration, and the non-somatosensory variables are the context the situation is happening in. Both somatosensory and non-somatosensory factors play roles, but tactile stimulation appears more consistently beneficial to owner well-being, and the study highlights that the nature and context of touch matter, and that touch giving may be as or more important than touch receiving in human-pet interactions.

Speaker 2:

There you go. It's nice to have some good news, and the good news is that petting your pet, it seems based on this study, is good for your well-being. It makes me realize I'm going to miss my dogs, and Ginger, even more when we're gone, but definitely it's going to be a big thing we're not going to have around. Maybe there'll be some dogs on the island we'll get to see.

Speaker 1:

And that's what I did when I went to Halifax and Prince Edward Island. I saw all the dogs and I petted all the dogs.

Speaker 2:

Yeah, all right, that's Pet Science for this week. That's it for this week's show. Thanks for coming back week after week to listen to the Science Podcast. And a shout-out to all the Top Dogs, that's the top tier of our Patreon community, the Paw Pack. You can sign up in our show notes. All right, Chris, let's hear those names that are part of the Top Dogs. For science, empathy and cuteness!