Info Window

The Cost of Social Media

April 28, 2021 Season 1 Episode 3

In this episode, we discuss what it really means to be present on social media. Our guest is Dr. Jeremy Lipschultz from the Social Media Lab here at UNO, and Calli Carlson also gives us some of her input. We hope you learn more about misinformation and social media in this episode, and don't forget to take the survey to be entered into a raffle for a chance to win an Amazon gift card!

Link to the survey: 
https://docs.google.com/forms/d/e/1FAIpQLSea-RSkOSGwHJp7YJCb8tX8IbMm1xwJcr6jvF_SzpG2fqX2Bw/viewform


Cali Carritt:

We are a part of the College of Business Administration Scholars Academy at the University of Nebraska Omaha, and we have collaborated with the Department of Homeland Security to bring you this podcast. With this podcast, we hope to inform you about the spread of misinformation and give you the tools to combat it in the real world. Also, don't forget to follow us on Instagram at Info Window UNO. I am Cali Carritt,

Phillip Stelling:

And I am Phillip Stelling.

Cali Carritt:

Here with us today is Dr. Lipschultz. Doctor, would you tell us a little bit about what you do here on campus?

Dr. Lipschultz:

This is my 32nd year at UNO, and I am the lead professor in the UNO Social Media Lab for Research and Engagement. I teach a course called Social Media Measurement and Management, as well as another one called Communication Law and Policy. I'm also interested in media entrepreneurship theory and research methods, and I've written several books about social media communication.

Cali Carritt:

Perfect. Sounds like you're the perfect guest for this.

Phillip Stelling:

That's awesome. Okay, so to start off our podcast, we have a statistic that our research team has come up with: 29% of Americans believe that social media sites are most responsible for the spread of fake news. From your research and experience, do you think that's true?

Dr. Lipschultz:

It's hard to completely nail down with survey research what's happening in the social media spaces. In my teaching and research, I focus more on the presence of social networks and how those networks behave. I think it's much more important to talk about the people around you in social media communities than it is to paint with a broad brush.

Phillip Stelling:

For sure. How do you think those people in those communities affect that exchange of information?

Dr. Lipschultz:

We've been studying personal influence in the field of communication since way before the internet and social media. Back in 1955, a couple of professors were studying the influence that people have on each other in a small rural community, Decatur, Illinois, and found that, lo and behold, people do influence each other. So if you are trying to decide about a new movie that's coming out, do I want to go see this movie, you probably have specific friends, family members, or acquaintances whom you trust when it comes to the decision about whether you should spend money to go to a movie or not. The same is true for almost every activity: products we purchase, places we go, people we meet with. So it comes down to source and message credibility. Do you believe who you're talking with, or who you believe you're talking with, and does the message you hear sound credible? When source and message credibility are high, messages are most likely to be believed. The problem that we have with misinformation within social media communities is that, in some cases, people are credible with certain groups of people even though they're really not sharing credible information.

Cali Carritt:

Yeah, to kind of go off of that: we talked to Dr. Gina Ligon, and she told us a little bit about the Russian bots and how they gain your trust. So why do you think it is that people are more likely to trust people they've never even met over social media?

Dr. Lipschultz:

Well, I think it's interesting that people let their guard down sometimes. When they're consuming all forms of media and engaging with social media, they are usually doing it to be entertained. And I think when we're in that entertainment mode, we are much less likely to be critical consumers of the information.

Cali Carritt:

Yeah, definitely. I was just going to say, we do have another statistic that our research team found: 10% of Americans have knowingly shared a fake news story. So in my mind, going off what you were talking about with entertainment, the only reason I could see someone sharing something that they know is wrong is for entertainment. Why do you think someone would still share something they know is fake, even if it's for entertainment?

Dr. Lipschultz:

Well, we know from media effects research that, in terms of persuasion, the most likely way people are going to be persuaded by information is if that information aligns with their current worldview. That is to say, reinforcement of existing viewpoints is the easiest form of persuasion. So what we see in social media communities, and we see it in social networks, for example on Twitter, is highly polarized crowds: the red crowd, the blue crowd, each sharing information with their followers that aligns with their point of view. The problem that we have, particularly in the United States compared to other countries right now, is that the two polarized worldviews don't talk to each other very much. This is the filter bubble problem, and it continues to be pervasive. If you're in a community that thinks one way about political information and you are constantly being bombarded in social media with messages that reinforce that point of view, you're never going to challenge yourself to try to understand the other point of view. And when you have polarized crowds, as we have with social networks, then we never really have deliberation. We never really reflect, we never really trust each other, we never really respect each other, and the result is we never find consensus. Part of it is a lack of leadership. If you look at, for example, Congress, and even the United States Senate, these are political bodies that a generation or two ago had their political differences but tended to find common ground. We tended to have a set of values that we could all agree upon; that's difficult right now. Part of it is that society is changing at an extremely rapid rate, both socially and technologically, and in those times of dramatic change there are ripples, and it's really hard sometimes to understand what's happening.
I mean, take Bitcoin and cryptocurrency, for example. How many Americans really understand that right now? Not many, I would say. So that leaves the door open for all kinds of misinformation about it, either promoting it or saying it's the worst thing to come along ever, and everything in between. When we don't have the facts, when we have ambiguity, it's much easier, whether it be through humans or bots, to influence people.

Phillip Stelling:

Do you think a lot of those, I guess, bots in that sense, or AI, have kind of helped spur on a lot more of that misinformation spread, showing people a lot more of what they want to see instead of what's true?

Dr. Lipschultz:

I think the algorithms from Facebook, Twitter, and other social media platforms make it easier, and TikTok is a great example right now; it's a highly focused algorithm. If I start liking cat videos, they're going to send me more cat videos than I can watch. They do that because the goal of these platforms is to attract you and keep you there so you can see the advertisements. That's their monetization; that's their model. And with that comes an ability to profile people. Once you have people profiled within filter bubbles, it's much easier to infiltrate those filter bubbles and push messages that have the most likelihood of having an impact on those groups of people.

Phillip Stelling:

Now, do you think those AI companies, or just algorithms in general today, spur people to find more like-minded information, or do you think they try to show different information to different people?

Dr. Lipschultz:

Well, I think most of the companies surrounding technology, and the whole tech sector, are in it to make a lot of money as quickly as they can. In that space there's a lack of law, because the technology is moving so quickly, and even a sense of what is or isn't ethical isn't always clear from the standpoint of people within these companies. It's a highly competitive marketplace; people are trying to survive and prosper within these spaces. I just think it's a bit of the wild, wild West. It always has been on the internet, but particularly when something new comes along, there is a period of time in which we first have to figure out what it is, lawmakers have to figure out what it is, and then they have to decide: does this need to be regulated, or does self-regulation work in this context? So I think it's a mixed bag.

Phillip Stelling:

For sure. I guess as a last thing going off of AI there: is there anything you think those tech companies can do to conduct their business more ethically and try to stop the spread of fake news articles?

Dr. Lipschultz:

Well, they need to, because they are being scrutinized now like never before by members of both political parties about how they have functioned in and outside of election cycles. I do think that they should turn to ethical models that are based upon a set of values. What kinds of words represent that? Trust, transparency, honesty, virtue, doing the right thing. We know what these are, but they need to be applied, and they need to be applied consistently. The most troubling aspect, I think, from the user side of this is that our data get collected as they've never been collected before, by private companies who don't really have to tell us how they're using our data, and in a sense they take our valuable data and use it to make more money. And the problem, I think, that social media companies face right now is that they don't seem to have that customer-service orientation that a traditional business has and tries to exercise. They're global companies, and to an extent they sort of hide behind their technology. They make a lot of postings, but if you've ever tried to talk to somebody at Twitter or at Facebook about a problem you're having with a page or an account, it's nearly impossible to get through and talk to a human being. I just don't think they've taken their users as customers very seriously. I think it's a problem across the board in Silicon Valley; it's just part of the culture. It grew so quickly that I just don't think it grew as a traditional business would. And in fact, this whole ideology of disruption that drives these companies really runs counter to traditional notions of running a good business.

Phillip Stelling:

No, for sure. Are there specific things that you yourself try to do to prevent spreading fake news?

Dr. Lipschultz:

Well, if you look at my Twitter account, that's probably the best example. I share news stories every morning at @JeremyHL. Typically I share them from sources that I spend a lot of time with. The Associated Press is a highly credible newswire that provides news from around the world. I trust them in general. It doesn't mean that once in a while they don't make a mistake; everybody makes an isolated mistake, but they have rules in place that guard the trust and credibility of information. I still read with a critical eye before I would share a story, but I find that most of their information is trustworthy, and I want my personal brand associated with trustworthy information. So I do that. There are other sources, if you look down through my feed, that I think share high-quality information. Inside Higher Ed is one; I get an email every morning that shares news stories, usually from credible sources, about what's happening in colleges and universities. So I stick to information that I know something about, so that I can critically consume it and share it when I think it's valuable to my 6,000 followers on Twitter. I'm kind of a micro-influencer on some topics: social media, journalism, and the like. And what I tell consumers of this information to do is to not blindly accept everything you see and hear. I think it's really important to exercise media literacy skills. That means maybe taking a little refresher on what you learned in a social studies or civics class, because for most of us it's been a long time since we took those courses about how government functions, about how to deconstruct a news story and ask: who is this from? Do I trust this source? Do I trust this message? All of those things, as well as slowing down. We get in such a hurry on social media.
You know, there's a temptation to just read a headline and share it. You don't want to do that; that's a dangerous practice, and that helps spread the misinformation. You were talking about bots and other bad actors: they take advantage of people within social networks who do just that, share a sensational headline about something without really thinking about it. There was a conversation this past week between two U.S. Supreme Court justices, Sonia Sotomayor and Neil Gorsuch; it's on YouTube now, and they're talking about their fear about disinformation. I tweeted that, if you want to look for the link. Their fear is that with less than a third of the public really understanding the Constitution and how our system works, we're in a very dangerous place. And, you know, you were talking about Professor Ligon; their fear is that this is really a danger to our national security, because nations typically fall not from outside invasions but from within, from people who get the wrong idea about the way things are running, the way they should run them, and what the rules are. Their example in this webinar was just how the courts work: people's misunderstanding of how the Supreme Court works, how the courts of appeals work, the lower courts, and the fact that news stories tend to dramatize the Court and emphasize the five-to-four decisions, as opposed to the majority of decisions. Upwards of 95% or more of cases are decided unanimously by the Supreme Court, and it's even more dramatic at the appellate court level, where most of the cases are heard, because the Supreme Court now hears fewer than 100 cases a year. So some of that basic understanding is really critical to consuming news, and news, I think, is really critical to having a democracy.
I'm a big proponent of the First Amendment and the belief that for us to vote on our candidates every two, four, or six years, we've got to have a good source of information, and we've got to spend some time knowing what's going on in the world and what our elected representatives are doing. That is what the Republic is about; it's a representative form of government, and First Amendment journalists are part of that. The challenge that we have going forward, and this is the one I think you face in this project: when I was a journalist, a radio journalist back in the 1970s and 1980s, way back in the last century as I tell my students, before the internet, we were the gatekeepers. We were the people who looked at the stories. In fact, what I do on Twitter today is just what I did in the newsroom. I would take a look at an Associated Press story, if I weren't covering my own local stories, and I would judge whether we were going to put it on the newscast or not. And a lot of times we didn't put those stories on the newscast. We were the gatekeepers, so there was a judge there between the newsmakers and the audience. On Twitter, there are no gatekeepers. There are influencers who push ideas for a variety of reasons, but for gatekeeping, the bar is very low. And so that requires all of us to be our own gatekeepers, and we have to do that by practicing media literacy.

Cali Carritt:

Yeah, that was all really good information; I liked a lot of the things you touched on. To go off of all that information you gave us: do you have any very specific tips for our listeners on how to stop spreading fake news? I know you said being a gatekeeper and checking your news, checking your sources and stuff like that. Do you have any more really specific tips you want to throw out there for them?

Dr. Lipschultz:

Last year, we created a document in the UNO Social Media Lab, Dr. Adam Tyma and myself. It is online if you Google it; UNO shared it out. It was a set of tips that go over what we've been talking about today: slow down in your social media consumption, really read carefully, don't just read headlines, and think before you share. Those are the most important ones, I think.

Cali Carritt:

Yeah, I definitely agree with that. It really is important to look before just pressing that share button. I know it's so easy, but you kind of need to put a little bit of work into it.

Dr. Lipschultz:

Face-to-face communication also tends to be richer than computer-mediated communication. That's another class that I teach, and it's because in CMC, we know that things get confusing sometimes when you send somebody an email or a text or post on social media. So use your friends, use your family, significant others; really have conversations over dinner about these matters. You can learn from each other.

Cali Carritt:

Yeah, I definitely think that's very important. So I have two last questions. The first is: what do you want to see done to prevent fake news?

Dr. Lipschultz:

Well, I can tell you what I don't want to see. I resist the notion that we need more government regulation, and that's because, again, I grew up as a broadcast journalist in a regulated industry, under the Federal Communications Commission, and regulation didn't work very well. It really didn't; it had different issues than the ones we have in a more deregulated environment. But all things being equal, I like innovation. I like technological and social change. I think we're trying to move toward a more perfect union; that's the goal, right? And that involves more inclusivity, more diversity, things we didn't have in a regulated environment; even though there were attempts to regulate diversity, it didn't work well. So I rely upon the marketplace in general. But the limit that I have on that is that a company like Facebook and a company like Google, and to a slightly lesser extent a company like Apple or Microsoft, really concern me because of the concentrated power that they have. So the one area that I do think needs some attention from the government is antitrust and monopoly, because for there to be a healthy marketplace, you have to be able to compete. And I think right now the evidence is quite clear that Facebook's approach was: we'll just buy Instagram. We've got enough money; why compete with them? Why try to create another Instagram? I think there are good reasons to have competition. We know, from back in the days of cable television, that when you had competitive cable markets, where there were two or more cable companies going down the street trying to connect households for the first time, there was better service at lower prices than when there was a monopoly; prices went up and service wasn't as good. So we know that from history, and I think we need to understand that we need companies.
We need fresh companies, fresh voices in the internet spaces, to offer alternatives. When we don't like what Facebook is doing, we need somewhere else to be able to go. But even Google testified in a court case that the Facebook social graph was so large and massive, billions of people, that when it tried to launch Google Plus, it couldn't compete with Facebook. Well, if Google can't compete with Facebook, who's going to be able to? So it suggests the need to break these companies up. Right now, Facebook leverages its data: if you look at an ad, even give any attention to an ad on Instagram, and then you get on Facebook a week later, that ad shows up there too. That's leveraging the data that they collected on another platform, and I think that gives them an unfair competitive advantage in this space. So I worry more about that, probably, than I do about the disinformation problem, because I think consumers get smarter over time. When something's new, they make a lot of mistakes. I know when social media launched and I was first looking at it, and I'm talking 2006, '07, '08, I'm sure I made more mistakes than I make today, because I didn't know what I was working with. I didn't know how to judge the information; I didn't know how the tools worked. We know that now, and so it just falls to us to be smarter about it.

Cali Carritt:

Yeah, that was all really good information; I definitely agree. With Facebook, I feel like I can't really escape; I'm just on every single one, and it's all kind of the same now. So, to go off of what you talked about, because you gave us some really good things that we can do: what do you think we as individuals, or we as college students even, can do to help this issue that we're encountering?

Dr. Lipschultz:

Well, I'm not so sure you can disconnect from Facebook. I know some of my friends have done that, at least for periods of time, but what I do think we can do is just generally be smarter, and this goes back to media literacy. Back in the high point of television viewing, kids were watching seven hours of television a day. That wasn't good, right? That wasn't healthy. So somebody, namely a parent or someone else, needed to intervene and say: that's too much television. Let's figure out what's really good television, spend our couple of hours on that, and then do something else. The same is true with social media. It's very addictive; internet addiction is a phenomenon that has been borne out by the data and science. The problem is that we've got these phones with us all the time, and they have things like push notifications that urge us to get back online after we get off. You were talking earlier about finals and getting ready to study. I think we need to turn those push notifications off, and I think we need to really think about and schedule our time with social media and not just reflexively use it. For example, I'm okay with looking at my feed first thing in the morning over my breakfast and coffee; that doesn't tend to disrupt the rest of my life. But during my workday, I tend to tune it out. I may come back at the lunch hour and catch up a little bit, and then I won't pay much attention to it again unless there's something dramatic happening and I find out about it and want to go to social media to get a little more information. Like the other day, the shooting at Westroads led me to go back to social media just to see what was happening. But just normal events can usually wait. So I'm not saying completely disconnect from your social media; there's value there. But I'm saying don't live on social media.
There's a real life to be lived, and you don't want to miss that. I mean, yeah, if you're taking a picture of a flower and posting it on your Instagram, that's great. But then put the phone away and enjoy nature and enjoy the beautiful day that we had over the weekend. Realize that the time you spend is valuable. And I think sometimes students, and I'll speak as the old guy here, don't realize how quickly it goes. They hear that it goes so quickly, but I can tell you, 32 years later at UNO, it feels that fast to me now, from my first day in class to me talking to you now. So I think if you can keep that mindset, that time is our most valuable commodity. We used to take our students to talk with Warren Buffett, until he quit doing that a few years ago, and that was really his most important message to the students from around the world that he met with. He said: if I could change places with you, I would gladly do it, to be 20 years old again and have the opportunity to live another 60 or 70 years. The most valuable thing we have is life itself. So if you can keep that in mind, then I think it's a little bit easier to reorder your priorities and ask yourself: how much of my life is Instagram worth? How much of my life is TikTok worth? Because I don't get that time back. And I think that's an especially important question for people your age, because in a sense you're at the most important point in your life as you finish college and go out into the work world. It's the point at which, for example, going back to the Buffett example, if you invest in almost any good company, you'll be very wealthy when you're an old person. So there are better ways to spend your time than being amused all of the time. That doesn't mean you can't be amused some of the time by a TikTok video, but just realize how much time you're spending and act accordingly.

Cali Carritt:

Yeah, that was my last question, and you gave us some really good information. I'm actually sitting here thinking about all the time I spend on mine. It's really nice that I was able to feel the effect of what you had to say. That was really interesting, and I'm really glad we could have you on the podcast today.

Dr. Lipschultz:

And I like social media as much as anybody, but once in a while, pick up a book, talk to a friend, have dinner, all of those things. Yeah.

Cali Carritt:

Perfect. Well, thank you so much for coming on. You were a great asset. I feel like our listeners are going to learn a lot, and that's all I have for you.

Dr. Lipschultz:

Will do. Would you send me a link to the podcast? Because I will share it on social media. And if you could send me a link to your recording file, I like to archive those in my archive as well. It's been a pleasure speaking with you and getting to meet you.

Cali Carritt:

Yeah, thank you so much. What was your Twitter handle again, for those listeners out there who might not have caught it?

Dr. Lipschultz:

At Jeremy H L, that's J-E-R-E-M-Y-H-L, and that is for Twitter and Instagram. On Facebook, I'm Professor Jeremy Lipschultz, and on LinkedIn I'm Jeremy Lipschultz. So I'm easy to find, and I'm always happy to connect with new folks on social media channels.

Cali Carritt:

Perfect. I've got those written down, and I'll put them in our episode description as well for our listeners.

Dr. Lipschultz:

What platform are you using to share the podcast?

Cali Carritt:

We are using Buzzsprout as our main platform right now. Apple Podcasts is a little bit finicky at the moment, because I had to submit it and it's under review, but it is on Spotify also. So it's on Spotify and Buzzsprout.

Dr. Lipschultz:

Spotify. What's it called on Spotify?

Cali Carritt:

Info Window. I can also send you the link; I can send that all in an email.

Dr. Lipschultz:

That'd be great, because I am a big Spotify fan as well.

Cali Carritt:

I was glad I got it on Spotify. Apple Podcasts is being a little mean about it, so I've just been able to get it on Spotify, at least.

Dr. Lipschultz:

Podcasts are really interesting too, because I mentioned my computer-mediated communication class. We were doing work with podcasts back in 2004 in that class, and people didn't widely listen to podcasts back then, so I kind of quit doing that for a while. Then all of a sudden, my daughter was going to school in Chicago and had a fair amount of commute time and really got into podcasts. It turned me on to the fact that, particularly in large cities where people spend a lot of time on trains and buses, podcasts really have made an important contribution.

Cali Carritt:

I definitely agree. I listen to them a lot more than I used to, so that's really nice.

Dr. Lipschultz:

Great. Well, good luck to you on your project. I think it's a great idea.

Phillip Stelling:

Thank you. Yeah, thanks again for being on.

Dr. Lipschultz:

Thank you, Phillip.

Cali Carritt:

All right. I am Cali Carritt, and here with us today is Calli Carlson. Calli, would you tell us a little bit about yourself?

Calli Carlson:

Hi, so I'm Calli Carlson. I'm part of the team that's putting this together, and I'm excited to be here.

Cali Carritt:

Sure. Which team are you on, specifically?

Calli Carlson:

Specifically, I'm on the implementation team. So we help write the scripts and just make sure everything gets done, I guess.

Cali Carritt:

Perfect, we love to see that. Well, we'll just start off right away. We have a little statistic here that our research team got us: 29% of Americans believe that social media sites are most responsible for the spread of fake news. Do you agree or disagree with this? Just tell us your opinion on that.

Calli Carlson:

I mean, I agree half-heartedly. I don't think all social media sites are responsible for it; it's kind of hard to really pin it down. Social media wasn't created for this, but it has escalated into it, and I think people need to be more aware of that, and there need to be repercussions because of it.

Cali Carritt:

Yeah, I definitely agree with that.

Phillip Stelling:

This is a little bit of a topic change here, but a lot of people will sometimes post satirical posts, mostly on Instagram where I see them. Some people will post activism posts, posts with words to raise awareness, and then others will go and make a joke or satire out of that. When do you think those funny posts become something that can be considered fake news?

Calli Carlson:

See, and that's such a hard line, because, you know, it's one thing to make jokes like SNL, where they make jokes about the news. But I think it's about staying updated on what's real and what's really going on, and making sure you're not making a joke out of things that aren't happening. And if you do make a joke, maybe be like, hey, it's funny, but you know it isn't true.

Phillip Stelling :

Yeah. So adding that extra, you know, extra reassurance behind it. Um, we also have a statistic that says that 10% of Americans have knowingly shared a fake news story. Do you find that surprising at all?

Calli Carlson:

A little bit? I mean, I don't know why people would really want to share fake news on purpose. I'm sure there always is that small percent, but I feel like 10% is more than you'd think. You know, you think of everyone in America, and 10% is a lot.

Phillip Stelling :

Well, I mean, even if you think about it just from a numbers perspective, the COVID-19 virus has affected only about 3% of the American population, and that's still a huge number too. Do you have any thoughts on why people would share fake news knowingly, even when they know it's not true?

Calli Carlson:

Maybe because they think it's funny, you know, seeing the repercussions of it: they spread it around, and then people are talking about it. Like, just today I was watching a TikTok live and everyone was saying Drake is dead. So it got me, and I looked it up, because I'm like, oh my God, why is everyone saying that? It obviously wasn't true. But yeah, it can get anyone.

Phillip Stelling :

No, for sure. I think we've all definitely been there and had that situation. Um, do you have any tips for listeners on how to avoid that, or how to avoid spreading that fake news themselves?

Calli Carlson:

Maybe just stay informed before you share. Like, even if you're not questioning it, just look it up to be sure. Um, and I know it's hard to do; you know, you want to share it right away if you think it's important and you want to get it out there.

Cali Carritt :

Yeah. And kind of staying with that topic, what do you want to see done as far as preventing the spread of fake news?

Calli Carlson:

Yeah. And I think, going back to the social media sites, for them to take more responsibility for what they need to do to get it, you know, a little bit more under control. Um, I did see one day that Instagram had a pop-up saying, like, hey, this could be fake news, be careful. And I think having those implemented more would help. Or even with the AI, like how, you know, you like one post and it generates many more posts, maybe somehow code it better. But it's hard to code it when people are biased themselves. So there is no perfect answer.

Cali Carritt :

Yeah. Kind of sticking with the AI that you mentioned, do you think that AI is working against us and what we're trying to do in stopping the spread of fake news and misinformation?

Calli Carlson:

For sure. I mean, when I think about AI, I just think of myself, like the example of, I love cats, so I like cat pictures, and it goes, oh, you want to see more. I love that it does that for us. But thinking about, oh, the fake news or the bad things that people like, and then they get more of it, and it just, you know, escalates, that's definitely working against us. And I mean, I didn't even think about that really before this class, to be honest.

Cali Carritt :

Yeah. No, I definitely think that the class has been pretty eye-opening, in the sense that we kind of do it to ourselves. Um, I think it's been really interesting to see how it's progressed, especially with what we've been learning.

Phillip Stelling :

I guess let's stick with the AI. Do you think there's anything those technology companies could do to better their algorithms, to stop these situations that end up happening now where people fall into holes of, you know, maybe even radicalism, because of fake news or misinformation, or just the algorithm continually showing them the same thing? What things do you think those companies could do to try to fix those algorithms?

Calli Carlson:

To be honest, I don't really have a great answer. I'm not a tech genius out here. Um, the first thing that came to mind, which I'm sure they do, is having a really diverse team working on the AI, trying to get it so it's more diverse, more, um, accepting. Um, yeah, no, I really don't know.

Cali Carritt :

That is a tough question. I'm not a tech guru myself, so I definitely understand that. So the last little question we have for you is: what do you think we, as people, can do to prevent the spread of fake news?

Calli Carlson:

I mean, as I've said many times, just stay informed. I can't say I follow the news amazingly, and that's on me; I need to. Um, but yeah, just staying informed and watching out for the signs. Um, I know we'll talk about the signs in a different episode. Staying informed on the signs can really help, so that way you can identify it more easily.

Cali Carritt :

Yeah, for sure. All right. Any other questions, Phillip?

Phillip Stelling :

Not that I can think of.

Cali Carritt :

All right. Well, thank you, Calli, for joining us today. We greatly appreciate you coming on this episode, and hopefully we can see you in some of our future episodes.

Calli Carlson:

Yes. Thanks for having me.

Phillip Stelling :

In our next episode, we will be talking about how to talk to your parents and older generations about fake news. We hope you can join us next week on Info Window. Don't forget to follow us on Instagram at info window UNO. And if you have any questions or want to be included in any of our future episodes, go ahead and email [email protected] Thanks for listening, and we'll see you next week. And remember: don't be part of the problem, be a part of the solution.