
MEDIASCAPE: Insights From Digital Changemakers
Join hosts Joseph Itaya and Anika Jackson as they dive into conversations with leaders and changemakers shaping the future of digital media. Each episode explores the frontier of multimedia, artificial intelligence, marketing, branding, and communication, spotlighting how emerging digital trends and technologies are transforming industries across the globe.
MEDIASCAPE is proudly sponsored by USC Annenberg’s Master of Science in Digital Media Management (MSDMM) program. This online master’s program is designed to prepare practitioners to understand the evolving media landscape, make data-driven and ethical decisions, and build a more equitable future by leading diverse teams with the technical, artistic, analytical, and production skills needed to create engaging content and technologies for the global marketplace. Learn more or apply today at https://dmm.usc.edu.
Beyond the Binary: Why Your Survey Results Are Probably Wrong
When you ask "Mexican food for dinner?" you're engaging in the most basic form of survey methodology—but as Dr. Philip Garland reveals, there's a profound science to questioning that shapes everything from presidential elections to product development.
Dr. Garland's remarkable journey began with a personal mission. As a freshman experiencing racism at the University of Washington, he abandoned engineering aspirations to study political science and communications, determined to understand how media shapes perceptions. This path led through Stanford, groundbreaking dissertation research on racial attitudes, and ultimately to transforming SurveyMonkey from a basic tool into a sophisticated research platform.
The difference between amateur and professional questioning is striking. Simply adding "if any" to "How many movies have you seen this month?" dramatically changes responses by normalizing zero as an acceptable answer. This precision matters enormously: Dr. Garland controversially argues that the widely-used Net Promoter Score is "not worth the paper it's printed on" because fundamentally different data patterns can yield identical scores, potentially driving businesses to identical conclusions from vastly different customer feedback.
During the 2012 presidential election, Dr. Garland's team correctly predicted 48 of 50 states, helping legitimize online polling by leveraging SurveyMonkey's massive reach. Looking ahead, he sees AI dramatically improving survey efficiency while still requiring human responses: "No amount of data simulation can replace asking real human beings real questions."
Perhaps most concerning is how media fragmentation affects our political landscape. When people only encounter views reinforcing existing beliefs, meaningful dialogue becomes impossible. For entrepreneurs, proper survey methodology helps determine whether you're solving a problem for five people or five million—critical information before investing significant resources.
Ready to harness the power of data-driven decision making? Dive deeper into digital media strategy with USC Annenberg's MS in Digital Media Management program.
Welcome to Mediascape: Insights From Digital Changemakers, a speaker series and podcast brought to you by USC Annenberg's Digital Media Management program. Join us as we unlock the secrets to success in an increasingly digital world. Hi everybody, and thank you very much for joining us this week, as we are joined by a very dear friend and colleague, a friend of the program who started off as an industry advisor for Mediascape. Today I'd like to introduce Dr. Philip Garland, who is a leading survey methodologist and data scientist. He's worked in multiple industries, with a special focus on politics, culture and higher education, and he's been doing that in multiple spheres and in multiple ways for as long as I've known him, which is, at this point, about what, 25, 26 years? So, Phil, thanks so much for being with us today.
Speaker 2:Pleasure to be here. Thanks for having me.
Speaker 1:I want to jump right into it and ask you about your journey through education and the evolution that happened: it began with communication and then evolved in some pretty interesting ways. Could you take us through that to begin with? Then we'll start to talk about your career, and then some pro tips that you have around surveys and data.
Speaker 2:Yeah. So I like to describe my career journey as kind of a happy accident, and the only reason that happy accident was possible is because I let myself be led by what I was good at and what I was passionate about. The very beginning of the story starts with a college freshman who had just moved to Seattle from Los Angeles, where his high school was extremely diverse, with lots of multiculturalism, to a place that was more homogeneous. I came in thinking that I would follow in the footsteps of my fathers: my stepfather is an engineer for Boeing, and my dad is an airline captain. So I thought aeronautical engineering, or maybe computer science, which at the time, in 1998, was just a budding field compared to what it is today. And when I showed up on campus, I unfortunately experienced a great deal of racism, racial bias and prejudice, and it was in those moments that I decided I wanted to study that, I wanted to impact that. So I switched to political science and communication, because something about our mediated interactions was playing a role in how people thought of each other, how people thought of me. The most poignant illustration of this is that at almost every party or social gathering I went to, people would say, hey, Phil, nice to meet you, what sport do you play? So I knew it was a matter of exposure and education for those folks, and I really wanted to study that and impact that. So that started the journey into political science.
Speaker 2:I was fortunate to have some incredible mentors, some professors who, from the very first moment we met, were interested in investing in this kid. And so I stuck to their sides. I took every opportunity that they offered me. I hung around their office hours, I dropped into their offices when I was walking by, and I'd spend a few minutes sharing what was happening in my life and what I wanted to do. And, as professors often do, they were receptive to anyone who has the gumption and the chutzpah to go and do something like that, and they took me under their wings, and that made all the difference. Especially at a large institution like the University of Washington, with tens of thousands of students, it can be hard to get that kind of one-on-one, personalized interaction. So I just signed up for their classes and all the rest that I could having to do with American politics, race politics, political communication, mass media effects on society and so forth, and not only had a wonderful experience, I started getting better grades. It was really just something that was fulfilling for me, meaning I was good at it, I enjoyed it and I was rewarded for it, and that kind of positive feedback cycle is the kind of thing that can keep propelling someone to be great.
Speaker 2:So at the end of college I thought, you know, I wanted to get into politics, and so I applied to law school. I only applied to three schools, I didn't study for the LSAT as much as I should have, and I got into my third choice. I mentioned this to my main advisor, David Domke, in the spring of my last year of college, and I had mentioned to him in passing that my mom said, because I was 20 at the time and about to finish school, that I should apply to a master's program and, you know, buy myself two years of time, so that I wouldn't finish law school as a 23-year-old, which would probably make me hard to hire. Like, who wants to hire a 23-year-old? He said, your mom's right, take the GRE; if you get above 550 a section, we'll give you a full ride for a master's. I got the scores, and then in the fall of 2001, 9/11 happened, and as a result we got three peer-reviewed academic publications concerning 9/11 out right away, and another book chapter. So at this point I'm just turning 21, with four publications, and still intending to go to law school.
Speaker 2:And so in the second year of my master's program, a political science professor named Adam Simon and I were in the elevator together. At this point I had lectured in our largest lecture hall, five or six hundred students; I had hosted a freshman interest group project; I had spoken in front of 3,000 freshmen in the basketball arena. So folks had kind of heard about this young kid who had the publications, who happened to be African-American. And he was asking, kind of in a leading sort of way, where are you going to get a PhD? And I said, I'm going to go to law school. And he's like, law school? You should get a PhD from Stanford. And that put the idea in my head. That was kind of like happy accident number two: going to Stanford.
Speaker 2:But you can really only understand the level of intelligence, and also the work ethic, that folks have at a place like that once you're there. Their need for cognition, their effort, their thirst for information, education and exposure is almost unquenchable. And when you get exposed to something like that, it raises the bar for yourself. At every university we all sort of teach the same seminal works from the same pioneering professors, the cornerstones of whatever field we're in, so that's all standardized for the most part. But what happens is your peers start exposing you to higher levels of achievement, output, production, effort and so on. I would say that was the biggest impact on my life: the other grad students, and even the undergrads I was a TA for, showed me a new level of greatness. My chair, my advisor for the PhD, was Jon Krosnick. Jon Krosnick has received the Lifetime Achievement Award from the American Association for Public Opinion Research, and I would say he's on the Mount Rushmore of scholars in survey methodology.
Speaker 2:Now, I think sometimes when I tell people about my background in survey methods, they're kind of puzzled that you would need a PhD for something as simple and straightforward as asking questions. Right? We ask questions every day. Hey, do you want to get some Mexican food for dinner? That's a question.
Speaker 2:Our life is built around asking other folks questions, but there is absolutely a science to it in terms of question wording, the number of response options, sampling, and then the applied statistical analyses to interpret the data. My focus was on political attitudes and, within that, to some degree racial attitudes, or intergroup attitudes: the attitudes that people hold toward one another on the basis of some characteristic, race generally being one of them, gender, different characteristics. So this was the melding of my interests dating back to undergrad at the University of Washington in race, race politics and racial attitudes in America, and it equipped me with a tangible skill set that ultimately led to a career in the survey industry. The trouble, though, is that this industry has existed separate and apart from academic scholarship for so long that it can be hard to convey the amount of information and technical detail in something like this when, again, people think question asking is intuitive, something we all do.
Speaker 2:So it's nice to be equipped with this skill set, but the challenge in the corporate world is really educating others, and their clients, constituents and customers, that there are differences that make a difference in terms of how we ask questions and how we analyze the responses to those questions. So definitely, Professor Krosnick and the others on my committee really equipped me for that journey.
Speaker 1:All right, fantastic. And what was your dissertation about?
Speaker 2:My dissertation studies asked people to evaluate a set of music lyrics, except that I told them the artist who produced the lyrics was either black or white, and was either a rap artist or a rock artist. The prevailing scholarship and literature of the previous 50 years concerning racial attitudes and prejudice would predict that the lyrics would be perceived as most offensive when attributed to the black artist, the rap genre, or both: that is, the black rap artist. So that was the hypothesis going in, and what we found was almost the exact opposite: the white rap artist and the black rock artist drew the top two most-offensive ratings. What that meant from a theoretical perspective is that when you cross over into a genre or a cultural domain that isn't owned by your group, there's a kind of punishment effect. This flew in the face of decades of scholarship and was a surprise to pioneers and titans in the field, who told me that this was an exceptional discovery, essentially because it really helped drill down and crystallize what the drivers are, what the mechanisms are, for racism.
Speaker 2:So when you think about racial strife and the civil rights movement, right, it ultimately meant that people perceived black folks as encroaching on white domains. Oh, you want to integrate schools? Schools are for white people. You want to ride on this bus? The front of the bus is for white people. So it starts with this conception, or this misconception, that some groups don't belong in certain places, and then we have this kind of primitive fight-or-flight aversion to things that are out of place. And when you think about all of the advancements for women, minorities and other groups, it's because they were trying to enter into a space where they previously were not.
Speaker 1:That gets into ecosystem politics and a whole bunch of other things, along with what we've discussed in the past, that idea of culture fit versus culture additive. But maybe we can circle back to that a little later. Very, very interesting, Phil. Before we jump into one of the early real successes in your career, I want to ask about survey methodology. To you this might seem like a foolish question, because you've lived in this world, in this realm, for so long, but some others might be wondering about the significance of survey methodology. We can understand why surveys matter; it's important to survey people, and to survey large groups of people. But the concept of survey methodology: why is it so significant? Can we talk about what it is and why it matters so much?
Speaker 2:Yeah. So the word methodology is key here, because methodology, as opposed to methods, means the science of the methods. That means we're experimenting with the methods to understand what differences are made by subtle changes in the methods. One of my favorite examples of this was a study that Professor Krosnick and I did in graduate school with some other folks, having to do with a simple question asking people how many movies they'd seen in the past month. Pretty straightforward, right?
Speaker 2:You can imagine who might be interested in a question like that: maybe some big production studios or movie theaters. Netflix, right? Formerly Blockbuster. There are lots of people that might care about how many movies people are seeing in a month. So half the sample was asked, how many movies have you seen in the last month? And the other half, how many movies have you gone out to? You can see that gone out to should be a lower number than seen, since you could see a movie in passing or see only part of a movie, right? And it's true: gone out to did yield a lower number of movies than seen. So that was pretty basic.
Speaker 2:But within those bifurcated halves, we asked seven different ways. One version was: during the last month, how many movies have you seen, if any? That if any depresses the number, because it tells the respondent that it's okay to say zero. And some of what we were getting at is this concept that subtle changes in the language can produce statistically significant differences in the results. Another version is a lengthy, wordy one: some people never see movies, some people see movies all the time.
Speaker 2:How about you? How many movies have you seen in the last month? Right, so you're again normalizing that zero movies, or very few movies, is a perfectly acceptable answer. And when you take the multiple iterations of a question as simple as movie watching and apply them to something a lot more complex, where people have low information, low experience and low starting points for whatever the topic is, those subtle differences make a huge difference in the outcomes and the results that we see.
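[Editor's note: a minimal sketch of how the split-ballot design Dr. Garland describes could be set up and summarized. The response model, the size of the "if any" effect, and all numbers are invented for illustration; only the design itself, random assignment to one of two wordings and then a comparison of the distributions, comes from the episode.]

```python
# Split-ballot wording experiment, in the spirit of the movies study.
# All data here are fabricated; only the design is from the episode.
import random
from statistics import mean

random.seed(42)

def simulate_respondent(version: str) -> int:
    """Hypothetical response model: we assume the 'if any' wording
    normalizes zero and shifts some answers down. Pure illustration."""
    count = random.choice([0, 0, 1, 1, 2, 3, 4])
    if version == "if_any" and random.random() < 0.3:
        count = max(0, count - 1)  # assumed depressive effect of "if any"
    return count

# Random assignment: each respondent gets exactly one wording.
responses = {"plain": [], "if_any": []}
for _ in range(1000):
    version = random.choice(["plain", "if_any"])
    responses[version].append(simulate_respondent(version))

for version, values in responses.items():
    zero_rate = sum(v == 0 for v in values) / len(values)
    print(f"{version:7s} mean={mean(values):.2f} zero-rate={zero_rate:.1%}")
```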
Speaker 1:I have so many follow-on questions I want to ask, and one of them I'm just going to ask, even though I know we need to get into some of our other things: the danger of binary thinking, of living in this world of black and white, male and female, Republican and Democrat, all of these different things. Especially as we get into the sophistication of survey methodology, and then survey responses, and how to weave through all of that, here in 2025 we oddly seem to be retreating from multidimensional thinking back into these binary ways of thinking. I just want to throw that to you, because I know it's something that matters a lot to you as it relates to surveys and collecting data: this idea of a binary approach to data collection, and then data absorption, data research strategy, et cetera.
Speaker 2:Yeah, in terms of binaries, there's actually a very durable finding in survey methodology called acquiescence response bias. Most people, and in certain cultures this is more drastic than in others, have a tendency to be polite, agreeable and respectful toward, say, a researcher, or a person in a lab coat, or someone in a position of authority. So if you ask someone, during the last month, have you seen a movie, yes or no, you're going to get an inflation of yeses, compared to a numerical version where zero is no movies and anything above zero counts as a yes. If you were to collapse the numbers into a binary, you would get different results, because you've given people the opportunity, or a clue, that saying no is okay. So yes/no questions in the survey world are extremely problematic: you're going to get an inflation of yeses that isn't valid and doesn't reflect the underlying phenomenon that you're trying to study.
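[Editor's note: a toy illustration of the point about binaries, with invented numbers. A numeric question can always be collapsed into a yes/no afterward, but a question asked directly as yes/no cannot be un-inflated once acquiescence has pushed answers toward "yes."]

```python
# Toy illustration of acquiescence bias; all answers below are invented.
# Collapsing a numeric count preserves the real rate; asking yes/no directly
# inflates the yeses, per the episode.
numeric_answers = [0, 0, 0, 1, 2, 0, 3, 0, 1, 0]           # hypothetical counts
collapsed_yes = sum(n > 0 for n in numeric_answers)         # 4 of 10 are "yes"

direct_yes_no = ["yes", "yes", "no", "yes", "yes", "yes",   # hypothetical answers
                 "yes", "no", "yes", "no"]                  # to a direct yes/no ask
direct_yes = direct_yes_no.count("yes")                     # 7 of 10 say "yes"

print(f"collapsed from counts: {collapsed_yes}/10 yes")
print(f"asked as yes/no:       {direct_yes}/10 yes (inflated by acquiescence)")
```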
Speaker 1:Interesting. All right. So anybody who has been alive and working in business in the last couple of decades has probably stumbled into SurveyMonkey a time or two, either designing a survey or responding to one. It's one of the most ubiquitous technologies that we have, and you were heading up survey methodology at that organization, which is really impressive; I remember when you got that job and I was floored. How did your work there influence the product and the business during your time at SurveyMonkey?
Speaker 2:Yeah, so when I joined in 2009, the business was actually 10 years old at that point, but it was still pretty rudimentary in terms of its capability to handle the sophisticated questionnaires that would be produced by academics and folks from the world I had come from. If you think about it in today's terms, it was a little more than Google Forms: you can write a question, you can get some radio buttons or some response options, and that's about it. But a sophisticated questionnaire has lots of elements to it, like experimental manipulations, where someone is going to get version A of a question, like the movies question, and someone else is going to get version B, so it needs to randomly assign people to a version. You need skip logic: if your gender or gender identification is male, we're going to skip the questions about pregnancy, right, so that we're not giving you a long questionnaire that doesn't apply to you. You sometimes need to change the order of the response options, because in a written questionnaire there's what's called a primacy effect, meaning people read top to bottom, or left to right, and stop when they find a satisfactory answer. That phenomenon of not trying too hard, or giving a BS answer, is called satisficing, and there are elements you can insert into the questionnaire design to mitigate satisficing, essentially.
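[Editor's note: a sketch of the three questionnaire features just described: random assignment to a question version, skip logic, and response-option rotation to blunt primacy effects. The questions and flow are hypothetical, not SurveyMonkey's actual implementation.]

```python
# Sketch of the questionnaire features described above. Hypothetical
# questions and routing; not SurveyMonkey's actual engine.
import random

def build_questionnaire(respondent_gender: str) -> list[str]:
    questions = []

    # 1) Experimental manipulation: randomly assign version A or B.
    questions.append(random.choice([
        "During the last month, how many movies have you seen?",         # A
        "During the last month, how many movies have you seen, if any?"  # B
    ]))

    # 2) Skip logic: only route applicable respondents to a question.
    if respondent_gender != "male":
        questions.append("Have you been pregnant in the last five years?")

    # 3) Rotate response options so no option always benefits from primacy.
    options = ["Very satisfied", "Satisfied", "Neutral",
               "Dissatisfied", "Very dissatisfied"]
    if random.random() < 0.5:
        options.reverse()  # half of respondents see the scale reversed
    questions.append("How satisfied were you? " + " / ".join(options))

    return questions

print(build_questionnaire("female"))
```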
Speaker 2:So step number one for SurveyMonkey at the time I joined, which again was 10 years old but only about 30 people, was to create a tool that everyone could use: your DIY parent group, your dentist, but also a very well-trained survey researcher. That way we essentially expanded the addressable market from novices all the way to experts. Step one was the tool, because everything downstream that we wanted to do with the business depended on drawing in a more sophisticated customer base. The second objective was to create what's called an access panel, or a double opt-in access panel, which is when people voluntarily sign up to take surveys for some sort of incentive or reward: could be cash, could be a gift card, could be points to buy an object, could be sweepstakes, so many things. This was a fairly mature market at the time, although it was undergoing some consolidation, because it happens to be a market with the structure of a monopsony, meaning many sellers, one buyer. That creates a commoditized business where everyone's trying to sell their survey responses for the cheapest price, so the margins are really tough, which means that any element of inefficiency is very costly to the providers or producers of online survey interviews. For example, if a bunch of people quit your survey, are you paying incentives to those folks who didn't finish? If a bunch of people are satisficing, straight-lining, typing abracadabra, mucking up the data and creating a bunch of noise, you have to replace them, right? So the art and the science in the panel industry is to get the largest and most diverse pool possible and to have them take the surveys regularly and take them seriously, which is a lot harder than it sounds.
Speaker 2:So SurveyMonkey wanted to enter this business because its DIY survey customers might want to test a new logo or a color, so we have this experimental manipulation; or understand the landscape of a new business venture; or do competitive analysis. There are tons of reasons why its existing survey creators would need a pool of respondents to supplement that survey work, without having to go stand on the street corner with a clipboard, and at the time there was really no social media to post on. So it's like: where do I get people, and preferably a representative group of people, so that I know what all Americans think, or all Californians, or all of a certain group? It has to be generalizable, it has to be valid, it has to be reliable. So we set out to build this business, and we did. Fortunately, we were doing around 2 million survey responses a day at that time, so at the end of their PTA or dentist or optometrist survey, we simply asked respondents if they wanted to stick around and sign up to take more surveys, and a substantial number of people did. We were able to build the largest pool of people in the industry in about six months, and it became a very significant portion of our revenue, because we had this existing base of survey creators.
Speaker 2:But the third piece related to the second, which is proving that the pool of respondents you have is representative, of America essentially in this context. Fortunately, there is what I like to call the Survey Olympics every four years, and it's called an election. There's a right or wrong answer, and you can submit your horse into the race to see how representative your panel is; to the extent that it can produce valid predictions of an election, that suggests you have a very healthy, hygienic, suitable and usable panel. So in the 2012 presidential election we did just that, but with a slight twist, because election polling is very expensive. You'll see the sample sizes for these election studies generally around 1,500, sometimes 750, because that yields a very predictable margin of error at a predictable confidence level. Beyond about 1,500, there are diminishing returns to collecting more people: it's really expensive for very small gains in the reduction of the margin of error.
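[Editor's note: the textbook margin-of-error formula sits behind the 750 and 1,500 sample sizes cited here. The sketch below assumes simple random sampling, the usual worst case p = 0.5, and a 95% confidence level; the diminishing returns he describes fall straight out of the square root.]

```python
# Margin of error for a simple random sample, worst case p = 0.5,
# 95% confidence (z = 1.96): MOE = z * sqrt(p * (1 - p) / n).
from math import sqrt

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    return z * sqrt(p * (1 - p) / n)

for n in (750, 1500, 3000, 6000):
    print(f"n={n:5d}  MOE = +/- {margin_of_error(n):.1%}")
# n=750 gives about +/-3.6% and n=1500 about +/-2.5%; doubling again to
# 3000 only shaves it to ~1.8%, the diminishing returns he describes.
```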
Speaker 2:Our advantage, at 2 million surveys a day, was that we were touching about 90 million households, just shy of half at that point, every 90 days on SurveyMonkey, again because we're DIY for churches and dentists and all the things; everyone has some place where someone was doing a DIY survey. So we had great coverage, which is the term we use for people's ability to make it into a sample at all. Traditionally coverage had come first from face-to-face interviewing and then from telephone, and by then internet access was virtually ubiquitous, with 90 to 95% internet penetration in the United States. So the argument about coverage is: hey, we think we have coverage similar to that of a phone or face-to-face approach. There's a ton of debate about this which I won't go down into the weeds on, but ultimately the idea is that we had great coverage. So the next question is: what are the response rates?
Speaker 2:In 2012, the response rates for telephone surveys were about 9%: really about 14 or 15% for landline telephones, if anyone remembers what those were, and about 5% for mobile phones, because people don't like to answer their phones, and it used what were then expensive cell phone minutes. The law also mandated that you could not use an auto-dialer for cell phones, meaning a human had to dial the numbers, which made it more expensive still. So it was really expensive to do election polling, and almost impossible to do it repeatedly at the state level. You'll see things like the Des Moines Register doing polling for Iowa, and doing it infrequently, even though Iowa is a very critical state in the primary season. SurveyMonkey had the opportunity to do all 50 states very cheaply, because we had tens of thousands, if not hundreds of thousands, of people in each state.
Speaker 2:So we did about 1.2 million survey responses in the 12 weeks leading up to the presidential election.
Speaker 2:We didn't do too much in places like California, where the outcome was obviously predictable, but in the 10 or 11 battleground states where elections are won and lost, we were doing surveys daily and producing results weekly leading up to the election. And we got 48 states right. The two that we missed were kind of a rounding error on the margin of error, which critics of online polling love to jump on. But all pollsters have dozens, if not a hundred, models for weighting the data before them, and ultimately the art meets the science: you have to pick one of them and say that's your submission into the Survey Olympics. We certainly had models where those two states were correct, but maybe they flipped some other states incorrectly, and you can't really cherry-pick; you want to apply your weighting scheme evenly. So ultimately we're very proud of that. We were featured in the New York Times because of it, and I think it set the stage for the legitimacy of online polling thereafter, which is very commonplace these days.
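[Editor's note: a minimal sketch of one simple member of the weighting family mentioned here: post-stratification, or cell weighting, where each cell's weight is its population share divided by its sample share. The cells, targets, and poll numbers are invented, and real election weighting models are far more elaborate.]

```python
# Post-stratification (cell weighting): weight = population share / sample
# share. Cells and percentages below are invented for illustration.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}  # e.g. census targets
sample_share     = {"18-34": 0.20, "35-54": 0.35, "55+": 0.45}  # who actually responded

weights = {cell: population_share[cell] / sample_share[cell]
           for cell in population_share}

# A weighted estimate up-weights the underrepresented young cell:
candidate_support = {"18-34": 0.60, "35-54": 0.50, "55+": 0.42}  # invented poll result
raw      = sum(sample_share[c] * candidate_support[c] for c in weights)
weighted = sum(sample_share[c] * weights[c] * candidate_support[c] for c in weights)
print(f"raw estimate: {raw:.1%}, weighted estimate: {weighted:.1%}")
```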
Speaker 1:All right, fantastic. It's pretty amazing that you were involved in something that groundbreaking, and congratulations. We think a lot about how, especially within higher ed, so much of what we teach and share, and this will dovetail into our discussion about AI a little later, is actually a synthesis of what has been shared with us. When there's the opportunity to create a specialization such that you're able to make a contribution to a body of knowledge that's new, and not just a synthesis of what's come before, that's remarkable. Switching gears into a little more of what you do now: there are three questions I want to ask to get us started, and we'll take them one by one. The first: could you share a pro tip for surveys, and also the biggest mistakes that people sometimes make?
Speaker 2:I'll say they're one and the same here, and this may ruffle some feathers in some circles, but the ubiquitous and very commonly used Net Promoter Score, or NPS, is not worth the paper it's printed on. It will lead organizations to wrong and wrong-headed decisions. If you'd like, I can show you a diagram of this, but ultimately I can show you three different distributions of data that would lead you to very different business conclusions and business strategies. One kind of looks like the Grand Canyon, where you've got data piled at the very ends, a polarity at the extremes and nothing in the middle. You can have one with a skew to the left, where folks are clumped on the higher end. And you can have one that's shifted toward the middle a bit, but still roughly on a normal curve. All three of those distributions will give you the same NPS score. So how can we make a decision as a business when extreme differences in variance and dispersion yield the same number? That's why I say: just use the tried and true mean, median, standard deviation and frequencies of your distribution to make sensible decisions. We don't need a magic bullet.
Speaker 2:This one question is somehow going to solve all of our problems? From a question-wording perspective, it's also double-barreled, because it asks whether you'd recommend to a friend or a colleague. There are certainly products that I would recommend to my friends and not my colleagues. I won't go into what those sensitive things might be, could be athlete's foot, I don't know, but the point is that friends and colleagues are different groups, and they should be asked about separately. And it's on an 11-point scale; we suggest five- or seven-point scales that are fully labeled, and NPS doesn't label all the points on its scale. There's a paper by Professor Krosnick, Daniel Schneider and some others that goes into this; you can easily find it on the web if you search for Krosnick net promoter score. It's a lengthy paper that explains why NPS is problematic, but the summary is: you can get the same number from very different data, and that's not the point of data.
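[Editor's note: the same-score claim is easy to verify. NPS labels scores of 9-10 promoters and 0-6 detractors and subtracts the detractor share from the promoter share. The three invented distributions below, roughly matching the shapes Dr. Garland describes, differ wildly in mean and dispersion yet produce an identical score.]

```python
# Verifying the NPS critique with three invented distributions of 100
# responses each on the 0-10 scale. NPS = %promoters(9-10) - %detractors(0-6).
from statistics import mean, stdev

def nps(scores):
    promoters  = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

grand_canyon = [0] * 40 + [10] * 60                       # polarity at the ends
clumped_high = [6] * 10 + [8] * 60 + [10] * 30            # most folks near the top
middling     = [5] * 15 + [7] * 25 + [8] * 25 + [9] * 35  # shifted to the middle

for name, dist in [("grand_canyon", grand_canyon),
                   ("clumped_high", clumped_high),
                   ("middling", middling)]:
    print(f"{name:13s} NPS={nps(dist):+.0f} mean={mean(dist):.2f} sd={stdev(dist):.2f}")
# All three print NPS=+20 despite very different means and dispersion,
# which is exactly why he says to look at the full distribution instead.
```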
Speaker 1:All right, second question: AI. It's on everybody's mind these days. How is AI going to impact the survey industry?
Speaker 2:Good question. There's a lot of menial work in data and survey methodology before you get to the big aha, the finding, everything you're looking for. There's a ton of data cleaning, right? There are outliers, there are mistakes in data entry, there are satisficers. So there's a ton of value in what AI can do to clean data and get it prepared, coded, standardized and in a position to be analyzed. That takes probably 80% of the time in a data process, so AI is definitely going to speed that part of the methodology up and remove some of the tedious work involved.
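[Editor's note: the kind of cleaning pass described above, written by hand in pandas. The column names, the grid of items, and the plausibility thresholds are all hypothetical.]

```python
# A manual version of the cleaning work he says AI now accelerates:
# flag straight-lining satisficers and impossible outliers, then drop them.
import pandas as pd

df = pd.DataFrame({
    "q1": [4, 5, 3, 5, 5], "q2": [3, 5, 4, 5, 2], "q3": [4, 5, 2, 5, 1],
    "age": [34, 29, 41, 222, 37],  # 222 is a data-entry mistake
})
grid = ["q1", "q2", "q3"]

# Satisficers who straight-line the grid give the same answer on every item.
df["straight_liner"] = df[grid].nunique(axis=1) == 1

# Flag impossible values with a simple range check (threshold is assumed).
df["bad_age"] = ~df["age"].between(18, 110)

clean = df[~df["straight_liner"] & ~df["bad_age"]]
print(f"kept {len(clean)} of {len(df)} respondents")
```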
Speaker 2:I think the second part is the analysis. There have been statistical software packages dating back decades: most people in this era remember SPSS, there's R, there's Stata, and most recently there's Python. These are all scripting languages for undertaking the statistical test, the regression, whatever the applied stats method at hand is. That used to require a lot of code, where now you can literally just describe the desired test to ChatGPT or any other such program and it can do it for you.
Speaker 2:I just tried it the other day. I put in my dissertation data and gave it some very minimal commands, and it was very good at it; that data was obviously already cleaned and ready to go. So for the cleaning part and the analysis part, AI is going to be instrumental in saving everybody a bunch of time, headache and hassle on the parts of the work that aren't as interesting. Where I think AI cannot have an impact is if you want to survey 1,500 Americans about the election: no amount of data simulation can do that for you. You literally still have to ask questions of individual human beings to get their opinions. That part, I think, is safe from AI. The asking-real-people-real-questions part can't be simulated, so it's insulated to some degree, but the rest will be sped up and made more efficient in the interim.
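[Editor's note: a sketch of the kind of scripted analysis he describes delegating to a chatbot: a minimal OLS regression using statsmodels. The two variables and their values are an invented toy dataset, not his dissertation data.]

```python
# Minimal OLS regression of the sort that once required hand-written
# scripts in SPSS, R, Stata, or Python. Toy, invented data.
import statsmodels.api as sm

hours_online = [1, 2, 3, 4, 5, 6, 7, 8]   # hypothetical predictor
movies_seen  = [0, 1, 1, 2, 2, 3, 3, 4]   # hypothetical outcome

X = sm.add_constant(hours_online)          # adds the intercept term
model = sm.OLS(movies_seen, X).fit()
print(model.summary())                     # coefficients, p-values, R-squared
```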
Speaker 1:And as we think about politics, and since you're teaching in our digital media program, where we think about new media and all of its new forms and platforms: how is new media affecting politics within this world of surveys and data?
Speaker 2:Yeah, I would say far and away the biggest factor affecting our politics is the bifurcation, the isolation and the tailoring of media: your feed, your friends' feeds, even just Fox News versus CNN or CNBC, whatever you like. This kind of me-centered media consumption was an idea pioneered 30 years ago by Cass Sunstein at Harvard, and the problem is that when you only seek out, or are only served, information that corroborates and reinforces your opinion, it is very hard to undertake politics, which is itself a competition for scarce resources.
Speaker 2:But if the information about that scarcity or those resources is completely different, there's no way to have a conversation about it. That's why you've seen more polarization, more tribalism, more separation. Siloing, exactly, because that is the structure of our media landscape. Back when it was ABC, CBS, NBC, Dan Rather, Tom Brokaw and Peter Jennings, they were largely working from the same set of guardrails, the same education, the same institutions, the same kind of historical perspective.
Speaker 2:But now the 15-second TikTok about the latest conspiracy theory is filling that void and has taken the place of those standardized options. And those aren't vetted, those aren't questioned. Unfortunately, our brains have not evolved as fast as the technology, so we're very poor at discerning fact from opinion, truth from falsehood and so forth. We're compounding what is a legacy, primitive brain with a technology that, under Moore's law, is doubling every couple of years. I wish I had a brilliant solution to get us all back on the same page, but that's the biggest effect on our politics: more tribalism, more separation, more angst, more polarization, more anger, more hurt, more sorrow.
Speaker 1:Just to comment on that: I have a filmmaking and photography background, and we talk about a frame, the frame of the photo that you pick, right? Then there's that expression, what's your frame of reference? People think, oh, that's just the context I'm coming from, the perspective I have. But if you think about what it really means, a frame means you're able to see only this much, and when a photographer or cinematographer points their camera at something, they pick what's in that frame. Mostly, what matters is what you leave out. So when you talk about a frame of reference in a media context, even if we're talking about algorithms: algorithms are created by people who have explicit and implicit agendas and biases, often, or almost always, driven in some way by money, resources, stock prices, or just trying to hold on to their jobs. A frame of reference is really important, and really scary, when you consider that what's being left out is far more than what's being put inside the frame. For anybody who's listening, and for all of our students at DMM: we talk about that a whole lot, of course, here at the Annenberg School for Communication and Journalism.
Speaker 1:I want to ask you just a couple more things, Phil, and one of them has to do with your background as an entrepreneur. You've been a founder of multiple companies, you've been on the founding teams of others, and you've consulted with multiple startups. So many entrepreneurs, when they're thinking about starting a new company and developing a new product, start from gut and instinct: I know that I would use this if it were around.
Speaker 1:Here's this problem that I've seen within my world, my sphere of experience and influence, and it's very much I-would-use-it, or it's very emotional, and you ask a few friends. But your approach is much less qualitative and much more quantitative. So can you talk about the quantitative side of being an entrepreneur and how that has influenced the development of products? Or even how you decide whether to hit go on an idea that you think has merit, before you put a whole lot of time into it and try to build a company around it. Qualitative versus quantitative, in your opinion and in your background as an entrepreneur.
Speaker 2:Step one, a venerable VC once told me, is understanding what friction you're trying to solve. Sometimes you have a gut instinct about friction because there's some friction in your own life; eBay was famously started by someone whose wife wanted to be able to trade PEZ dispensers, right? There was friction in getting those collector's items out to the market, so it starts with a personalized friction. But the second and most important step is to really quantify the opportunity, the addressable market, for that friction. How much of that friction exists in the world, and how much of it can you solve, for how many people? That's where survey methodology and these double opt-in access panels can come in and be helpful, because you can really get a sense of: is this a problem that only I and a few other people have, or is this a problem that everyone has? Is this a problem we have every day, or once a year? When you start to answer those questions, you can understand what your price should be, what your product should be, and then what your product-market fit is, meaning how well your product addresses that friction. I think it's a very underutilized step in the entrepreneurial process: people could basically collect a survey from 1,500 people and understand their market, or their desired market.
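[Editor's note: a sketch of the market-sizing step described here: estimating the incidence of a friction from a 1,500-person survey and putting a confidence interval around it before projecting to a population. The counts and the population figure are invented.]

```python
# Market sizing from a survey: estimate what share of people have the
# friction, with a 95% confidence interval. All counts are invented.
from math import sqrt

n = 1500
reported_problem = 180                    # hypothetical: 180 of 1,500 have it

p = reported_problem / n                  # 12% incidence
moe = 1.96 * sqrt(p * (1 - p) / n)        # normal-approximation interval
print(f"incidence: {p:.1%} +/- {moe:.1%}")

# Projected to, say, 260M US adults, that brackets the addressable market:
adults = 260_000_000
print(f"addressable: {(p - moe) * adults:,.0f} to {(p + moe) * adults:,.0f} people")
```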
Speaker 2:At the same time, there's Henry Ford's famous quote: if I had asked my customers what they wanted, they would have said faster horses. So there's this pull, because sometimes you're asking people about something they don't yet know they want. I mean, we all had flip phones when smartphones came out, and Steve Jobs himself famously didn't want to do the iPhone at first, right? So it's always going to be an art and a science to some degree. But you can help yourself a great deal by at least understanding how many people you're going to have to educate about this new technology that you think they need. Either way, you're going to learn something about how to educate them, or how to shape the product so that it fits their friction. Ultimately, survey methodology, asking a target audience of people who could be your customers, or just the general population, is going to get you a long way in understanding the friction, the amount of friction, and how you're going to solve it.
Speaker 1:It reminds me of something I learned through the lean startup method, which was pioneered by Harvard Business School. One of the big questions was: as an entrepreneur, you need to ask yourself, first and foremost, not can we build this product, but should we build this product. Not can we bring this to market, but should we bring it to market. Is there product-market fit, before we go spend a whole bunch of time, money and resources on something that probably shouldn't be developed in the first place, even if it feels like a wonderful idea? Okay, last question, the one I always love to ask our wonderful and esteemed guests. It's your opportunity to share one piece of advice. It could be about life, it could be about professionalism or academia, but if you have one piece of advice, a core value or a North Star that guides you, something we could gain some enlightenment from, I'd be really grateful to hear it.
Speaker 2:Yeah. Dating back to my happy, accidental journey of a career, what I learned and try to impart to folks is: whatever level of effort you think is your max, there's a gear and a level past that, for yourself and certainly in the world. Meaning, someone out there is doing your level and then some, maybe not even breaking a sweat, because they've already pushed past your highest level and are trying to push past their own, which creates an even bigger gap than you think. It's hard to fathom that there's another level when you're exhausted, when you're busy, when you have a ton of things going on in your life, but it's not hard for someone else to fathom, and those are the folks you're competing against.
Speaker 2:Michael Jordan used to say he practiced so hard because he knew Larry Bird was getting up extra shots. In more modern times, I've heard that Steph Curry makes 250 swishes from three after practice. So: already tired from practice, already exhausted, ready to go home, and there's another level past that, and it shows in the output on the court. Life is kind of like that. You will get out of it what you put into it, but what you think you're putting into it probably isn't the most on earth, because you've never seen the most on earth. You kind of have to imagine it. You have this boogeyman, this fictional character out there who's getting 300 swishes after practice. That person probably exists, and if they don't, you'll be that person when you hit it.
Speaker 1:So push yourself harder than you thought was ever possible and you have a chance at greatness.
Speaker 1:Phil, I want to thank you for sharing these insights as you've gone through your journey.
Speaker 1:I like to think of middle age not as over the hill but as being in our primes, and I say our because we're both right in our mid-forties now. So thank you for sharing where you are in your prime, the prime of your career. You've definitely got some scars and battle wounds, and you've learned a lot through ups and downs, but I'm real proud to say I've been able to watch your trajectory. And there's a piece of advice in what you started your talk with today: the importance of the people in your life, and how people came along, cared about you, invested in you, mentored you. Now it's fantastic to see you doing the same thing for so many others.
Speaker 1:So I want to thank you for that, and for anybody who's listening, I put that charge on you as well, especially our students: think about how you can make contributions to the people around you, some of whom you might not even know, because when you impact their lives, you can have a multiplier effect out in the world. Thank you, Dr. Garland, for joining us today; I look forward to spending a lot more time together over the summer as we get into some of the new classes you'll be teaching. Thanks, everybody, for joining us on this week's episode of Mediascape, and Dr. Garland, I hope you'll come and visit with us again soon. To learn more about the Master of Science in Digital Media Management program, visit us on the web at dmm.usc.edu.