Perpetual Novelty

WE AIN'T SEEN NOTHING YET. Perry Chen & Walter Isaacson.

January 20, 2021 · Perry Chen · Season 1, Episode 1

Walter Isaacson joins Perry Chen to consider the question: "How do we navigate a time of immense technological change?" 

Walter Isaacson is an American author, journalist, and professor. His books include Steve Jobs; Einstein: His Life and Universe; and the forthcoming The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race.

Walter Isaacson:

All right, well, I'm ready.

Perry Chen:

All right, let's do it. Okay, so I'll start with this: once an innovation is possible, it becomes inevitable. I think that if you think back to when physicists first realized the energy within atoms, of course they immediately started thinking about how that energy could be released. And so there was this kind of inevitability to their pursuit of it. At some point there's a realization that it's possible, even if they don't yet know how to do it, that just with enough effort, it's possible. So I'm wondering if you would agree with that line of thinking: that once an innovation is possible, it becomes inevitable that it is invented.

Walter Isaacson:

I think that it's inevitable that once something is theoretically possible, we're eventually going to get to that innovation. The real question becomes: will we have the policy and moral fortitude to say, we can make this innovation, but we're not going to use it? The most obvious example is the atom bomb. It was pretty clear beginning in the early 1930s that you could have fission, that you could release the energy of an atom, that Einstein's theories were correct, that E equals mc squared. And so by 1945, we create an atom bomb, and I guess it was inevitable we were going to use it. However, one of the surprising things has been that after we invented it, after the innovation happened, and even after we had used it, we stepped back and said, maybe this is an innovation we should never use again, meaning the use of nuclear power as a weapon during war. There are very few instances of that, where an innovation comes along and we decide, hey, let's put the brakes on it. I think the most interesting one we're going to face in the near term is genetic editing. As you know, in 2018 CRISPR was used to create designer babies: twin babies born in China who had one of their genes removed so that they were no longer susceptible to catching the virus that causes AIDS. Well, you could use gene editing technology in the near future, in the next 10 years it'll probably be ready, to do all sorts of edits to our babies: to make them taller or stronger, or give them blue eyes or brown eyes or whatever color you want, maybe even increase their memory capacity or their mental processing capacity. And that is an innovation that is possible, and it's probably inevitable that we will learn how to do it. The real question for me is: is it inevitable that we will commercialize that technology, make it available for individual use, make it so that parents 10 or 15 years from now can design all the traits of their baby? I mean, you're having a baby soon. And the way it's been done for hundreds of thousands of years is that the assets and traits of that baby are assigned by a random natural lottery. Well, we have an innovation where that's not necessarily going to be true for your child's children. And the question is: is it inevitable that we will start hacking the human genome and hacking the traits of evolution?
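To make the scale Walter is gesturing at concrete, here is a rough back-of-envelope using E = mc² (the one-gram figure is illustrative, not from the conversation):

```latex
% Energy locked in one gram of mass, via E = mc^2:
\[
E = mc^2 = (10^{-3}\,\mathrm{kg}) \times (3\times10^{8}\,\mathrm{m/s})^2
         = 9\times10^{13}\ \mathrm{J}
\]
% At about 4.184 x 10^12 J per kiloton of TNT, that is roughly 21 kilotons,
% on the order of the yield of the 1945 bombs.
```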

Perry Chen:

I think that you've set that up perfectly, because I've thought as well about the example of the atom bomb and nuclear power and nuclear weapons. And when I think about that, what I also wonder is: how much do the characteristics of a technology matter in its ability to be regulated? Meaning, it ain't so easy to make an atom bomb, or something that uses nuclear power, to date. The resources are obviously very scarce; it's very resource intensive. Whereas other things, let's say deepfake technology that gets perfected to the point where it's just a free app on your phone, are very, very leaky by nature. So I wonder if maybe some technologies are more able to be regulated simply by the characteristics of how challenging they are to make and deploy, where others may be near impossible to contain, even if we wanted to.

Walter Isaacson:

It's very important to distinguish between those technologies that we can easily control by, you know, locking them in the barn, and those that are almost impossible to control because anybody can do them, as you say. An atom bomb, and nuclear weaponry generally, is somewhat easy to control: we can have inspections, and a kid can't make a nuclear bomb in the dorm room or the basement. There was some leakage when India and Pakistan both developed a bomb, where nuclear regulators and international inspectors were kind of surprised that it happened, and clearly we're watching the process play out in Iran at the moment. But generally speaking, it's the type of thing you can try to guard against, the spread of nuclear weapons technology, because it's a complicated technology: you need to have fissionable material, you need to have all of the equipment to make a bomb. And that's allowed us, over the past, well, I guess it's been more than 75 years since we used the bomb in 1945, to keep the genie from getting too much out of the bottle. There are other technologies, as you say, that there's just no way you can control, deepfake media being one. Anybody with a good computer will, within a few years, be able to make video that is very hard to detect as fake, just as they can right now make Photoshops of still pictures that are very hard to detect. Somewhere in between is gene editing technology. I was writing my book that comes out in March, called The Code Breaker, about the people who invented CRISPR technology, and I went to Jennifer Doudna, one of the inventors. I went to her lab in Berkeley, and within two days I had edited a human gene; I had edited human cells to change the genes. Now, I, you know, mixed it up with chlorine and flushed it down a sink. But it was something that I realized that, with the help of a couple of graduate students in biology, I could do on my own: make genetic edits. And that means that it'll be more up to individuals, they will be empowered to do this thing, and governments and society will not be able to control it as easily. So in some ways, we're going to have to figure out systems for controlling things that we can't padlock the way we do when we padlock a nuclear facility and have it inspected. But that's something we do in all of society. I mean, you can't stop people from shoplifting, let's say, or from, you know, killing pets and animals. But we have societal rules, and we have shaming, and we have laws, and it means you can stop 99% of it.

Perry Chen:

Since you've been spending so much time on this subject the past few years, I'm curious whether you have an inclination as to how something like this plays out specifically, like how it could plausibly be regulated in a way that you'd think would be functional. Or does it become one of those things where, the way the rise of the internet era produced these kind of quasi-religious mantras like "information wants to be free" that sound good, maybe, as something becomes harder to control, we almost rationalize the adoption of it, because you start to realize there's nothing you're going to be able to do and people are doing it anyway. I don't know if a picture has started to form in your mind, as you've spent time with this, of what you think 20 years or so from now it's really going to be like.

Walter Isaacson:

Well, you are an internet innovator and pioneer, and you know that in the early days of the internet there was that mantra you talk about, information wants to be free, and the Electronic Frontier Foundation and people like that wanted to make sure there were no real controls over the internet. In some ways that turned out, in my mind, to be very healthy, because it created a robust growth of the internet. But there were some dark sides to it, including the fact that it put most local newspapers in a precarious position or out of business, because information was supposed to be free. And it allowed the spread of misinformation, and social media platforms like Facebook took no responsibility for people putting up conspiracy theories and misinformation. That became dangerous. We did not develop rules of the road to say that on the internet you have to follow the same general rules we follow in real life, which is: if you libel somebody or lie, you can be held accountable, or if you take somebody else's intellectual property and share it, that's considered theft. Now, the question is, can you create rules of the road for other technologies? I think with the technology we've been talking about, genetic editing, yeah, you can. There is a philosophy, just like the old internet libertarian philosophy, that people should control their own bodies. And that will be one of the mantras: if I want to design my children, you have no right to tell me I can't edit their genes, I can't make them stronger, bigger, more intelligent; it's my life, my body, my family, and the government should keep its hands off of it. However, there are probably reasons not to allow this type of technology to be totally subject to individual choice, one of which is that it would reduce the diversity of our species, which is dangerous. The other is that it would encode inequalities, where people who are wealthy enough to buy higher IQ points for their kids will be able to do so, and you'll almost create different subspecies, with the wealthy buying better genes for their kids. Then the inequality is not only the way it is now, but it's encoded into our genes and becomes part of what our species is, and those are bad things. So I think it's useful to try to develop some rules of the road, even though it will be difficult.

Perry Chen:

I think what I'm trying to come to terms with is identifying the difference between the rationale, which may be very sound, which you just pointed out, for why unfettered adoption of such technologies could be against societal interest, and then moving from that, even if there's political will, to actually being able to create some sort of enforcement. Obviously it's to be determined, but I've started to believe that there's going to be a lot of leaky stuff. So there may be frictions we can create, depending on how easy something is to make, distribute, and deploy. But then I start to wonder if it's really just a subgroup of things that are really arduous to create, like an atomic bomb or nuclear technology, and most things, especially those that play out in the digital space, we can tell are going to get to a point where somebody is going to be able to do them pretty cheaply, learn pretty quickly, or pick up the cheap tools to do them. Because, as I think you're alluding to, it's a global world. People are ready to go to other jurisdictions to do all kinds of medical procedures, or to get pharmaceutical drugs that have different kinds of laws around them. So genetic editing tourism is obvious, right? And so it kind of ends up being one of these things where, if you can't get enough global control, and I don't know what our history is around that other than, say, nuclear weapons, then in essence it is kind of leaky. And at some point you maybe start to have to make it available in the jurisdictions that wanted to withhold it, because withholding is perpetuating the unfairness: it's only the people that can fly to Belize that can do it, but they're doing it in droves, or in other countries where it's just part of the national strategy, maybe, to an effect.

Walter Isaacson:

Right. Medical tourism and genetic tourism make it a lot harder to say we're going to have a regulation in the United States against editing your children too much, or creating clones of yourself, or things like that. However, even for simpler technologies, such as pharmaceuticals or drugs, we're able to have a pretty good regime that has, say, 95% control of things. You mentioned people fly overseas or go to Caribbean islands in order to get different types of treatments that are banned in the United States. But that's not all that common. For most people, if a drug is not approved by the FDA, they just can't use that drug. And even though you could possibly fly somewhere else, that's a small bit of leakage compared to saying everybody can do whatever they want, and put out any quackery or any drug cure they want, and sell it on the internet without any restrictions. So we're able to have restrictions, even though those restrictions can be slightly leaky by the fact that other countries might have looser ones. But if you look at the internet as a counterexample, there are just certain things that were architected into the internet when it was first built that make it incredibly difficult to put too many controls on it. First of all, it's a totally distributed, decentralized system, where every node on the internet has equal power to store and forward and create bits of information, so you don't have any central hub or gatekeepers that can stop the flow of bits around the internet. Secondly, when they created the protocols, the TCP/IP protocols for the internet and for packet-switched networks, it meant that every packet had encoded upon it what the address was and where it was supposed to go, but it's not verified where it comes from. So people can pretend to be coming from somewhere else; you can have trolls in Russia that are pretending to be voters in Michigan. And it's very hard to pierce the veil of anonymity, simply because of the technology that was used to build the internet. That's why the internet is particularly hard to control. But drugs, or medicines, or medical procedures, or genetic editing, is a little bit less hard to control, and nuclear weapons even less than that. So it sort of depends on how the technology is distributed, and whether it's something that anybody can do anonymously. The internet is an extreme example of a totally distributed technology that can operate with almost total anonymity if somebody wants it to.
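A minimal sketch, in Python, of the property Walter describes. This illustrates the idea rather than real TCP/IP code, and the addresses and names are made up: a packet is routed by its destination field, while its source field is whatever the sender wrote into it, and nothing on the network verifies it.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    dst: str       # routers actually use this to forward the packet
    src: str       # self-reported by the sender; nothing checks it
    payload: str

def forward(packet: Packet) -> str:
    # A router only needs the destination to do its job; it has no
    # way of knowing whether packet.src is truthful.
    return f"forwarding to {packet.dst} (claimed sender: {packet.src})"

# A sender in one place can simply claim to be somewhere else:
spoofed = Packet(dst="198.51.100.7", src="203.0.113.99", payload="hello")
print(forward(spoofed))
```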

Perry Chen:

Yeah, I like the framework that you're putting in. And I think that with digital stuff, there's also just the replicability at zero or near-zero cost, right? That's inherent in a lot of digital technologies. But my guess is that with genetic engineering, there are also going to be a lot of components that would end up being very, very cheap to produce or to copy. Not the same as digital, but you know.

Walter Isaacson:

Yeah, and then you have to look at the role of social shame. Here's a big difference between, say, genetic editing and trolling on the internet: if you're trolling on the internet and you're pretty careful, you can be pretty sure that you'll be anonymous. In fact, you can create a Twitter account in the next five minutes with a fake name and a fake picture, and probably just start saying the most outrageous things with total anonymity. If you are living in a community, and you're having children, and you're genetically editing your children, or you're going through in vitro fertilization so that you can edit the embryos, it's a lot harder to do that anonymously. Which means that there can be social shaming sanctions, or people will just say, this is generally considered wrong. Just like, you know, nobody can stop you from walking around half naked or something, but you don't do it because of social conventions. So I think we can try to say there are some rules of the road for what types of medical procedures we'll do. I mean, you even see that in helping your kids get into college: at a certain point people cross lines, and they're trying to buy their kids' way into college. Well, it's not just laws that prevent that, even though people have now gone to trial for it; it's the absolute social shame of doing it. Or take, for example, the vaccine for the coronavirus. It's possible that if you have a lot of money or pull or whatever, you could try to get ahead of the line, as some people will, and get your vaccine before healthcare workers and first responders get theirs. But people will probably know you've done it, and breaking in line is something we just don't cotton to. And so I think a lot of the restraints on the use of technology come because people feel there's a shame in doing it in a way that society doesn't approve of. On the internet, you don't have to worry about that, because you can be anonymous. But in the real world, you do have to worry: what will people think of me?

Perry Chen:

So then would you say that if people were able to do it clandestinely, secretly, and feel confident that they would be able to keep it that way, that would then change things?

Walter Isaacson:

Yeah, let's do a thought experiment. Let's suppose that you went to a clinic that allowed you to do genetic editing, and let's suppose that it was like a genetic supermarket: they gave you a list of the traits you could choose. And let's suppose the list begins with, make your child less susceptible to deadly viruses. You'd say, yeah, I'll check that one. And also deadly diseases like sickle cell anemia, or Huntington's, or cystic fibrosis; those are easy, those are single genes, and yeah, you would check those, and nobody would think badly of it. As you go down the list, you'd say, well, I don't want them to be too short, or maybe I don't want them to have, you know, a predisposition to bipolar disorder. You'd check those. And then you'd say, okay, not only not too short, but maybe I want him to be six inches taller than average, and I want them to have blue eyes and blonde hair. You'd probably check that one, too. And then let's say it asked, well, do you want to make sure your kid's gay, or not gay, or other traits? If it were totally anonymous, if nobody would ever know what you did behind that curtain when you were checking off the genes you want for your child, I suspect you would go further in designing your children than you would if you knew your choices would not be anonymous.

Perry Chen:

Yeah, I think so. That sounds like maybe that'll be the fundamental factor in how effective the social regulation you're talking about could be: how confident we'll be that we can do this without people detecting it. And it's funny, I think maybe the history of plastic surgery, or those kinds of surgeries, has had some of this too. There are probably nowadays a lot of procedures that are more subtle, and there are people that maybe would have said, I would never do those things. And part of the reason was, well, one, because they were younger when they said that, and then also maybe because they felt like, look, at the start it's pretty obvious at some level, and you look, you know, very tacky. But then it's like, okay, well, now we have this thing, and nobody can tell. And you probably have a lot of people who before would have said, well, I wouldn't ever want anybody to think I did plastic surgery, but if you can't really tell, well then nobody has to know. So maybe there's a similar arc to that, too?

Walter Isaacson:

Absolutely. You know, there's a big philosophical issue on anonymity, and it goes all the way back to Plato and Plato's Republic. He tells the story of the Ring of Gyges, it's called. And the dialogue is: if you put on the ring, you become anonymous. Nobody knows what you're doing; it's like being invisible. You can go shoplift, you can insult people, whatever it may be, and the Ring of Gyges protects you from anybody seeing or knowing that you've done it. And the debate, with Socrates, or Plato, or whoever, is: would we be less civil if you could put on that ring and be anonymous? I think the internet helps answer that question, as we watch people create anonymous Twitter and even anonymous Facebook accounts, and put out fake news anonymously, or send you emails saying, I'm your aunt, and I'm stuck in Nigeria, and I've lost my passport, please wire me money, and the person is anonymous, so you don't know who it is. All those things have proven that if you're totally guaranteed anonymity, a lot of humans will do things that are uncivil. And I think we used to not have total anonymity. When I was growing up in New Orleans, and you know New Orleans very well, if I went to the corner drugstore and I bought a pack of Marlboros or, you know, a pack of condoms, when I was 15,

Perry Chen:

Your parents are gonna find out.

Walter Isaacson:

It would take 10 minutes before somebody called my mom and said, I didn't know your son smoked, or whatever. And I'd get home and there'd be a glare. You know, people knew what I did. And for the past five or six thousand years of civilization, people lived in communities where you weren't anonymous and you were held accountable for what you did. That notion of anonymity we sometimes confuse with privacy. And your friends and mine, Perry, who are part of this electronic frontier, or internet denizens, they wail on about privacy: privacy, we all need privacy. Well, first of all, too much privacy crosses the line into allowing people to be totally anonymous and do things they'll never be held accountable for.

Perry Chen:

It harms social cohesion and group function.

Walter Isaacson:

Totally. And I think we've become addicted to the notion of privacy, and let it go to such an extreme that it means the same as anonymity. I agree that certain things should be private. But I also think that, like in the real world, privacy doesn't need to be taken to such an extreme that you can do things that are harmful to other people under a veil of anonymity. You know, this is,

Perry Chen:

But prior to the internet, I think we had already moved into this paradigm that you're talking about. You know, we moved out of agriculture into industry, and then into the offices, and there's a lot of stuff you could do in an office where you could maybe work on harmful societal practices that were legal but extractive to people in your community, or in a community thousands of miles away. But you look just like everybody else: you're waking up, you're going to the office, and you come home. Maybe your job is really complicated and people don't understand it, or you don't really have to explain it, because you're like, oh, I work in finance, and then everybody moves on to talk about whatever happened at the ballgame. And so, you know, I've felt this since growing up, growing up in New York City, right, staring at those giant buildings, always wondering, what are those people doing? Because they seem to be doing really well, but I have no sense of what they're doing. And I feel like that's analogous to what you're saying about the internet, too, which is that I think you can have people who are being harmful to the group, and they're cloaked in this anonymity or privacy, because what they do is largely unseen or largely not understood. Would you agree?

Walter Isaacson:

Yeah, I do think that the move toward allowing people, if they wanted to, to operate with more anonymity begins, you know, a couple hundred years ago, as we became less of an agricultural society, and people moved in mass numbers from small towns and communities, where everybody kind of knew who they were, into large cities, where you could be more anonymous if you wanted to. And that had a really good upside for a lot of people, whether it be their sexual preferences or their politics or anything else. If you left, you know, Bogalusa, Louisiana and moved to New York City, you had a little bit more anonymity in your day-to-day life; people did not generally know who you were. That, though, gets amplified globally, a millionfold, when you get to go on the internet, and people truly don't know who you are, if you so choose.

Perry Chen:

I wonder also, you know, I've thought a lot about complexity, in the sense of: what are the costs of complexity? When I think about the financial crisis 12 years ago, in 2007 and 2008, the easy thing to talk about there is maybe the credit default swaps. They were, in my mind, engineered complexity, right? They were designed as complex instruments. And part of what that did is that it's opaque by nature, meaning that a lot of people around it won't necessarily understand it. It's also much harder for the people charged with regulating it to comprehend it: they have to go through the stage of knowing it exists, then comprehending it, and only after that potentially regulating it. It bakes in a longer lead time for society to be able to respond, even if it has the will. And I think there's something about the complexity itself, and the fact that we live in a time of growing complexity, though maybe it's always been like that. I wonder if there's almost a saturation point, and whether we've reached it or are already past it, where part of what we have to think about, when we think about better systems in the future, is not just improvement but also that they need to be simplified. Like Einstein said: as simple as it needs to be, but no simpler, right? Even just using the example of the tax code, and I know there are a million reasons why tax codes have deep problems, I wonder if somebody who could fully understand the tax code would say, look, if you tried to make it a flat tax, it's actually not as fair. But there is a cost to the complexity that needs to be assessed. When something is made super simple, just the fact that the public understands it means that it has value, whereas complex systems require trust, because people don't understand them. And we now live at a time of lower societal trust, and we have all these complex systems, which means you can be suspicious about them, because we don't understand them, or they're largely misunderstood. So maybe the question is really about just this: all of these things we're talking about are kind of added complexity, and I wonder if we can continue to solve the problems of complexity with more complexity.

Walter Isaacson:

I think one of the costs of complexity is that you become alienated from your technology; you don't fully understand it. We see that with vaccines, we see it with our electronic technology, we see it even with the complex systems that shut down a Maytag plant in Keokuk, Iowa, and people lose their jobs because of some global financial system in which hedge funds have acquired companies and globalized them and offshored jobs, or whatever. The financial system is complex, the industrial system is complex, the internet, all of it is complex. And when you feel that you don't fully understand it, that makes you susceptible to conspiracy theories: that there are weird people, from Bill Gates to George Soros, controlling systems that you don't understand. So I think it's important for us to try to understand our systems better, to have a simpler tax code, to have a simpler economic system, and also for people to try to understand biology a little bit better, to know what a virus is and what a gene is. When I was growing up, in our basement in New Orleans, I had a little workshop with a soldering iron, and I made circuits. I used, you know, vacuum tubes, and then transistors and resistors and capacitors, so I could figure out what a circuit did. But once Apple comes along, and personal computers come along, you no longer open up your computer; you have no idea what a circuit is, and it all feels mysterious to you. So I think this alienation and detachment from the complex systems we have to live with makes us less trusting, and possibly more conspiratorial. And one way to address that is to have simpler and more transparent systems, whether they be economic systems or technologies like biotech and infotech.

Perry Chen:

So yeah, I agree. I totally agree. And I'm wondering about the attainability of that. Continuing to use the tax code as an example, it appears that complexity creates this kind of capture, right, where an industry rises up around helping people navigate the complexity, and then of course has no interest in simplicity, and will often fight tooth and nail against it, because, you know, that'll put them out of some work. So there's this capture that occurs around complexity. I wonder if that's endemic, or if there are ways to work around it. Have you seen any? What are your thoughts on that?

Walter Isaacson:

I think that there are people who benefit by making systems complex, and certainly people who, you know, can create hedge funds that will help them make money off of complexity that others don't understand. However, I don't think there's a grand conspiracy to make things more complex than they need to be. I think it just happens through a bureaucratic inertia. You have tax codes, for example, and in any given year legislators decide to add some new facet to the tax code, but nobody ever says, let's throw the whole thing out and start from scratch again and make it simple. And so I think it just naturally happens, like Microsoft, you know, Word. Yeah, exactly: every new version comes out, they add some new functionality, but it makes every menu and every pull-down harder to navigate. How do I just, you know, print in draft mode, or something simple?

Perry Chen:

Yeah, I don't necessarily take the stance that it's Machiavellian. Of course, I'm sure there are many examples of creating complexity to take advantage of it, but I agree with you that it's just the inertia of things as they exist. And so that's what I wonder: putting aside any sort of attribution of agenda, assuming the inertia is inherent, how do you de-complexify?

Walter Isaacson:

Well, I think every now and then somebody comes along like a Steve Jobs and says, wait a minute, simplicity is the ultimate sophistication, something Leonardo da Vinci said as well. Take cell phones: you and I can remember them before the iPhone comes into being in the early 2000s, and entering something into your address book was so complex that you could never get things into your phone; you never knew how to use things. And Steve Jobs just said, we're going to make this radically simpler. He did it first with the iPod. When they were designing the iPod, he said, I want it to be so simple it doesn't need a manual, and within three clicks you can get any song you want. The designers said, well, you know, we need to be able to have the album and the artist and the song, and that's going to take more. He said, no, three clicks, and why the hell do we need all that sort of stuff? And finally they made a very simple scroll wheel, you know, that wonderful thing on the original iPod, where you could just scroll down your songs and find the one you wanted. And when he saw it, he said, yeah, that's the simplicity I want. Then finally he pointed at some button that was on it and said, what the hell is this? They said, well, that's the on-off switch. He said, well, what the hell does it do? And they paused for a second and said, well, Steve, it turns it on and off. He said, well, why the hell do we need it? You don't need a big on-off switch on your iPod; it knows to power down when you've quit using it, and to power back up when you start using it. So a genius like a Steve Jobs could radically cut through complexity and make something simple, and he takes that talent and does it with the iPhone. So there is a market for people who know how to make things less complex, especially if it's a consumer product. The problem is when it's a bureaucratic product, you know, like the tax code. It'll take a very strong politician, maybe, you know, an Andrew Yang or somebody, who will come in someday and say, this is insane; I promise you a tax code where your returns can be done on a postcard, and it will be radically simpler. And maybe then we'll cut through those complexities.

Perry Chen:

Then what about the tension between the two examples that you've talked about? This last one, about the simplicity of the iPod, and Apple's and Steve Jobs's ability, especially in those kind of golden years, to just radically simplify things into something that almost felt like magic. But before that, you were talking about how people used to know how things worked. The car is a great example, where people generally often knew how to fix a bit of their car, but now you open the hood of a car, and it's like a computer system; it's a totally different world. That transition to more sophisticated cars is where people are dealing now with a more complex car, and they're more alienated from it. But I would say there's an argument that the same could be said for the iPod and the iPhone, which is that the interface was simplified, but there's just an exceptional amount of complexity behind that, like there might be in a Tesla as well. So is it really simple, in the sense that it doesn't contribute to the alienation? Or is it just that the interface is simple, which has all the benefits for consumer adoption, which I totally agree with? But then when things go wrong, when there's distrust in all technology companies and things like that, we start to realize: hey, this thing I have in my pocket that I think is simple, I now realize that under the hood, well, I don't know anything about where my information is going. Sure, Tim Cook claims that Apple's the one that cares about privacy, but how the hell do I know? So what do you think about that under-the-hood part? Because I've seen that a lot in design and technology especially: the magic is often that you use a lot of under-the-hood complexity, and you find a way to make it very simple for the user, but it's complex underneath.

Walter Isaacson:

That was a philosophy that Jony Ive, the great design director for Apple, had, and Steve Jobs believed it as well, which is: simplicity is not a surface sheen. In order to have true simplicity, you have to understand the absolute depths of all the complexity within, in order to get to something that feels simple. And that's what graphical user interfaces have done for 60 years, ever since they were done for, you know, air defense systems and video games like Pong. We had simple graphical user interfaces that allowed people to feel that using their computer could be simpler, and that's what Apple does with the desktop metaphor, which becomes Windows as well. But you're right: underneath any user interface that is premised on simplicity is a wealth of complexity that the designers had to understand. And so I think we have to say that we need not only simplicity but true transparency. We have to know what's happening deep inside these systems that have simple interfaces for us. Where is our information being stored? Where does it come from? Where does it go? And it's not in the interest of a lot of companies to have such radical transparency; it's against the business model of Facebook to have that sort of radical transparency, and, for that matter, even of computer makers, you know, for us to know exactly where our information is going. So I think that in order to make us more comfortable with our technology, instead of having these organizations like the Electronic Frontier Foundation that just push for privacy, we should also be pushing for transparency, and also digital literacy, where we say: here's what's under the hood; we want you to understand why there's an algorithm that amplifies this type of tweet or this Facebook post. And maybe you might not want a company that uses those algorithms.

Perry Chen:

What about the possibility that some things are just too inherently complex? Even if people are better educated around a lot of the basics, there is a point, let's say with the automobile, where people could kind of understand their car. And of course it wasn't hooked up to the global internet, so they could trust that nothing was happening that they were unaware of, other than somebody sneaking into their garage. But now, even if I think about being taught by somebody to better understand my laptop or my phone, I still have to rely on a tremendous amount of trust in what they're saying to me. And there are probably several points where, yeah, they say my data is backed up in these highly secure places, and nobody else has access to it, and this is how they do it. But I have to trust that, because I'll never be able to see it with my own eyes. And so it's maybe... I'm sure with vaccines it's very similar, right? It seems like a very similar kind of proposition.

Walter Isaacson:

I think that's why it's very important that we try to understand our technology, and it's what I've done in most of my books, whether it's writing about Einstein, or writing about Steve Jobs, or writing, as I am now, about genetic engineering and vaccines. There's a certain beauty and comfort that comes from understanding things, especially when those things are ourselves. And if we try a little bit hard, we can understand: how does a cell work? What does DNA do? What does RNA do? How can RNA cause your cells to build proteins that are like the spikes on a coronavirus and therefore kick your immune system into higher gear? Well, that's somewhat complex. But lots of things are complex, and those of us who want to feel more attached to the world around us make the effort to understand that complexity. And then, in my case, I try to make it understandable to other people. There's nothing more complex than Einstein's two theories of 1905, you know, relativity and quantum theory, which he comes up with in his 1905 papers. And that's a period in which the world got hit with a lot of complexity. There was the uncertainty principle; there were people like Stravinsky breaking the bonds of music, and Picasso doing it with art. Modernism comes in, and the world becomes complex, and you have a reaction against it, partly because our science becomes so complex that an ordinary citizen would have trouble understanding it. If you were an ordinary citizen in the time of Ben Franklin and Thomas Jefferson, you could pretty much understand Newton's mechanics, or, you know, gravity, how it works. But understanding Einstein's gravity is difficult, and quantum theory, even Einstein had trouble wrapping his head around. However, I've made it part of my career to say: if you try a little bit, you can understand why a circuit with on-off switches can do logic, and how you can etch such a circuit on a piece of semiconducting material like silicon. And I think we should all, like Leonardo da Vinci, be a lot more curious about what happens in the world around us, because it makes us feel less detached and alienated.
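A small sketch, in Python, of the point about switches doing logic; the switch-as-boolean model is an illustration, not anything from the conversation. One switch arrangement, NAND (output goes low only when both switches are closed), is enough to build every other logic gate:

```python
def nand(a: bool, b: bool) -> bool:
    # Two switches in series pulling the output low: the output is
    # low only when both switches are closed.
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)           # NAND with both inputs tied together

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))     # invert a NAND to get AND

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

# Truth table for a circuit built purely from switch behavior:
for a in (False, True):
    for b in (False, True):
        print(a, b, "AND:", and_(a, b), "OR:", or_(a, b))
```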

Perry Chen:

Yeah, I mean, look, there's nothing to argue with there; it's a two-sided thing, right? There's the work that needs to be done on simplifying things to the right point, but there's also, of course, people being able to get more educated, so they're at least able to grasp these concepts. And I also agree with you that there's a lot more that's attainable there. I guess I'm just reckoning with, thinking back to some of the other examples you gave, like being 15 in New Orleans and going to the store, that there is some tie we have to our perception, our senses: the ability to see something, to smell something, to hear something. That really was inherent in being a human and being part of a group, and so many of these things are abstractions. Now, I think people can start to understand these abstractions more, and with something like gravity, it's happening in front of you, so there's the ability to perceive it, and there must be some explanation, so people are going to believe it's God in the sky, or that there's something behind it. And then also, these things have to work for you, meaning, okay, you're explaining this thing to me, maybe I understand a little, but I see it happen every day, I see my electric car turn on. But what happens in society a lot of times is that things work for some and they don't work for others. And at that point, people seem less likely to want to believe these abstractions, even when they're explained, because it doesn't seem like it's working for them. Like, somebody can try to tell me that the tax code really is very complicated, probably over-complicated, but that it's actually designed that way, and has gotten that way, because of constant attempts to solve it and make it better and more fair. And I know that that's not necessarily true. But even with that argument itself, somebody who feels that they're paying a fair share of taxes, and that they're getting fair benefit from being part of society, is more likely to say, okay, I can believe that. And somebody who feels differently may say, how do I know that's true? So there's this question about the things that we can't see and touch and feel, and our willingness to go along despite the gap in our ability to affirm them for ourselves.

Walter Isaacson:

I think that requires a system where we have leaders and legislators and regulators that we can trust. And sometimes we have that, and sometimes we don't; it's really a function of the political process. Are we electing people who are trying to understand our problems and solve them, and then get the right regulators and experts in? As much as I try, I'm not going to fully understand how Facebook fiddles with my data, for example. But I've got to trust that the FTC, or, you know, the antitrust division of the Justice Department, has experts in it that can regulate these things. Likewise, I know how the new COVID vaccines work in theory: I understand what messenger RNA is, and how it builds a protein in myself that will create an immune system reaction that might protect me. But I don't know whether Moderna and Pfizer have actually tested the chemicals exactly right. I have to trust the FDA and its process, and the CDC. We've lost a lot of that trust in our regulators, and that is both a cause and an effect of people like Donald Trump: we don't trust government anymore, so we like Trump, and then Trump says don't trust your government, and seems to game the regulatory system. And it's really up to each one of us to say, all right, maybe it's time. I think that may be what helped Joe Biden: all right, he's not the most exciting candidate in the world, but he knows how government works. And we actually realize now that we need a Centers for Disease Control that has a real expert leading it, and we need regulators at the FTC who are going to try to figure out things like whether Facebook is violating our privacy. We can't do this ourselves, and it's why we have to have a government we trust. So the main problem is not each of us solving the complexity of biotechnology or information technology; it's solving the problem of democracy. We can all try to elect and support people who we believe are well-intentioned and care about us, and who are going to rely on smart, decent people to figure out what drugs should get approved, what internet regulation should be, and whether a car engine needs to have certain safety requirements.

Perry Chen:

So, following that line of thinking, what do you feel the record has been for liberal democracies, in recent decades at least, of living in this more and more complex world? I mean, do you feel that there are some inherent challenges, even with leaders that we trust, who maybe have the right inclinations and care about the society at large more than their own interests? It may be the best system we have compared to all the other alternatives, but how is it struggling in this extremely complex time?

Walter Isaacson:

Right. Well, that's a good note, a great culminating note for this interview, because it's something art can do; it's something you're going to be doing. It's the role of every humanist, every artist, every writer, every scientist, to say: we're all in this together, and we're here about making real connections. I sometimes think that artists and humanists don't make enough of an effort to understand science, and I think sometimes scientists don't make enough of an effort to open up about things. The same is true of technologists. And so I think there's a role for writers and artists, and, for that matter, innovators, to say: we're going to bring people together, to make us all feel a little bit more connected to the world around us. And so I want to thank you for what you do too, Perry.

Perry Chen:

Thank you, Walter. I've got one last one for you. How about that, is that okay?

Walter Isaacson:

Sure.

Perry Chen:

Here it goes, and I've been very interested in asking you this. What do you see coming in terms of science and technology that maybe isn't what everybody's talking about, like genetic engineering and artificial intelligence, even if it's a recombination of those things? What are you seeing that might be on the horizon, past the ones that people are starting to talk about and see?

Walter Isaacson:

Yes, well, artificial intelligence and genetic editing are the two big ones that we've talked about. But I think there is, in the near term, something that you, having founded Kickstarter, would understand quite well, which is that there is a need and a hunger, and therefore, I think, an inevitability, for a type of social network that will come along that will be unifying and empowering, and help bring us together, and help us create the types of communities we want. It will replace the types of social networks we have now, which, inherently, baked into their engineering and algorithms, are polarizing and divisive and enraging, and incentivize aggressive and non-empathetic behavior. Those had certain business models that created things like Facebook, by allowing people to get all upset and go into their own little filter bubbles and hear the things they want, and then get more and more polarized. But I think, just as the move from Donald Trump to Joe Biden heralded some social sense that maybe we should get back to more boring normalcy for a while, I think on the horizon there will be kids in my class at Tulane who will invent things that are somewhat in the spirit of Kickstarter, which you helped invent, or you did invent, rather than the spirit of Facebook. And they'll say: we can use this internet to do things that not only make our lives better, but make our community and our society better. I think that's the big thing that will come to pass in the next five years: a whole new type of social network that will appeal to the better angels of our nature, as opposed to the demons lurking within us.

Perry Chen:

So it seems like you're talking about the intentionality with which technologies that are kind of already here are deployed, using what we have learned works and doesn't work in the first wave of whatever we've just experienced, right? I'm sure it's quite a natural arc of a technological wave: it just lands as it lands, and then there's the second wave, which is like, okay, now that we've seen what happens if we just throw this thing out there, what do we want to actually do with it, now that we have some sense of what it does well and what it doesn't? Is that right?

Walter Isaacson:

I think that's exactly right, which is connecting the technology with the humanities, which is what you did when you started Kickstarter, and what I think Mark Zuckerberg, back in his innocent days, was trying to do when he wanted to connect the world. So it's not necessarily reinventing TCP/IP packet-switching technology. It's deploying the technologies of the internet, and the understanding of our human nature, to create tools that people will want over the next decade, rather than the polarizing and poisonous tools in social media that we've had in the past decade.

Perry Chen:

Thank you, Walter.