Privacy is the New Celebrity

Ep 1 - Ramez Naam on Drones, AI, and the 4th Amendment

June 30, 2021 MobileCoin Episode 1

In the premiere episode of Privacy is the New Celebrity, MobileCoin founder Josh Goldbard interviews author and technologist Ramez Naam. Ramez is best known as the author of The Nexus Trilogy, but his current focus is clean energy technology with an emphasis on solar energy. Josh and Ramez discuss drones, AI, the 4th amendment, and the relationship between privacy and creativity. They also share stories from the first time they each realized privacy was important, and that one time Ramez got a cult for his birthday.

[00:17] - Speaker 1
Hi, I'm Joshua Goldbard, the founder of MobileCoin, and I'm super excited to introduce the very first episode of Privacy is the New Celebrity. Here's what we're going to do on the show: we sit down with some of the most interesting people we know for conversations at the intersection of tech and privacy. We'll have a rotating cast of hosts and guests, and we're going to bring you smart, forward-looking conversations and stories from our studio right here in San Francisco. For our inaugural episode, we have Ramez Naam. Ramez is an American technologist and science fiction writer.

[00:57] - Speaker 1
You probably know him best from the Nexus trilogy. Ramez, thank you so much for coming on the very first episode of Privacy is the New Celebrity.

[01:05] - Speaker 2
Josh, it's awesome to be here. Cool.

[01:08] - Speaker 1
Can you tell us how you introduce yourself?

[01:10] - Speaker 2
Oh, my gosh. I'm Ramez Naam. I'm a software guy, a computer scientist. I'm a sci-fi author, and I'm a clean energy person. I've written books on climate and energy, and I speak about and invest in climate-change-fighting startups.

[01:24] - Speaker 1
So one of the questions that I always have for people who are technologists is: what is your relationship to technology? The reason I ask is that I have a dear friend who did his PhD in computer science, and he lived in a room without electricity for the entire time he was doing his PhD because he wanted to get away from technology. So I'm just kind of curious, what is your relationship to technology?

[01:45] - Speaker 2
Wow. That's a good question. In the abstract, technology is one of my greatest sources of hope and joy about the future. It's one of the things that gives me hope about climate change, for instance, and maybe about democracy and freedom on planet Earth. But in the very real, personal sense, I struggle to put my phone down at the end of the night and go to sleep. I'm still wired to it. I love the global interconnected communication systems that we have, and I'm still constantly working on having a healthy relationship with them.

[02:15] - Speaker 1
I want to dive a little bit into the relationship that you have to climate, just for a second, before we go into technology. I've been super deeply fascinated with the writing that you've done over the years about solar technology. I was wondering if you could talk very briefly about the progression that you've seen in that world. Yeah.

[02:32] - Speaker 2
I grew up in tech. I spent a bunch of time at Microsoft, and I looked at Moore's Law and so on. Around the late 2000s, I got interested in climate and the environment. I had been sort of a tech optimist, assuming technology would solve this, but I'd never really looked into it. It was actually on a beautiful beach in Mexico that I decided I should look into the state of the environment and the planet and my role. And when I started reading everything I could find in the field, people were saying, oh, it's a huge problem.

[03:01] - Speaker 2
Either we're doomed, or the only way out is degrowth, ending economic growth, or smaller lives. On the other hand, some were saying there is no real problem, when the problem is obviously real. But one of the things that I found was that the cost of solar panels was plunging in a way reminiscent of the cost of computation. We think about Moore's Law in tech, but it turns out Moore's Law is probably just a case of a more general law called Wright's Law, which says that for various technologies, every doubling of scale, of how much you've built or deployed, leads to a certain percentage decline in cost.

[03:38] - Speaker 2
And so the cost of solar panels has dropped by about a factor of 500 from 1975 until now. The cost of electricity from solar systems, which involves other stuff beyond just the panels, like the labor of deployment, the wiring, and so on, has dropped by a factor of somewhere between five and ten in just the last ten years. So it's not as fast as computation, but it's staggeringly fast. It's way faster than anything in what we normally think of as physical infrastructure. And that, and related declines in the cost of batteries, the cost of wind power, the cost of electric vehicles, the cost of making hydrogen one day, gives me hope that we actually can turn this ship around in terms of climate.
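The Wright's Law relationship Ramez describes, where each doubling of cumulative deployment cuts cost by a fixed percentage, can be sketched in a few lines. The 20% learning rate and the starting figures below are illustrative assumptions for the sake of the example, not numbers from the conversation:

```python
import math

def wrights_law_cost(initial_cost, initial_units, cumulative_units, learning_rate):
    """Cost per unit after scaling, where each doubling of cumulative
    units built cuts cost by `learning_rate` (e.g. 0.20 = 20%)."""
    doublings = math.log2(cumulative_units / initial_units)
    return initial_cost * (1 - learning_rate) ** doublings

# One doubling at a 20% learning rate: cost falls to 80% of the start.
print(wrights_law_cost(100.0, 1.0, 2.0, 0.20))       # 80.0

# Twenty doublings compound to roughly a 99% decline in cost.
print(wrights_law_cost(100.0, 1.0, 2.0 ** 20, 0.20))
```

Compounding is the point: at a 20% learning rate, twenty doublings of deployment cut cost by nearly two orders of magnitude, the same mechanism behind the factor-of-500 drop in panel prices cited above.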

[04:20] - Speaker 1
That's really exciting. I think about the miniaturization, and just the availability of energy in the world. We're going to talk about it a little later in the show, but that certainly increases the ability of people to observe other people at a lower cost. If you have free, abundant energy, the cost of manufacturing things becomes ever lower, and making things like drones becomes cheap. Obviously, that has a lot of impact on our ability to observe each other, and that's one of the things that I've spent a lot of time thinking about.

[04:47] - Speaker 1
What I wanted to ask you is this: we think at MobileCoin that the act of creation requires privacy. I was wondering what your relationship to that idea is. We're testing it as sort of a hypothesis, and I'm wondering, do you believe that? Do you think that's real?

[05:04] - Speaker 2
I think it does require privacy. Or at least, maybe it's not individual privacy per se, but I'd say that creation does require privacy if you want it to be good, especially if you're making something that pushes the envelope of culture. Whether we're talking about artists, folks on this podcast, or me as a science fiction author: when you're writing something or painting something or composing music, whatever, you want it to be edgy in some way. And I don't mean that just for stylistic purposes; you want it to actually drive some change in the world.

[05:39] - Speaker 2
You're, by definition, playing with ideas that are at the edge of acceptability, and it's really easy to mess up there. Yeah. And so you've got to have the freedom to experiment and iterate and so on. Before my first novel, Nexus, came out, there were probably about five drafts of that book, and maybe six or seven revisions of the very first couple of chapters. And I still got some things wrong that I wish I could go back and change. But I wouldn't want the whole world to see all of my rough drafts, or all of my communication with people who were reading early copies of the book as I was working on it, because that was me trying to figure it out, right?

[06:24] - Speaker 2
Yeah.

[06:25] - Speaker 1
I think that when we're trying to make people feel, when we're trying to create art that evokes feelings in people, there's a pursuit of novelty, a pursuit of trying to get to something that we haven't scratched at before, that we haven't felt. And when you're working through those new emotions, sometimes they're too raw, sometimes they're too soft. Hitting just the right crescendo and decrescendo in a book is like that. I don't know how you do it if you have to do it in public.

[06:49] - Speaker 2
Yeah, it's very challenging, because you're just going to misstep. If you're trying to get intense emotion out of someone, whether it's great music, great poetry, great movies, great books, whatever, you are going to be doing stuff that pokes people a little bit or exposes them to new ideas that maybe they're not already comfortable with. And it's so easy to mess up a little bit and trigger the opposite of what you wanted. So you have to have some freedom to try that out and have some trial and error before showing people the finished product.

[07:24] - Speaker 1
I think that's absolutely true. And I wonder, when was the first time that you realized privacy mattered to you? I have a short story here about seeing my neighbor read her child's journal and then punish her for its contents. That was the first time that I realized privacy really mattered to me. Wow.

[07:44] - Speaker 2
Oh, my gosh. I'm sure it goes back to my early childhood, but the first thing I think of when you ask the question is that I was a pot smoker in college, and that used to be illegal. Cognitive liberty, right? The freedom to think your own thoughts, the freedom to modify your mental states in whatever way you like: that seems fundamental to me. But we live in a society where you were punished for that sort of stuff. And so I think that's when I fully felt a need for privacy from the state, at least.

[08:20] - Speaker 1
Well, one of the deep ironies that I find in America is that you probably couldn't have had the Boston Tea Party in an absolute surveillance state.

[08:30] - Speaker 2
Yeah. Any rebellion, whether it's breaking the law or not, any act of doing something that's outside the norm, requires some privacy in planning. And certainly political movements, too.

[08:44] - Speaker 1
I would even go a step further and say that change in society requires the possibility of privacy.

[08:49] - Speaker 2
That's absolutely the case, right? Think about how Americans have now largely embraced gay marriage, for instance.

[08:57] - Speaker 1
But could you have gay marriage without gay relationships? Exactly.

[09:00] - Speaker 2
And there was such a long period where it was so anathema. If we had forced everything out in the open and forced everyone out of the closet non-consensually, I don't think that would have been good for progress in society. People fundamentally needed the freedom to do what they were going to do, period. And that's how you integrate these things.

[09:20] - Speaker 1
Yeah. Well, also, the freedom to assemble: that depends on the ability to have a private conversation where you talk about assembling.

[09:30] - Speaker 2
Right. Absolutely.

[09:32] - Speaker 1
Right.

[09:32] - Speaker 2
If you're talking about advocating for some change, you're talking about communicating in some way, pushing the world forward in some way, or even just advocating for your own rights. And you want to do that as more than just you as an individual; you want to do it in collaboration with friends, so you've got to communicate with them. And we see, going back to the FBI wiretapping Martin Luther King, that the phones in every hotel room he stayed in were tapped by the FBI, specifically because J. Edgar Hoover didn't like the civil rights movement and wanted to get dirt on King to discredit him.

[10:10] - Speaker 2
That is a clear indication that the powers that be, whoever is in control, typically have an attachment to the status quo. And if you want to change that, people have got to have the freedom to think new thoughts and communicate about them.

[10:28] - Speaker 1
Yeah. And what's interesting to me is that when we feel the tides of power shift in a society, the people who were the entrenched power structures want privacy now, because they are no longer the entrenched power structure. I think it's just really interesting that in a society where there is change, everybody, at different points, wants privacy or doesn't want privacy.

[10:48] - Speaker 2
That's true. Whenever people feel vulnerable about something, they want privacy. And it's okay to feel vulnerable about something; not everyone needs to know your dietary preferences or your sexual preferences or what you're eating or whatnot. I do think it's interesting: Cory Doctorow has this concept of sort of progressive privacy, privacy for the masses as the default, and more and more requirements for transparency as we give people more power, as we give people governmental authority and so on.

[11:20] - Speaker 1
Well, celebrities have different rights already, right now, as it relates to the use of their name. And I don't want to say slander, but the things that you can say about a celebrity are wildly different from what you can say about a private citizen. And the line between what makes somebody a celebrity or not these days is fuzzy: if you have 1,000 followers on YouTube, are you a celebrity? Right? One of my favorite quotes is actually from Aaron Sittig, one of the early engineers at Facebook, who said, privacy is the new celebrity.

[11:50] - Speaker 2
That's interesting. That's a really interesting quote, actually; it's an inversion I hadn't thought about. And you're right, the line of who's a celebrity is totally fuzzy. I think even more about people in government. When we hand someone power in a legal sense, where they can create laws that affect all of our lives, I think it's not unreasonable to ask that they voluntarily accept that some aspects of privacy are going to be reduced for them, in exchange for being granted those powers and privileges. But even there, I think people deserve a lot of privacy for their personal affairs, separate from their professional affairs.

[12:30] - Speaker 1
Yeah, of course. I just can't imagine a society where you have cameras in your home. And we do have microphones in our homes, increasingly, with home assistants and those kinds of things, right? It's true.

[12:42] - Speaker 2
We do. And I don't have an Alexa in my house. I have slightly mixed feelings about it. It sounds very convenient, until the time that it's not.

[12:52] - Speaker 1
Well, I was encouraged by the fact that Apple's audio assistant data is now no longer leaving the device. You can question whether you believe that or not, because the code isn't open source, but they are stating now that the information isn't leaving the device, which I think is pretty cool.

[13:06] - Speaker 2
And I hope that triggers other vendors to make changes in that direction themselves.

[13:14] - Speaker 1
Well, heaven help us if Apple is the only company in the world that can defend our privacy.

[13:20] - Speaker 2
And Apple is not a perfect defender of privacy, either. They boast about it a lot, but they're not ideal. And at times, Apple can be compelled.

[13:28] - Speaker 1
Right.

[13:28] - Speaker 2
Like in the news today: the Trump DOJ gave subpoenas to Apple that forced Apple to give up data about people in Congress, in addition to journalists at The Washington Post and The New York Times. That's bad enough. But Trump wanted Adam Schiff, chair of the House Intelligence Committee; they wanted to see what he was up to. And not only was Apple forced to give up some metadata, they were gagged. Yeah.

[14:00] - Speaker 1
For a year. Yeah.

[14:00] - Speaker 2
They were forced to intrude on the privacy of journalists at The Washington Post and The New York Times, and of members of Congress. And they were also banned from communicating about that to the victims, I'll call them victims, right? That is a double whammy: the First and Fourth Amendments both being violated in this case, as far as I can tell. Yeah. Absolutely.

[14:22] - Speaker 1
I completely agree with that. So one of the things that I wanted to drill in on was this idea that you need privacy to form narratives. And I want to go back a little bit to solar energy. I know that you have lectured to large institutional organizations in energy about solar and the way that it's changing our world, and I've seen the way that those conversations are happening in public. The narratives that those companies are presenting have shifted over time.

[14:53] - Speaker 1
So I'm wondering, what change happened behind closed doors, and were closed doors a requirement for the change in those narratives?

[15:01] - Speaker 2
I think in many cases they were. Of course, these companies are economically motivated in what they do, and they have internal conversations about how they're going to make this transition, or what's really happening, and so on. If you want to affect them, I think public pressure out in the open is very valuable. But there are real conversations you can have where they're willing to talk about things that they might not be willing to say publicly at that moment. And I've been in closed-door conversations with these companies that were, I think, super valuable.

[15:38] - Speaker 2
Actually, I understand people's desire for transparency, and I value a lot of transparency, too. But when people are talking about a big change, that's a very challenging moment. Thinking about big change is usually very threatening for people in their personal lives, people in their relationships, and people in their corporate situations as well. Change is scary. And so you've got to have some room for people to have those conversations in a way that's less threatening than everything being totally out in the open.

[16:06] - Speaker 1
I guess the point that I want to drill in on is that I think it's very obvious to a lot of people that social change, change in society, requires privacy. What isn't necessarily as obvious is that private change, the change in corporate ideals, the change in any organization, also requires privacy.

[16:25] - Speaker 2
I think it does. Again, I'd say there are multiple avenues for pursuing this change, and I think public pressure and public pushing on things is good. But of course, you can't expect a corporation to have all of its conversations on its future strategy, for instance, posted in real time on Twitter. That is a violation of privacy in the sense that people need some space to iterate on ideas, and to be wrong about ideas as well, and to honestly explore the pros and cons of things in order to make real decisions.

[16:58] - Speaker 2
Yeah.

[16:58] - Speaker 1
I mean, this is kind of why we have IP laws: you can develop things in private, then present them to the public. Let's say that SpaceX had to do open development of rockets. I think there would be a lot of rockets that look like SpaceX's in the world right now.

[17:12] - Speaker 2
Of course. Or any software company. I'm a fan of open source code, but if every line of code ever written had to be open-sourced the moment it was written, I think you'd have a lot less reward for innovation.

[17:25] - Speaker 1
Well, it'd probably be pretty hard to keep keys secret if every key you ever typed into your computer had to be published. But I digress.

[17:32] - Speaker 2
That's fair. Or the certificates that we trust to secure the web, and so on.

[17:40] - Speaker 1
Right. So overall, what do you think is an example of a place or an arena in society today that requires more privacy than we currently have?

[17:53] - Speaker 2
Well, I'd say we're certainly on track for major privacy erosion in a lot of ways, and that's happening in the US. I think the US is actually one of the places that has the best respect for privacy in its laws. But if you look at what's happening in Russia, China, or Iran, you have a situation where governments have gotten more entrenched, less able to be dislodged, and it's harder and harder for people to have real conversations in private about the change that they want to make. China especially is a major tech power, and it's been awesome to see a billion people and more in China rise out of poverty and have a higher quality of life, in large part due to deploying digital tech.

[18:43] - Speaker 2
But it's not easy to disagree with the government in China, or to advocate for change, even in small conversations with your friends. That's a real problem.

[18:57] - Speaker 1
I want to share an experience that a friend of mine had. He was attending an AI panel with a researcher from India, a researcher from America, and a researcher from China. The researcher from China was discussing the fact that in China, they don't really have hit-and-run driving accidents anymore, because they have facial recognition of all the drivers. So if you're in a hit-and-run accident in the city, there are cameras that identify you, and you can't run away from that vehicle.

[19:26] - Speaker 1
And the American researcher said, that's terrible, how do you live in a society like that? And the Indian researcher turned and said, my brother was killed in a hit-and-run driving accident.

[19:37] - Speaker 2
That's powerful. And I think as privacy advocates, we also have to be realistic that there are aspects of surveillance, if it's appropriately limited, that do bring benefits.

[19:53] - Speaker 1
Right.

[19:57] - Speaker 2
If you have a situation where government is well reined in, there are good checks and balances, and tools are only used appropriately, with appropriate authorization and strong evidence to motivate them, or in exceptional circumstances, then I think you would find that CCTVs everywhere, cameras everywhere, even facial recognition, would help reduce crime, for instance. The problem is that once you put those tools in place, it's very easy for them to be abused. It's not that anyone thinks we shouldn't catch hit-and-run drivers, or we shouldn't catch somebody who murders someone on a corner, or we shouldn't catch corrupt politicians.

[20:42] - Speaker 2
Certainly all of those things we should do. It's that once you have all of that data being captured, what prevents the authorities, the government, from using that data not just to look for hit-and-run drivers, but to look for dissidents, or to look for artists who are saying things they don't want said, or to look for people who are disagreeing with the official government propaganda? That's the real danger. And I think we haven't fully wrapped our heads around how to get the benefits of this without those downsides.

[21:15] - Speaker 1
Yeah. I mean, going back to the earlier point: if there's no possibility of committing a crime like smoking weed, you don't get marijuana reform.

[21:22] - Speaker 2
Yeah, that's absolutely right. So my state, Washington State, is the first state to pass a law banning the use of facial recognition by law enforcement, with one exception, which is, I think, the case of child abductions. In every other case, facial recognition will not be allowed to be used by law enforcement here. And I think that's reasonable, actually. Crime is a natural thing, but I worry less about crime than I worry about government overreach, so that seems like an okay balance to strike.

[21:57] - Speaker 2
That's right.

[22:00] - Speaker 1
I sort of have this idea that a society has two boundaries on it: one is absolute privacy, the other is absolute surveillance, and we exist somewhere on that continuum. If you think about a society with absolute privacy, it's not reasonable to have a society where you could wander into any space with whatever's in your bag. It'd be impossible to protect the White House in that scenario, for example. At the other extreme, absolute surveillance, there's no possibility of change. It's a very sterile society; there's not the same kind of innovation.

[22:31] - Speaker 1
And part of the reason that America is one of the most innovative countries in the world, I think, is that we have this possibility of privacy. So my question for you is, where do you think we land on this continuum?

[22:43] - Speaker 2
Good question. It's constantly shifting, but I think the US is in a relatively good spot compared to other parts of the world. I think it's still a challenge, though. I think it's still too easy, technically, to do dragnets and fishing expeditions. Privacy issues are different when you talk about the targeted, exceptional, and well-regulated unmasking or surveillance of someone, where you've got a search warrant based on probable cause to look into someone in whatever way. That's different from being able to do a fishing expedition: show me everyone who has sent an email about smoking weed.

[23:32] - Speaker 1
Right.

[23:33] - Speaker 2
And the US is not in the latter situation; we have largely avoided that, though certainly we saw that the NSA has been collecting a lot of the data to make that possible. I do think, if you want an interesting fictional portrayal of that conflict, of the best possible case you can make for a surveillance society, there's a great sci-fi book called Gnomon, G-N-O-M-O-N, by Nick Harkaway. It's a science fiction novel set in the UK decades from now, not in the super distant future, where he really tries to portray a society that does have massive surveillance but tries to balance it with limited human access to the data being collected.

[24:26] - Speaker 2
It's quite interesting. He posits algorithmic methods and approaches, like randomly creating a decision jury for a very limited time to evaluate whether or not certain data should be looked at, and things like that. And it's sci-fi, and it is sort of dystopian, so not everything goes right. But I think it's worth it, if we want to be privacy advocates, to do the best job we can crafting the alternative narrative, to at least seriously understand what the motivators are and what the case is for surveillance, so that we have the best possible rebuttal to it.

[25:08] - Speaker 2
Yeah.

[25:08] - Speaker 1
And I think it is ethical for a society to want to protect its citizenry. The question is just, to what degree? I think that's always the question: to what degree? Exactly.

[25:18] - Speaker 2
And as you said, it's a balance, right? I think no one would say that we shouldn't have search warrants. If you're talking about the physical world, should the police be able to search your home?

[25:28] - Speaker 1
It's very interesting to think about that, because one of the companies that we really admire at MobileCoin is Signal, and Signal has been subpoenaed multiple times by the government. The only information that they retain is the user's phone number, the account creation date, and the last access date. They have no other information. If you want to see the messages that someone sends on Signal, you really need their phone. And it's an interesting dynamic to think about: spear-fishing, where you need to access a specific user's device, versus dragnetting, where you can just take all the information from the server and, in fact, don't even have to notify a user that their information was taken.

[26:05] - Speaker 2
That's right. And I don't think it was designed with a plan to get to this exact point of balance. But when we have a world where you can't dragnet, where you can get access to data but it takes effort, there's an economic model of preserving privacy: making the cost of penetrating privacy high enough that, while it's possible for government to do when it's appropriate, they have to limit it to the cases where it's most important, and it's not possible to do it algorithmically for everyone on the planet simultaneously.

[26:40] - Speaker 2
And I think that creates some level of balance.

[26:43] - Speaker 1
I guess one of the asymmetries worth exploring is this idea that if you want to look inside somebody's house, you need a search warrant and probable cause. You need a specific instance where there is an interaction.

[26:54] - Speaker 2
Right.

[26:56] - Speaker 1
Whereas with a dragnet, the government subpoenas a server, and the users of that server don't get any sort of notification, right? They don't have any rights, if it's the right kind of subpoena. So it's interesting to think about the reality that if your home is being searched, you have the opportunity to try to reject that warrant. You have the ability to mount a legal case; you can put it in front of the law. But if you're using a server, you lose a lot of those privileges.

[27:22] - Speaker 1
Yeah.

[27:23] - Speaker 2
In Fourth Amendment law, this is talked about as the third-party doctrine, right? If you have a third party holding your papers, going back to the 18th century, or today holding your email, the doctrine says the Fourth Amendment no longer protects that information. And that was clearly crafted by people who existed in the pre-Internet age. Is that obviously written into the Fourth Amendment?

[27:50] - Speaker 1
I mean, maybe not.

[27:50] - Speaker 2
Right.

[27:51] - Speaker 1
Like, maybe the intent carries through from the 18th century to today.

[27:55] - Speaker 2
Well, I think the third-party doctrine should go away. Legally, I would hope that during our lifetime or the next generation, as we get Supreme Court justices who have actually lived in a world where their private papers, their emails and texts and so on, are accessible to a cloud provider, they come to realize in their gut that the spirit of the Fourth Amendment, being secure in your information as well as your possessions against unwarranted search and seizure, obviously should apply to your emails in Gmail.

[28:30] - Speaker 2
Until that happens, though, and we don't know when that'll happen, I think using technology that avoids that challenge, by not having a server that has information about you, is the best that we can do. Right.

[28:48] - Speaker 1
And that's part of why we've tried to design what we call oblivious systems here at MobileCoin. A lot of the technology that we've built is basically intended to protect user information, insofar as the operators of the servers don't know what information is being carried over them.

[29:01] - Speaker 2
It's amazing. It's amazing what you guys have done, what MobileCoin has done, and it's amazing what Signal has done as well. And I think it's just as important for transactions as it is for sending a text or an email. Right.

[29:16] - Speaker 1
Why is that?

[29:18] - Speaker 2
So let's say I want to go to a march, and I want to protest by carrying a sign that I make. I'm going to buy some poster board and a piece of wood and some markers to do this. If the government, let's say, doesn't have the right to squelch my speech but does have the right to squelch my commerce, and I can no longer purchase those things, and I can no longer put gasoline in my vehicle or charge my electric vehicle or pay for a rideshare or whatnot, then by cutting off monetary flows they can effectively stop me from speaking.

[29:57] - Speaker 2
And so, I think there's a big sort of moral ideal on the left that money is not equal to speech. But try existing in society in any way, try a social movement for change, with no ability to exchange money with anybody else, and you will quickly find out that's not realistic.

[30:16] - Speaker 1
Right. We already exist in a society where if you have the wrong metadata, you can lose access to your banking relationships.

[30:22] - Speaker 2
That's absolutely right. Or if you're just deemed socially undesirable. If you're a sex worker or you work in porn, you can be locked out of a lot of financial transactions. If you're a pot shop, you're largely locked out of the banking system.

[30:40] - Speaker 1
Right.

[30:40] - Speaker 2
Like, what is that but an attempt to curtail, to interfere with, the civil liberties and the freedom of those people?

[30:50] - Speaker 1
Yeah. And I think the place where it always becomes really interesting for me is thinking about morality versus protection. On some level, there's a feeling among people in the state that they want to protect youth from access to these kinds of things, that they want to protect society from corruption. And then there's the question of: is that morality, as opposed to security?

[31:14] - Speaker 2
And is that notion of morality not actually about some absolute truths of morality, but just tradition, or what you've grown up with, in some sense? Again, like, being gay was definitely considered wildly immoral for most of American history, and for most of world history. And who are we to say what's moral and immoral if it's consensual and doesn't harm people? I do think society has an interest in things like keeping kids away from drugs. And I think there are some reasonable expectations that it can take measures to do that.

[31:50] - Speaker 2
I think if you're under 18, not being able to buy booze or cigarettes or weed is not totally unusual. There are ways for us to do better on kids and alcohol, but that doesn't mean we have to give the state the ability to see everyone's financial transactions and interfere with them if it wants to.

[32:11] - Speaker 1
Right. So at MobileCoin, we think a lot about imagining alternative futures. What are your aspirations for privacy in the future?

[32:21] - Speaker 2
Oh, my gosh. My aspirations are like the thread that we have to walk, or the tightrope that we've got to walk across. Because, as you said, we talked about drones earlier in this conversation, and the reality is, look, the cost of cameras is going to drop to zero.

[32:37] - Speaker 1
More or less to nothing.

[32:41] - Speaker 2
Basically, the cost of drones is going to drop. The cost of data storage in the cloud, by a state, law enforcement, or an intelligence agency, is dropping to zero. So the real world is that it's going to get easier to collect data about people, and to store that data, and to analyze that data. The cost of facial recognition is dropping too. We have these devices everywhere; we can turn that into information about people. There's also AI that can lip-read from a drone-based camera. All of this stuff.

[33:20] - Speaker 2
And so we have to live in a world where we fundamentally respect the right of people to be private, and where we do two things: we use the law to create real checks and balances on what people can do, and we give people the freedom to use technology to protect their privacy as well. And I think that's the best world we can hope to live in.

[33:48] - Speaker 1
Yeah. I remember doing the math a long time ago, and I haven't done the calculation in a while, but I seem to remember that the cost, in terms of storage, of recording every minute of audio on the public switched telephone network is on the order of, like, single-digit petabytes.
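That back-of-envelope math can be sketched in a few lines of Python. All inputs here are illustrative assumptions made for the sake of the calculation (an 8 kbps narrowband voice codec and ten billion call-minutes per day), not sourced figures:

```python
# Rough storage estimate for recording every minute of PSTN audio.
# CODEC_KBPS and CALL_MINUTES_PER_DAY are assumed, illustrative inputs.
CODEC_KBPS = 8                   # narrowband telephony codec, ~8 kbps
CALL_MINUTES_PER_DAY = 10e9      # assumed daily call-minutes, worldwide

bytes_per_minute = CODEC_KBPS * 1000 / 8 * 60   # 60,000 bytes per minute
total_bytes = CALL_MINUTES_PER_DAY * bytes_per_minute
petabytes = total_bytes / 1e15

print(f"{petabytes:.1f} PB per day of compressed audio")
```

Under these assumptions the total comes out below one petabyte per day, so an archive measured in single-digit petabytes is in the right ballpark.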

[34:05] - Speaker 2
I think it's something like that. I remember, during the Snowden revelations and so on, when we learned that one of the NSA's programs was that they had picked, I think, five countries where they were trying to intercept and store all of the phone conversations, doing that math and getting the petabytes. And even at that time, petabytes was difficult but not impossible to do in the cloud.

[34:27] - Speaker 1
Well, at that time, Facebook was doing like, one to three petabytes a day of photos.

[34:33] - Speaker 2
And now if they want to, it's a lot less than that.

[34:37] - Speaker 1
That's nothing. You can store it in RAM now. But I digress. One thing I'm thinking about a lot is the way in which the cost of drones is dropping just dramatically. And also, the size of drone that you need to do, like, high-resolution video is tiny. You could imagine a drone the size of your fist that has a 4K camera on it.

[34:57] - Speaker 2
It's going to change the world; it already is. At the same time, I think the mechanisms for stopping that surveillance from happening are going to be way less technical and way more legal, which is that you have a right to expect that somebody can't step up to the window of your house and peek in.

[35:18] - Speaker 1
Right.

[35:19] - Speaker 2
That's actually illegal. They can look from the street, but they can't just put a ladder up against your house, come up to your second-floor window, and peek in on you or your kids.

[35:30] - Speaker 1
Right.

[35:31] - Speaker 2
And so we have, I think, a similar reasonable expectation that similar things apply to drones. Now, that's complicated, because today, home ownership or not, private citizens don't necessarily own the airspace above them. If you live in a home, or you live in an apartment building and your landlord owns the property, they don't own the space above it. And that is interesting; it enables a lot of stuff. Who owns it? Nobody does.

[36:05] - Speaker 1
Well, if it's nobody, it's probably the state at that point, right?

[36:08] - Speaker 2
Sort of. It is the state's, basically. I have a drone, so I use my drone, and I'm a little familiar with these laws, though I'm not an expert. The FAA basically regulates stuff below 500 feet, except near airports and in some other situations, like stadiums, the White House, and so on. And so between some distance above your house, which is not very high, and that 500-foot limit where aviation starts to kick in, it's a largely unregulated space. Not completely; it's not completely unregulated. The FAA does have some constraints on drone flying in that space.

[36:48] - Speaker 2
Like, you have to keep the drone in line of sight, and you can only fly in daytime. If you want to do anything beyond that, like nighttime flights, or flights where you can't see the device, or flights over large crowds of people, then you need to get various waivers and so on. But it's actually good that it's relatively unregulated, because otherwise, if I take my drone off and I want to fly it several blocks away, I do have to overfly some people's houses. And if that were all controlled by them, it would be a legal morass.

[37:23] - Speaker 2
Or doing commercial flights over a city, for that matter, would be very legally complicated. But it does mean that it's not impossible that we'll have drones with super-zoom 4K video cameras that can easily peek into open windows. And I don't think the ideal solution for privacy is for everyone to just have their curtains closed all the time, right? Yeah.

[37:48] - Speaker 1
We shouldn't have to hide to live in our society. Yeah.

[37:51] - Speaker 2
So I think it's actually okay to pass laws that constrain, to some extent, private and state operators, to say, okay, there's a limit to what you can do. You can use general video, you can capture stuff. I love to overfly the park near my house, and I get pictures of the trees, and it's lovely and so on. But you can't use it to surveil. You can't use it to deliberately try to get information that people have a reasonable expectation of privacy about. You can't use it to listen in on people's conversations or to track the movements of people from place to place.

[38:24] - Speaker 1
So one of the things I think a lot about is that John Perry Barlow once famously said that technology outpaces law. The rate at which we're developing new tech is much faster than the rate at which we can write new regulation, because we tend to be extremely thoughtful about regulation, and technology is just somebody in the basement banging on circuits until the thing turns on. And so there's a sort of reality of consensus: we don't live in a dictatorship. We don't live in a place where a single person writes the laws.

[38:50] - Speaker 1
We live in a world where we have to get agreement in order to put laws in the books. And so that means that technology will almost necessarily outpace law. How do you think laws should react to that reality?

[39:03] - Speaker 2
I think Barlow is right. And I think, also, in the startup universe, he was talking about innovation happening at the edge of regulation. It often happens in places where regulations are not totally clear, or they haven't yet moved into some new space. And so I'm not a fan of the precautionary principle. I don't think we should immediately regulate or immediately ban everything that's new that we don't understand. I think that's unreasonable. I think we need regulators and lawmakers who are attuned to what's happening in technology, who are paying attention to it, who are not prone to moral panics and don't immediately assume that anything new is bad.

[39:45] - Speaker 2
But who do think about the potential consequences of this. Science fiction is good at exploring the potential consequences and ripple effects and secondary effects of technology. And I think there's a variety of public organizations and think tanks that convene people to think about what's happening in science and tech: how will it affect the world, and what should we be doing? I'm not sure there's a good general answer, other than to pay attention, to not sleep on things, to actually look soberly at the benefits and negatives of things, and mostly not to put tech into a black-or-white box.

[40:29] - Speaker 2
Try to find ways to recognize that experimentation and innovation are actually general societal goods, that basically every technology has some negative side effects, and try to find a way to allow as much innovation as possible while reining in the worst problems. Yeah.

[40:46] - Speaker 1
So how do you think about people getting educated on new technology? Obviously, there are a lot of new things that come about in the world. Personally, for me, cryptocurrency wasn't actually real until I sort of held my own cryptocurrency, transacted with my own cryptocurrency, and got to actually feel the product and experience it. I think that's true of a lot of technology. But if you're somebody who's a legislator, you don't necessarily have a perfect ability to try out every new technology; you're busy legislating. So how do you get educated?

[41:15] - Speaker 2
Yeah, that's a really good question. I don't know that there's one answer for that. I think that Americans are technophilic; we actually really like tech, and so I think Americans go out and try new things. I think one thing that's relevant with a lot of technologies is to see what the kids and teens are doing, because they're often ahead of you on it. I think science fiction is actually another useful place. Sometimes it's about something that's not really technology, sort of magic, but near-future science fiction

[41:43] - Speaker 2
I think is quite interesting for exploring some of this stuff. I think one of the challenges with legislators and regulators is that they often live lives that are kind of different from those of the people they're governing, in the sense that, if you're a legislator, you have a staff that handles a whole lot of stuff. You don't necessarily have some of the day-to-day challenges that every normal person in the country you're helping to govern does. You have some of them, and increasingly, legislators still, like, send texts to their friends and have pictures of their kids on their phones and so on.

[42:22] - Speaker 2
But I think there is some sense in which you've got to look at the experience of people who are unlike you as well, because they might have a different experience with technology than you do.

[42:32] - Speaker 1
Yeah. It's really interesting thinking about that. I remember hearing, when I was younger, that one of the heads of the major movie studios didn't write emails. He had a secretary to whom he would dictate; she typed on a typewriter and then retyped it into email. And I thought that was just, like, the most mundane thing ever. And then I realized that he didn't have any subpoenable emails.

[42:54] - Speaker 2
That's interesting.

[42:54] - Speaker 1
Yeah.

[42:56] - Speaker 2
That was the world for quite a while. I worked on email at Microsoft, and we would hear stories like that, and it's kind of crazy, and it does have these ripple effects as well. But I think that's not a good thing. Even though swearing off email does make you un-subpoenable, I think you still have to actually live in the world that everybody else is living in, and to understand it.

[43:19] - Speaker 1
I think that's absolutely true. It's hard to understand email if you're banging on stone tablets, as it were.

[43:25] - Speaker 2
Yeah. Exactly.

[43:27] - Speaker 1
Is there any privacy technology that does not exist yet that you think should exist?

[43:32] - Speaker 2
That's very interesting. I think one thing that we have limited technology on is technology to defeat facial recognition. And we have seen interesting things, interesting papers, on certain patterns of makeup and so on.

[43:48] - Speaker 1
I'm not going to put my face into, like, a checkerboard of black and white to walk out in public.

[43:53] - Speaker 2
Yeah, I know. It's really weird. It makes you stand out if you're doing that; it's privacy that attracts notoriety. So I think that's an interesting thing. I think encryption is still too hard. You and I both use Signal, and that's one of the first things that's made encryption really easy. But most people correspond in email, or they use SMS.

[44:16] - Speaker 1
Like.

[44:18] - Speaker 2
You have no encrypted option for SMS, unless you've switched to Signal or something like that as an equivalent option. But if your friends and family aren't using Signal, you're out of luck. And if you're using email, try using PGP if you're a normal person.

[44:37] - Speaker 1
Well, even if you are using your own server that has encryption for email, you're forced to interact with everyone on Gmail. Right? So it doesn't do you any good that you're on your own server.

[44:44] - Speaker 2
That's absolutely right. And I'm a techie. I'm a geek with a computer science degree. And if I try to use PGP, it's a fucking chore.

[44:53] - Speaker 1
Yeah. There was a security researcher who once said that they stopped opening PGP emails, because every time, it was somebody accusing them of being an NSA agent; it's, like, full tinfoil-hat territory. And it's not that these technologies are bad, right? PGP is, like, the original inspiration for the Axolotl ratchet protocol that underpins Signal.

[45:14] - Speaker 2
Right.

[45:14] - Speaker 1
Like, if you don't have those kinds of technologies, you don't get to the consumer-facing product. I just want to touch really quickly on the consumer-facing aspect of making secure technology. Number one, it's hard. It is just crazy hard to make a consumer-facing technology that conceals all the privacy under the hood. It's something that we thought really deeply about at MobileCoin when we were designing it. How do you make a cryptocurrency that doesn't feel like you're using a cryptocurrency? You don't want to type these long hashes.

[45:43] - Speaker 1
You don't want to do key signatures, all that kind of stuff. You don't want to do any of that. You just want to click a button to send money. And making that product is really hard. I think part of the reason is that most companies don't want to do stuff open source. And if it's not open source, can you actually trust it? Like, Facebook says they use the Signal algorithm. I'm sure they do. But what are they doing with your data at rest? Yeah.

[46:09] - Speaker 2
And your metadata and so on. Yeah. And I think making things easy is what gets adoption. A perfect privacy technology that nobody can use doesn't really do anything. And making things that are complex simple is brilliant; it's really hard, and it's really important in this world. I'm not sure how much I would focus on new technologies that we need, rather than on what we can do to make the technologies we have actually user-friendly and mass-adopted.

[46:43] - Speaker 1
So one of the things I think a lot about is, like, WhatsApp. Right? WhatsApp uses end-to-end encryption; it's really secure. But there's this aspect of the fact that you can lose your phone and get your chats back. So how does that happen? Because you're obviously not taking your root entropy, your encryption key, from the phone you just lost in the lake and restoring it on a new phone. So to me, obviously, that says that your chats must be stored unencrypted inside of Facebook, right?

[47:11] - Speaker 1
Like, that's the only way that can happen. So if they're unencrypted at Facebook, then what good does it do you that they're encrypted when you're sending them?
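The inference being drawn here can be made concrete with a toy sketch. Everything below is illustrative, not WhatsApp's actual design: the names are hypothetical, and the hash-counter keystream stands in for a real cipher purely for demonstration. The point is that if a backup can only be restored with a secret the user supplies, the server holds nothing but ciphertext; a restore that needs no user secret implies the server can derive the key, and therefore read the chats.

```python
# Toy model: chat backups encrypted under a key derived from a user secret.
# Illustration only; a real system would use a vetted AEAD, not this.
import hashlib
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Key material derived from something only the user knows.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Hash-counter keystream XORed over the data (same call encrypts/decrypts).
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

salt = os.urandom(16)
chat = b"meet at the march at noon"

# The server stores only salt + ciphertext; without the passphrase,
# it cannot recover the chat.
user_key = derive_key("correct horse battery staple", salt)
backup = keystream_xor(user_key, chat)
assert backup != chat

# Restore on a new phone: the user re-enters the passphrase.
restored = keystream_xor(derive_key("correct horse battery staple", salt), backup)
assert restored == chat
```

If, instead, the restore happens with no passphrase, PIN, or key carried over from the old device, then whatever decrypts the backup must live server-side, which is exactly the concern raised here.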

[47:18] - Speaker 2
That's a great question. And I have no idea. Maybe you know the answer to that.

[47:22] - Speaker 1
Well, I don't, because Facebook's code is closed source. The reality is that we can never know.

[47:27] - Speaker 2
It's a real problem, especially because someone could subpoena Facebook, and you also have no idea what they'll do. Yeah.

[47:34] - Speaker 1
Whether they're turning over the metadata, whether they're turning over all your chat history, you actually don't know.

[47:39] - Speaker 2
And I think that's also an aspect of, say, Apple fighting subpoenas, for instance. Or, honestly, Google. They're not perfect, but they've fought for privacy legally a number of times; when people have asked them for things, they've fought against subpoenas. So I think one of the things we need to think about is normalizing that social contract, normalizing the expectation you have that a cloud provider is going to take every measure possible to protect your privacy and to notify you if your privacy is breached. And that's still pretty fuzzy.

[48:14] - Speaker 2
But I think Apple's push on privacy over the last couple of years, maybe it's helping to get some of those ideas out there.

[48:21] - Speaker 1
I love what Apple is doing, the way in which they've just come straight at the issue and really driven dramatic change in the world, in the way that apps work and the way that they collect data. You know it's a big change when people are upset about it.

[48:36] - Speaker 2
It's great to see.

[48:37] - Speaker 1
Yeah. So I know we're getting close to the end of our time here. We had a question that came from a dear friend of mine on Facebook. He said, ask Ramez about the time he got a cult for his birthday. Do you have any idea what he's referring to?

[48:52] - Speaker 2
None whatsoever. Actually, I do have an idea. I was given a joke cult for my 25th birthday that I thought would last for about one evening, my birthday party. And then instead, when a group of friends and I decided to go to Burning Man, that was voted as the theme we should use. So it's interesting having a joke cult named after you. And after five years of having that, the Church of Mez, we had a board meeting, quote unquote, a hilarious event where the church was deemed morally bankrupt by the board and disbanded.

[49:31] - Speaker 2
So it was a fun experience. And it does get back to these questions of privacy. It was all tongue-in-cheek, it was very humorous, but I think people who do wacky things with their friends want to still have their communications about it held a little bit close to the chest. Sure.

[49:51] - Speaker 1
And I guess the dissolution of that cult was probably held in private.

[49:54] - Speaker 2
It was held in private, at my home. This is the first time I'm being public about it.

[50:01] - Speaker 1
Well, thank you for sharing that with us. We really appreciate it, Mez. I just want to say, this has been wonderful. Thank you for being our first guest.

[50:08] - Speaker 2
Josh, it's absolutely a pleasure and an honor. I'm really so glad to help bring this out into the world.

[50:20] - Speaker 1
Thanks for listening. This has been Privacy is the New Celebrity, with Ramez Naam. Have a good night.

[50:26] - Speaker 2 
Bye bye.