Richard Helppie's Common Bridge

Episode 15 - Of Cyber Security

January 14, 2020 | Season 1, Episode 15 | Richard Helppie

Rich talks about how much your privacy is being compromised by big tech.

Engage the conversation on Substack at The Common Bridge!

Transcript

Speaker 1:

Welcome to the podcast, The Common Bridge, with Richard Helppie. Rich is a successful entrepreneur in the technology, health and finance space. He and his wife, Leslie, are also philanthropists with interest in civic and artistic endeavors, but with a primary focus on medically and educationally underserved children. My name is Brian Kruger, and from time to time I'll be the moderator and host of this podcast. All right, welcome back to The Common Bridge. Last week, Rich, we had a ball. We recapped what The Common Bridge is all about, and we talked a little bit about the election and impeachment and things like that. That's fine; let's get back into some policy now. I hope it's not a surprise, but we're going to tweak you a little bit. I want to talk today about cybersecurity, and, well, you can frame this however you want: maybe reining in what's going on with high tech and what they're doing with privacy and such. So welcome back to week two, and welcome back to January for The Common Bridge.

Speaker 2:

Great, thanks Brian. And you're correct; I guess I should apologize to my listeners somewhat for spending a little more time than I wanted to on the political framework. But I do think it's important, and it's important in this regard: we are polarized. Over the holidays there were a couple of events that should have been absolutely nonpartisan, and I was shocked at the partisan division. Every time I look at this, I think this polarization has to stop, and I was surprised at the ho-hum attitude as long as it was harming the other side. I'm reminded, and I'm not making this up, by the way, that a couple of years ago I was just being silly on Facebook and I said, well, can we all at least agree on the law of gravity? I remember that, actually. And as it turns out, we can't, because apparently it must benefit one demographic over the other. That was a great exchange. There have got to be some things that unite us; you would think the law of gravity wouldn't be controversial. But let's not open that can of worms. Let's talk a little bit about the modern world. Let's do it.

Brian, for the benefit of my listeners that don't know this, I had the privilege of working in the computer systems industry from a very young age and through its evolution: mainframe computers to client-server to Internet 1.0 to where we stand today. We'll have to unearth a high school picture of you with a punch card. Okay. So along this arc of technology, we've reached a point now where the machines themselves are really, really smart; they learn, they teach each other. Under the broad umbrella of artificial intelligence, predictive analytics, and big data engines, we now have the capacity to store so much information and to extrapolate what that information, what those little bits put together, might mean. We're using it in warfare, we're using it in commerce, we're using it in research. Last night I actually watched a film from National Geographic where they had sent cameras down to look at the wreck of the Titanic. They had 67 terabytes of data and reconstructed what the Titanic would look like if the ocean were drained around it. Cool. We didn't have that kind of computing before, right? So now we have these really big behemoths. We have Facebook, Apple, Amazon, Google, and others that are harvesting our data; we've become the product. And this has massive implications for us in terms of our personal freedoms, because one of the things everyone should understand is that there's really no escaping it. Every time you touch a computer or a phone, or walk in front of a security camera, that is being stored. Storage is very, very cheap and matching technologies are very, very good right now. And if you think you can move into the mountains of Idaho and escape, you can't, because you're visible from satellites, and they can read your Fitbit from up there if they want to. And we volunteer our information every day at the cash register when they say, "Is there an email address I can get?" They track exactly what you purchased and when; your habits are right there.
So I'm going to spend a little time today on Google, and then a little bit on where effective government should be and what we need to be thinking about in terms of policy answers to these modern problems, modern dilemmas if you will. Because we do like the benefits of the convenience. Sure. I like being able to transact many things in my life at any time of the day or night, just as long as I have my phone. I'm also cognizant that my phone number is one of the absolute best ways to match a person. Yeah. That's why you're always asked for that phone number: because at that point, they know who you are. So look at Google. You've got the Google search engine, and Google knows what you're looking for and when you're looking for it. They remember what you looked for in the past, and they're building a profile on you. They know what kind of music you listen to. They know where you're thinking about driving on Google Maps. They're looking at YouTube for your music and your streaming. There's email: they're reading all your emails to see what you might be talking about, and I'm sure almost everyone listening to this has had the experience of planning a vacation, emailing somebody about it, and pretty soon getting served ads for that location. You've got Google Voice, so now they want all your phone calls: who did you call, when did you call them, how long did you talk, how frequently do you talk to them? Images? Oh yeah, they've got your photos too, and they get to keep them. And then there's the ISP. Google is not satisfied; they want to be your ISP too, so you can't go around their search engine. By the way, I use DuckDuckGo as a search engine that does not keep track of you. Your internet service provider is one thing, but your IP address is something that you can't change. That's precisely it. Well, you can get behind a VPN, and I recommend that you do; I use a VPN product called Nord, along with DuckDuckGo. But there's no complete hiding, because I've still got to get out through an ISP, an internet service provider, and Google wants to do that too. Sure. Google also wants to be in the banking business: Google Pay. Now they want to know who you're writing checks to, how often you're writing them, what you're buying. And for your convenience, when you want to know what the weather's like, you say, "Hey Google," and guess what: Google is listening to every conversation in your house, waiting for you to say "Hey Google," and they're storing that and comparing it with all the other sources of data coming in. Then there are the companies they've bought up; AdSense and DoubleClick are two of them, among the more than 200 companies owned by Google that are harvesting data.
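A quick way to see Rich's point about IP addresses and VPNs for yourself: the short Python sketch below (an illustration added here, not something from the show) asks a public echo service which address the wider internet currently sees for you. Run it once on your normal connection and once with a VPN connected, and the address that comes back changes; that address is the identifier your ISP and the sites you visit can log. The service URL, api.ipify.org, is just one commonly used public example.

# Sketch: show the public IP address the outside world sees for this machine.
# Run it with and without a VPN connected and compare the results.
# (api.ipify.org is one public "what is my IP" service; any similar one works.)
import json
import urllib.request

def public_ip() -> str:
    """Return the IP address currently visible to the internet."""
    with urllib.request.urlopen("https://api.ipify.org?format=json", timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))["ip"]

if __name__ == "__main__":
    print("The internet sees you as:", public_ip())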

Speaker 1:

That last one is frightening, because you think you're somehow avoiding Google, and then you peel back the onion, and you don't have to peel back too far to realize who owns the company you think you're working with.

Speaker 2:

Exactly. And then there's their developer conference. In May of 2018, Google still wasn't satisfied: they're writing their artificial intelligence into the chip itself, so as soon as you pick up that device, they're recording what goes on. That's why I'd also caution you about that Chromebook. Sure. And then recently, with Ascension healthcare, the largest not-for-profit health system in the United States, word got out about a partnership with Google. There's a law called HIPAA, which protects your privacy, but there are a lot of loopholes in HIPAA.

Speaker 1:

Did you just put air quotes around "protects your privacy"?

Speaker 2:

Well, there are limitations, but there's a way: under HIPAA, a health system can share data with a business partner if that information is used, and I'm using air quotes here, "only to help the covered entity carry out its healthcare functions, not for the business associate's independent use or purposes." And that gets a little sloppy, doesn't it? Well, I guarantee you every lawyer listening to this says, "Oh, I know how to write that so it's for healthcare purposes." And this is intensely personal stuff. You know, my background is in healthcare information technology, and there are things that can be embarrassing: certain diagnoses, certain medications, mental health issues, those types of things. And now Google has an understanding of some of the most intimate details of your life and what you've purchased.

Speaker 1:

And I understand firsthand, with Ascension Health and others, that in the training that went on when Google Docs replaced the whole Microsoft suite at Ascension, everybody had to understand that nothing is going to leave the cloud. When you send an email, all of these people are going to be copied on it. So they had to relearn things: in the Word world, you could be a writer and make your own changes under your own profile, and those would be shared between, say, you and me. Now those are all shared within the whole system.

Speaker 2:

Oh, and thank you for the reminder; I should have mentioned Google Docs. They're reading every document you store there, they're looking at your spreadsheets. There is not a hint of privacy there, and the business model pivots on it, because they can sell that information about you. I'm going to go back a long way, Brian, and I'm not going to give you the specific year, but at the first company I worked for in computing, our product was developing direct mail, and this was very primitive stuff. We would take car registration lists that we bought from states, magazine subscription lists, and census tract data, and build a profile around households to see who might be interested in buying a new Ford Mustang. Sure. That same mindset now targets people literally from birth. Here's a photo of my new grandchild: boom, now it's in the cloud, we know who he or she is, and all the references get tagged together. Now, is there any way to put that toothpaste back in the tube? I don't think there is, but I'm cynical. Well, here's the other risk too, and I'm only going to touch a little bit on the other companies; I should have included Twitter in there as well. If you protest too much about this, they can deplatform you. Think about this character Alex Jones, who is, well, I don't know if I can call someone a nutcase without being exposed to some kind of suit. But what's he going to do? He was simultaneously taken off all the technology platforms; he became a non-person. Maybe that's the key, though; that's how you get off the grid, right? Well, I'm sure he's still being tracked; he just no longer has a voice. And when you think about what we're doing here today with this podcast, you and I, our personas, our ideas, our thoughts are getting communicated. We could just as easily be switched off if somebody doesn't like what we're saying, because they did it to Alex Jones. Sure. And again, that comes down to some of the things I touched on earlier about polarization and boundaries: who should be the arbiter of what is good speech and what is not? So, pivoting a little: what would effective, non-polarized government be doing to protect us, the citizens, from these private companies that are wielding so much power and have the ability to violate our privacy, to control our lives, and indeed take away our voice?
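The household-profile matching Rich describes, from the direct mail lists of decades ago to today's data brokers, comes down to joining separate records on a shared key such as a phone number or email address. The toy Python sketch below uses invented records and field names purely to illustrate that mechanic; it is not any company's actual system.

# Toy illustration of record matching: two unrelated data sources end up in one
# profile because they share a key (here, a phone number). All data is made up.
from collections import defaultdict

def normalize_phone(raw: str) -> str:
    """Strip punctuation so '313-555-0142' and '(313) 555 0142' match."""
    return "".join(ch for ch in raw if ch.isdigit())

purchases = [
    {"phone": "313-555-0142", "item": "toothpaste"},
    {"phone": "(313) 555 0142", "item": "infant car seat"},
]
subscriptions = [
    {"phone": "3135550142", "magazine": "Auto Weekly"},
]

profiles = defaultdict(lambda: {"items": [], "magazines": []})
for row in purchases:
    profiles[normalize_phone(row["phone"])]["items"].append(row["item"])
for row in subscriptions:
    profiles[normalize_phone(row["phone"])]["magazines"].append(row["magazine"])

# One phone number now ties shopping habits and reading habits together.
print(dict(profiles))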

Speaker 1:

Okay. One of the best things about your podcast, and everybody says this, is that you lay out the problem really well, and then the next thing you do is come up with a nice broad-brush, and sometimes specific, solution. So how do you fix this?

Speaker 2:

It's no different than what we've done in the past as a country. We have antitrust laws, and these tech companies need to be broken up. So we go all Teddy Roosevelt on them? Well, look, it worked for the oil companies. IBM was subject to a consent decree when they were using their market power. Microsoft is a little different case, but Microsoft was also thwarted a bit by antitrust legislation.

Speaker 1:

A lot of us over a certain age remember when the Bells were broken up, right?

Speaker 2:

Yeah, well, AT&T was broken up. And here's the really interesting thing: if you go back to the rise of these tech companies that are so powerful today, a lot of that was spawned at the time of the AT&T breakup. Oh yeah. The irony is deep indeed. And so now it's time for them to be broken up. If you just think through the little recitation I've gone through on Google, Google probably needs to be broken up into a dozen companies. And we need to have laws about how you share information amongst companies, with some teeth in them. We did this with consumer lending, where you not only had to disclose to the consumer, here's what your loan is, but it had to be in plain language: if you take out this loan for a $20,000 car and you pay this interest rate, it's going to cost you $40,000. You've got to lay it out just like that. Instead, we have these really lengthy end-user licensing agreements, and if you don't click "agree," your stuff stops working. So I think using the antitrust legislation, using the consumer protection laws that are there, and applying them to today's problem gets us out of this. But you know what's going to be required: doing it in a nonpartisan, indeed a nonpartisan, way. And wouldn't it be nice if we could turn on a local news channel or a cable news channel and there was a person there explaining what I just explained a few minutes ago about where Google is, and also saying, Senator X has proposed legislation that would do the following, or a suit has been filed and this is what the plaintiffs want to happen to this company, the pieces it needs to be broken into.
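The plain-language loan disclosure Rich points to is just amortization arithmetic made visible. The short sketch below is an added illustration of that calculation, not anything proposed on the show; the 10% APR and 60-month term are arbitrary example figures.

# Sketch of the "plain language" loan math: given principal, APR, and term,
# compute the monthly payment and the total amount actually paid back.
# The rate and term below are arbitrary example values.
def loan_cost(principal: float, annual_rate: float, months: int) -> tuple:
    monthly_rate = annual_rate / 12
    # Standard fixed-payment amortization formula.
    payment = principal * monthly_rate / (1 - (1 + monthly_rate) ** -months)
    return payment, payment * months

if __name__ == "__main__":
    payment, total = loan_cost(20_000, 0.10, 60)
    print(f"$20,000 at 10% APR over 60 months: ${payment:,.2f}/month, ${total:,.2f} total")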

Speaker 1:

But Richard, you have interests in a lot of businesses, in a lot of just about everything. In the last 30 years, and tell me if I'm wrong, it just seems that the conglomeration of companies has become a lot more prevalent than the splitting up of companies. We see it in the movie industry, we see it in tech; everybody's buying everything. I haven't seen a trend that goes the other way, and it's been 30 or 40 years that I've watched this. It seems like there's a lot of rubber-stamping in Washington: let's let those two companies get together, and these two, and then those two. How do you fix that?

Speaker 2:

Well, I think the paradox is that technology has undone some of those anti-competitive practices. Think about how many places you can get streaming movies right now: Comcast, Apple, Google, Netflix, Amazon Prime, they all want to sell you that movie. Sure. Now, if those companies started consolidating, then we'd have a real problem. Do you see that happening? No, I don't think you're going to see that happen, and if it does, we're cooked at that point. But technology is also changing things like the movies themselves. It's a great example, and I'm no expert on this, but my understanding is that with the digital cameras and gear they've got now, you can shoot a high-quality movie without the big overhead of studio time and everything else you used to need. That is really the beauty of capitalism: an idea can outrun a bureaucracy. Where capitalism fails is when the bureaucracies are able to keep out all the innovation and take control, and that's what is on the verge of happening right now if we don't address these big tech companies and the power that they hold.

Speaker 1:

Let's kind of wrap this up. Your underlying theme for a long time has been to get involved: that the voter, the citizen, should try to get involved and become a lot more independently informed, instead of just locking into whichever side of the news media you prefer. Try to read things yourself, learn it yourself, and then approach Capitol Hill, approach your senators and your congressmen, go at it, and try to do it en masse.

Speaker 2:

Well, let me end this podcast with just one piece of advice on how to read the news. I've got a long list of these, but I'm just going to do one. When you read a headline, if it's a conclusion, reject it, and go read the article to see if there's any data that actually leads to that conclusion. Because remember, the headline writer is trying to alarm you and trying to get you to take a position. Don't give it to them.

Speaker 1:

That'll be lesson number one. Rich, thanks a lot, and we'll see you next week. It's always informative; I love these conversations. Folks, if you want to learn more about The Common Bridge, look at RichardHelppie.com. Like I said, toward the end of this month we'll have a newsletter up, and we're going to start posting some PDFs of documents we think you should be able to read. We'll try to make it easier for you to see some of this and not have to fish too far. In fact, you can tell your friends to go to RichardHelppie.com to read the IG report, for instance, and we'll go from there. But thanks a lot, Rich, and we'll see you next time. Look forward to it, Brian.

Speaker 2:

You have been listening to Richard Helppie's Common Bridge podcast. Recording and post-production provided by Stunt Three Multimedia. All rights are reserved by Richard Helppie. For more information, visit RichardHelppie.com.