Security Insights - Cybersecurity for Real-World Workplaces

No User Privacy Allowed: Controlling Risk From Clubhouse and Leaky Social Media Apps

March 02, 2021 | Chief Security Officer Phil Richards and Head of Endpoint Security Product Management Chris Goettl with Ivanti: Cybersecurity and Information Technology Solutions | Season 1, Episode 3

The new platform Clubhouse (and other social media apps like it) presents unique privacy risks for cybersecurity teams trying to secure their end users, especially in light of new privacy legislation from the United States and the European Union.

Chief Security Officer Phil Richards and Head of Endpoint Security Product Management Chris Goettl discuss the inherent security concerns social media brings, including:

  • Developments in privacy legislation around the world that cybersecurity teams must adhere to.
  • Why and how Clubhouse and other personal social media apps invade user privacy and present a risk to employer organizations.
  • Social media privacy best practices, including proper cyber hygiene training and user education.



  • For all show notes, resources and references, head to Ivanti.com/SecurityInsights
  • Join the conversation online on LinkedIn (linkedin.com/company/Ivanti)

Adrian Vernon: Hello everyone, and welcome to Ivanti Insights, episode three. Now every two weeks, we'll be sharing our thoughts with you on the latest cybersecurity news. I'm Adrian Vernon, your host for today, and here at Ivanti I'm the Director of Sales Enablement. I'm joined today by Phil Richards, Ivanti's Chief Security Officer, and Chris Goettl, Sr. Director of Product Management for our security products. Now Chris, in our previous episode two weeks ago, we had the chance to meet Phil and get to know him a little more personally because you, unfortunately, were not able to join us. So Chris, we turn the spotlight on you now. Let's begin with your 15-second professional bio: who is Chris Goettl as it relates to security products?


Chris Goettl: Hey Adrian, yeah, I've been with Ivanti through a number of acquisitions over the years. I came originally out of a small patch management company, spent some time at a large company called VMware, and then came over to what is now known as Ivanti. So I've been in the space of helping companies address software updates, security vulnerabilities, and now a broader variety of security concerns like passwords and privilege management, application control, antivirus, and mobile threat defense. We focus on a lot of different topics here at Ivanti as far as security is concerned. Some of you may know our Patch Tuesday series, where we talk about breaches and vulnerabilities and what should be prioritized and addressed first. I've definitely been around for 16-plus years in this space, helping people make sense of the many things out there putting us all at risk.


Adrian Vernon: And Chris, when you're away from the office, trying not to think of cybersecurity news and relax in some way or form, what do you like to do for fun?


Chris Goettl: Oh, a couple of things. I'm a homebrewer, so I actually brew my own beer, and I make my own wine as well; that's more for my wife than me, but yeah, I do a lot of home brewing. Right now on tap at home I've got an Irish red ale and a Scotch ale that are getting me through the COVID era, so I'm doing all right on that front. Otherwise, I do a lot of PC gaming as well. Those are probably my two favorite hobbies.


Adrian Vernon: I have a 14-year-old son, he's big on Fortnite. Do you dabble in Fortnite as well?


Chris Goettl: My son plays a bit of Fortnite. The game that I've been playing most recently is called Valheim; it's an interesting online survival sandbox game where you basically become a Viking and go fight all of the evil things that are trying to threaten Valheim. It's a fun game. A couple of friends of mine are playing it, so we were among the original Vikings to join, and now there are over 3 million people actively playing this game in just a few weeks' time.


Adrian Vernon: Wow, okay, and probably still growing as we speak. All right, we appreciate you sharing that, Chris; it's good to get to know you a little bit more. Now let's talk security. So gentlemen, today our focus is going to be social media as a whole, but let's begin with some hot news on Clubhouse, the audio-only social network where members can gather in virtual rooms and listen to one another speak. It's a really hot app right now, and you must be invited to join Clubhouse by someone who is already a member. Therein lies one of the issues: as part of the signup process, you're urged to give Clubhouse access to your phone's contacts, and it seems Clubhouse is using that information to build profiles of people who aren't yet members. A Forbes article earlier this month ran the headline, "Clubhouse, the hot social network, has big privacy concerns." Phil, I'll toss the first question your way: where do we even begin with something like Clubhouse?


Phil Richards: First of all, Adrian, thanks, it's great to be here, and that's a big topic. Where I tend to start thinking about this is in the space of privacy. There has been an awful lot of privacy legislation now; in fact, privacy is so important that states in the United States, several different countries around the world, and entire regions have adopted privacy standards. One of the big standards in the United States is the California Consumer Privacy Act, or CCPA, which took effect in January of 2020, so just a year ago. Of course in Europe there's the General Data Protection Regulation, or GDPR, which covers the European Union. Interestingly, with Brexit, the United Kingdom has passed a reciprocal GDPR component as well, so there's effectively that law in Britain too. The point is these laws have teeth; it's not just "it would be nice if you would safeguard privacy." There are actual monetary consequences for violating them. They have a number of components, and this is where it really hits home with our topic today around Clubhouse: when you have members sign up, there are some real restrictions on what you can do in terms of gathering private data without disclosing what you're going to do with it, and without providing means for your customers to correct it and to remove it if they want it removed.

Some of those requirements limit what apps like Clubhouse can do in terms of going into your email, finding all of your contacts, and sending messages out to all of them, at least on paper. The problem, of course, is that a number of companies are still behaving in ways that are against these rules or against some of these laws, and it's just a matter of when enforcement will get to the point where the regulations that have been put in place can actually be applied. So there's a lot going on in this space. One of the first things I want to think about is privacy and how it affects not only Clubhouse but social media apps in general. I do think that's an interesting area of discussion, because so much has changed in that space compared to even just three or four years ago.


Adrian Vernon: Okay, so Chris, let me ask you: Phil's indicating that, yes, Clubhouse is in the news right now, it's a hot app, but this extends way beyond Clubhouse. With something like this, when we talk about the privacy concerns, if I'm in Clubhouse, for example, and I'm doing this on my own time, how am I potentially exposing my corporate network here at Ivanti as an employee?


Chris Goettl: Yeah, and this is very true of many social media platforms in general. One thing to understand is that you should never expect that everything you do on these social platforms is private. Many times there are data collectors aggregating data in ways we aren't very aware of. This has happened many times; there was Cambridge Analytica with Facebook before, and there's a recent set of issues involving another major data broker, Social Data, which was aggregating data from TikTok, from YouTube, and from other sources. They're taking a lot of this information, aggregating it, and using it for a variety of different purposes, and that is one major concern. The way these platforms try to branch out their network is, 'Hey, I'm going to encourage you, as a new member of our platform, to invite some of your friends.' So it's, "Hey, here's two invites, go invite them." That's something I remember very distinctly from back in the Gmail days when it was in beta: you had two invites and everybody was all excited. 'Hey, I know somebody who's got invites, they're going to get me one and I'll give you one.' It became this craze and it took off very quickly; that's working in their favor. The thing that's happening behind the scenes that we all have to be conscious of, though, is that as it's doing that, it's going into your contacts. It's going into all of that information and building a data web, connecting the dots. What can that data be used for, good or ill? That's the thing we always have to be cognizant and aware of as we do that.
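As a rough illustration of the "data web" Chris describes, here is a minimal sketch in Python of how an aggregator could turn uploaded address books into shadow profiles of people who never signed up. The ContactGraph class, phone numbers, and upload format are hypothetical assumptions for illustration, not Clubhouse's actual implementation.

```python
from collections import defaultdict

# Hypothetical sketch: a "data web" built from uploaded address books.
# Each upload is (member phone, [contact phone numbers]); non-members become shadow profiles.

class ContactGraph:
    def __init__(self):
        self.edges = defaultdict(set)   # contact phone -> set of members who uploaded it
        self.members = set()            # phone numbers belonging to actual members

    def ingest_address_book(self, member_phone, contact_phones):
        self.members.add(member_phone)
        for contact in contact_phones:
            # Even if `contact` never signed up, the platform now knows
            # who that person is connected to.
            self.edges[contact].add(member_phone)

    def shadow_profiles(self):
        # People the platform can describe without their consent.
        return {phone: holders for phone, holders in self.edges.items()
                if phone not in self.members}

graph = ContactGraph()
graph.ingest_address_book("+1-555-0100", ["+1-555-0101", "+1-555-0102"])
graph.ingest_address_book("+1-555-0103", ["+1-555-0101"])
print(graph.shadow_profiles())  # "+1-555-0101" is known to two members despite never joining
```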

Military bases around the world, from multiple governments, had to start raising concerns about fitness apps and GPS location. There were cases in 2018 and 2019 of bases being fully mapped based on where military personnel were jogging and running throughout those facilities, mapping the grounds and the paths they might be taking. All of these things can be used in different and interesting ways that we don't think about on a day-to-day basis. Some corporations have banned smartphones from certain types of meetings: if you're going into an innovation meeting or a boardroom meeting, no smartphones. How many times have you been in a meeting where Siri on somebody's phone suddenly goes off saying, 'I didn't quite catch that'? Siri is always listening. When you connect with one of these social platforms, think about all the things it asks for: access to your audio, access to video, access to contacts, all of those things the app could now scrape. In fact, this Clubhouse app already has its first significant data issue; somebody found a way to use JavaScript to basically scrape audio conversations. That news just broke in the last couple of days. Security and privacy issues are definitely one concern, but how could this be used in other ways against us?


Phil Richards: It's interesting, some of the things that Chris says are just so indicative of the kind of lifestyle we're living now, where we feel like we have to trade our private data for access to these quote-unquote free apps. The issue also exists with companies that collect our data when we're not even aware they're collecting it. My mind goes back to the Equifax breach a few years ago. With Equifax, we're not the consumers… as a person, I don't go in and look at other people's credit reports or anything like that. We're actually just the data, and Equifax uses that as justification for being able to hold a lot of data without talking to the people that data describes. You and I, the consumers, never allowed Equifax or TRW or any of those companies to collect our data, and that's because we're just the subjects of that data; we're not the customers.

The California Consumer Privacy Act, as well as GDPR, requires these companies to provide you with information about what they're storing about you and how they're using that data, and then to give you an opportunity to correct, modify, and change it. So there are provisions in these laws that protect you, but these laws are not global, and they're certainly not universal. A large portion of people, especially here in the United States, simply aren't covered by any of these laws or regulations that protect your ability to maintain your own data, to make changes to it, and to decide whether you want to trade your private data for access to a particular application.
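To make the rights Phil mentions concrete, here is a minimal hedged sketch of the access, correction, and deletion workflow that CCPA and GDPR push companies toward. The handler names and the in-memory store are illustrative assumptions, not any particular company's implementation.

```python
# Hypothetical sketch of a data-subject-request workflow (CCPA/GDPR style).
# The in-memory "store" stands in for whatever systems actually hold personal data.

personal_data_store = {
    "user-42": {"email": "jane@example.com", "contacts_uploaded": 117, "marketing_opt_in": True},
}

def handle_access_request(subject_id):
    """Right to know: disclose what is held about the subject."""
    return personal_data_store.get(subject_id, {})

def handle_correction_request(subject_id, field, new_value):
    """Right to rectification: let the subject fix inaccurate data."""
    if subject_id in personal_data_store:
        personal_data_store[subject_id][field] = new_value

def handle_deletion_request(subject_id):
    """Right to erasure/deletion: remove the subject's data on request."""
    personal_data_store.pop(subject_id, None)

print(handle_access_request("user-42"))
handle_deletion_request("user-42")
print(handle_access_request("user-42"))  # {} -- nothing left to disclose
```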


Adrian Vernon: And you know what, along those lines, I'd ask you guys: even if you have the right to go in there and change or modify your data, how many average people out there actually do that? It's 'Oh, I'm presented with this thing, yep, I'll click agree and I'll move on and dive into the app.' So what advice do you have for folks out there?


Phil Richards: From my perspective, a couple of things to think about. Number one, be a careful steward: make sure you know who's using your data and what they're using it for. The other thing is to realize, and this is something your father told you back when you were in kindergarten, that nothing is free. If something is free, that means they're collecting something about you, or from you, that maybe you don't want to be giving up, and you need to be thinking about that. So when you download the Clubhouse app to your phone and it asks for all this information, be thinking about the fact that you're giving away that information. Maybe that's a trade you're willing to make, but you need to be aware that's what's taking place. Awareness is probably the thing I would stress the most when it comes to these apps.

Chris Goettl: Yeah, and I would say from a corporate standpoint we have a lot of people rethinking how they engage with vendors. There are a lot more companies scrutinizing and asking tough questions of the vendors they buy products from. In engagements I've had with companies investigating and looking to purchase Ivanti's products, that scrutiny has been ramping up over time. But with the recent SolarWinds incident, there is a huge emphasis in RFPs on asking about the overall end-to-end supply chain of your product getting into my hands. Where is it developed? Where is it tested? If any content is delivered from it, where does that come from? What data sovereignty barriers do you have set up if it's cloud-based? When you look at a technology like Clubhouse, the frontend experience is built by Clubhouse, but the backend is based on Agora's platform, and Agora is Shanghai-based. There are data privacy laws in China that basically obligate Agora to provide access to audio clips if it involves security concerns for the Chinese government, which is the arbiter of that. How is that information handed around? How much access do they have to information that's questionable or doesn't meet that bar?

So where data resides is a very important part of this as well. This goes beyond just personal privacy into what happens if people start to use a social media platform like this as part of their day-to-day work. I've seen companies embrace technologies like the Oculus VR platform to engage with their employees in training because we can't be together in one place; let's use technologies and platforms like that to interact with each other more. It's a social platform, and it gives you the ability to interact. If somebody were to engage with a social platform like Clubhouse in a new and interesting way, we do have to think about how that potentially exposes us to other risks, whether it's personal or corporate information.


Adrian Vernon: Yeah. All right, Chris, as we take this down the home stretch, we're not necessarily selling here, we're driving awareness, but what's your elevator pitch, so people can understand how Ivanti can help IT departments deal with potential issues arising from apps like this?


Chris Goettl: Yeah, a couple of things. We do have mobile technologies that can help with two of the major concerns on social platforms. One is controlling what's allowed on your corporate devices; if there are certain apps you don't want to allow, that's one aspect of this. The other is that a lot of social platforms are a target for phishing scams, so anti-phishing and protection of the user. There's one cybersecurity weakness that none of us can eliminate: our employees are the weakest link in the security of our environments, but we can't get rid of them. So what weakness about the employee can we remove? It's their password, it's their identity, that's what's at stake. How do we protect them from phishing attempts? How do we take the password out of the equation, because that's the thing that's stolen most often and then sold on the black market? That's what allows data breaches, ransomware attacks, and other things to walk right past all your defenses, because they're using a credential from a user you know and trust. So passwordless authentication, anti-phishing capabilities, and control over the applications and software running in your environment all relate to this topic of how to defend against the tactics that are going to be used in social attacks.
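As a concrete illustration of the application-control idea Chris raises, here is a small sketch of a deny-list policy check for apps on a managed device. The bundle IDs and policy structure are assumptions for illustration only; this is not Ivanti's actual product logic.

```python
# Hypothetical sketch of an app-control policy check for managed devices.
# Real MDM/UEM products implement this very differently; this only illustrates
# the concept of flagging risky social apps against a corporate deny list.

BLOCKED_BUNDLE_IDS = {
    "com.example.clubhouse",   # placeholder ID for an audio social app not approved for work devices
    "com.example.tiktok",      # placeholder ID for another social app
}

def evaluate_installed_apps(installed_bundle_ids):
    """Return compliance status and any deny-list violations for a device."""
    violations = sorted(set(installed_bundle_ids) & BLOCKED_BUNDLE_IDS)
    return {"compliant": not violations, "violations": violations}

print(evaluate_installed_apps(["com.example.browser", "com.example.clubhouse"]))
# {'compliant': False, 'violations': ['com.example.clubhouse']}
```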


Adrian Vernon: Phil, what would you add to what Chris just highlighted?



Phil Richards: The other couple of components I would want to talk about include how important it is to keep your environment up to date, both from a configuration and a patch management perspective. Ivanti's products for vulnerability management and configuration management really do help provide that. Configuration is so important that there's a term we use in the industry called configuration drift, which measures how far a configuration has moved from the default, approved set of configurations. Sometimes that happens because a user makes changes to their own configuration; they might change the color of the background or the wallpaper or something. But it also happens because of applications that are installed. Sometimes applications need to modify the configuration so they can run; sometimes they modify configuration values just because they want to make it easier for systems to have access to more of your environment. Configuration management solutions like Ivanti's will help keep some of those in check and keep those configurations aligned. All of that is important because our computers are also the things we use to browse the internet and respond to email, and we click on things. And as Chris mentioned, with our employees being the weakest link, this helps protect them from their own bad behavior.
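A minimal sketch of the configuration-drift measurement Phil describes: compare a device's current settings against an approved baseline and report what has moved. The setting names and baseline values are invented for illustration; real configuration management tools track far more state.

```python
# Hypothetical sketch of measuring configuration drift against an approved baseline.

BASELINE = {
    "firewall_enabled": True,
    "screen_lock_minutes": 5,
    "usb_storage_allowed": False,
}

def measure_drift(current_config):
    """Return each baseline setting whose current value differs from the expected value."""
    drifted = {}
    for key, expected in BASELINE.items():
        actual = current_config.get(key)
        if actual != expected:
            drifted[key] = {"expected": expected, "actual": actual}
    return drifted

current = {"firewall_enabled": True, "screen_lock_minutes": 30, "usb_storage_allowed": True}
print(measure_drift(current))
# {'screen_lock_minutes': {'expected': 5, 'actual': 30},
#  'usb_storage_allowed': {'expected': False, 'actual': True}}
```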


Adrian Vernon: Okay. All right, we've just about run out of time today, so gentlemen, I really appreciate you joining and providing your thoughts. Chris, it was great to get to know you a little more personally, and we will look to dive into more IT-related cybersecurity news a couple of weeks from now. So everyone, thanks for joining. For Phil Richards and Chris Goettl, I'm Adrian Vernon saying until next time: stay safe, be secure, and keep smiling. We'll see you in two weeks.