mnemonic security podcast

IT Security Is From Mars, Software Security Is From Venus

January 04, 2021 mnemonic

We're kicking off 2021 with a timely conversation about software security, this time with two individuals who are more than qualified for the job - Dr. Daniela S. Cruzes and Espen Johansen.

Dr. Cruzes is a Professor at the Norwegian University of Science and Technology (NTNU) and Senior Research Scientist at SINTEF, and has been working with Espen Johansen (Product Security Director at Visma) on strategies to incorporate security into development processes. As you can tell from their conversation, they have made tremendous progress, and they have lots of experience to share for those of you who would like to do the same.

Building an Ambidextrous Software Security Initiative:
https://www.igi-global.com/gateway/chapter/259177

Technical level: 1/5

Host: Robby Peralta

Producer: Paul Jæger

https://www.mnemonic.no/podcast 

mnemonic:

From our headquarters in Oslo, Norway, and on behalf of our host Robby Peralta, welcome to the mnemonic security podcast.

Robby Peralta:

You do your job, and I'll do mine. I'm sure we've all heard these words (or said them to someone) in our professional lives. But we definitely don't want that exchange between the security and development teams. Now, I'm sure we'd all agree that information security is a beast of its own, but so is software development. And unfortunately, on top of the hours we've already worked today, both sides are going to have to learn each other's worlds in order to keep the peace - or at least to keep the bad guys out. Espen Johansen and Daniela Cruzes - welcome to the podcast.

Daniela Cruzes:

Thank you.

Espen Johansen:

Thanks. How are you today?

Robby Peralta:

I'm doing just dandy. How about yourselves?

Espen Johansen:

Ecstatic as always

Robby Peralta:

I love it, Espen, and that's why you're back on the show. There are not many people who are ecstatic about software security. So ecstatic Espen is a veteran of the show, Product Security Director at Visma, which is one of the largest software companies in the world. And to keep us in check, we have a professor and research scientist - a professor at NTNU and a research scientist at SINTEF - who may just have an opinion about software security after writing 144 academic articles about it. So Daniela, I presume your dreams are something like The Matrix, with a bunch of code flying around your imagination. Is that true? You wish, actually - well, it sounds like we shouldn't go any further into that for this episode. The topic for today is DevOps, or SecDevOps, or software security, or all of the above. And starting out with you, Daniela, since you've been so busy lately with the topic: one of the papers you wrote was called IT Security is from Mars, Software Security is from Venus. And I'm pretty sure you didn't mean that IT security is for men and software security is for women. So what did you mean by that, exactly? What was the meaning of that title?

Daniela Cruzes:

Okay, well, it has never been about women or men. The main point of the paper is mostly to highlight this disconnect that the IT security people had with the development team. One of the things that happens, and that DevOps also tries to bridge, is that many organizations have these departments where one was about IT, looking at all the things about networks and network security and intrusion and things like that, and the development team was totally excluded from all these discussions, from all the things that happen in the network. So what we saw the need for is that they need to be more aware of all these things. And that's what happens with DevOps teams, much more than when you don't have DevOps teams, because then they also have to run it themselves, unlike developers who don't have to worry about the operational part of the system at all.

Robby Peralta:

Mm hmm. By the way, quick question for you there. Do you call it DevOps or DevSecOps?

Daniela Cruzes:

If you start putting each letter, each part of what you want to focus on, into the name of the concept, then you lose yourself. So I think that if it's DevOps, then it should be secure. If you don't have software that is secure, then what's the point? You're risking too much, right? So why would you want to say DevSecOps? Does it mean that DevOps should not be secure?

Robby Peralta:

In English, that's what we call an oxymoron. Espen, you work for a large company. I'm assuming there was a point in your history when there was no dedicated DevOps team per se. When was that? When did you make the transition from having just one security team over to security being incorporated into the product teams, for example?

Espen Johansen:

Well, that is quite some years ago, I don't have the exact starting date. But it's quite fun to listen to Daniela, because what she describes there in her paper is something that is quite common when you look at most companies, I would say. Some see this as more of a philosophical fight between the ITIL direction and the Agile direction. The main thing that we observed is that the development teams saw security as something external to them, while the security teams saw the development teams as external to themselves. So instead of having these look at each other as external beings, we have worked very hard on creating integrated environments where security is just a part of whatever you do in development, and development also becomes part of security. We have to speak the same language, and that was a cultural barrier that we had to cross.

Robby Peralta:

Was there a certain event that made that happen? Or did you just learn that it was a smart thing to do?

Espen Johansen:

It was basically a gradual evolution, I would say; it's not a defining moment, it's a natural thing to do. We experimented in all ways with having gateways manned by IT security people for the development teams, kind of like the traditional handovers from development to operations, and we learned that that just doesn't work. You create a soft belly in the development teams, and you create a hard shell on the outside. And these hard shells are so easy to penetrate: when you have the knowledge of a developer, you just have to go through layer seven, and then you have a soft belly inside. We wanted to avoid that. So by following lots of the advice that we got from Daniela and other researchers, we evolved into the current state that we have today.

Robby Peralta:

Which has a very high level of security. Yeah,

Daniela Cruzes:

Yeah, I think we started this discussion by calling it self-managed security, remember. So I think it came with that concept of, okay, we need to make this more self-managed. And we cannot have an external party that is going to be responsible for security, because that's not going to work with the pace that Visma wants to have, right, for output and functionality and things like that.

Espen Johansen:

I completely agree with that. And as you say, Daniela, it is the self-management, because Visma is really big; we are in really many countries. And as a distributed organization, you cannot have very strong central management of this. This brings it back home - we have had so many discussions over the years now, Daniela, about how to do self-management on really difficult topics that people try to avoid. And this is where I find the art, and the fun, in it: you have this really difficult thing that people want to do, but they just don't know how. So how do you explain it to them in a way that inspires them and motivates them over time? This is hard, and you have to keep at it for a long, long time. It's not just a magic recipe - deploy this, buy this product, and everything's fine. It's a mental process, and it's a change in your culture.

Robby Peralta:

In that paper, you interviewed 23 organizations in Norway of varying sizes. I would assume that the smaller companies will struggle a bit more with this than the larger ones. Is that a correct assumption, or...

Daniela Cruzes:

I think the challenges are different. We have been working with a few different companies in the projects that we are running about software security in agile software development, and we see that the challenges are different. Sometimes it's much better to do things in Visma, because they might have more resources, more knowledge available, more skill sets. But sometimes it's much easier to do things with a smaller company, because then whatever we try is not so costly to try, right? So we are much more able to try things that we would like to test, to see whether they improve security or not, in that flexible way - more so than sometimes in Visma, for example, you know.

Espen Johansen:

I can also add to some of the things that we have learned from working with Daniela: do lots more experimentation. Since this is quite a large company, if you experiment with a couple of teams in a couple of countries and a couple of cultures, you can achieve much faster progress than if you have to wait for a big bang where everything needs to be released at once. Experimentation, I think, is essential; it's better to just try something than to do nothing at all.

Robby Peralta:

We've got to mention that Visma is spoiled, because they have an Espen on their teams, and that might help. But I have a question. It's a pretty mean one, but it's also based on another paper that you wrote, called Good Enough Security. What is good enough security?

Daniela Cruzes:

Maybe you can ask Espen first because he works with security

Espen Johansen:

This goes back about five years. Daniela was sitting in the office, I think it was in Oslo, in a discussion with one of my colleagues, and he asked the question: what is good enough security? And the actual answer was basically: we don't know. But I think we actually have some evidence to suggest what is not good enough security. If you look at some of the cases that have been in the media over the last four or five years, you have some companies that have been breached, for instance. Some of the companies that have been breached have been sued and have had really hard consequences from that breach, and that evidence could suggest that they hadn't done enough security. And then other companies that have suffered the same fate have not been sued and haven't suffered really severe consequences, with regards to customers leaving them and things like that. So that could constitute some level of evidence to say that something is at least not enough. And what is good enough, I would assume, is on the opposite end of what is not enough. Would you say?

Daniela Cruzes:

Yes, yeah. And it's also about this: we have now started to see some activities that influence security in a good way. We see that teams that don't follow some of the things that the security program asks them to follow, when they go, for example, into a bug bounty program, they struggle much more than the teams that did follow the mandate. So we have some evidence that when teams go into the bug bounty program without applying all the suggested activities that seem to be good for good enough security, they struggle much more.

Espen Johansen:

I would agree with that. And we actually put that into monetary metrics just last week, Daniela. Because we track the median payment that we make, on a yearly basis, for teams that are on the bug bounty, and the normal median is approximately $2,500 per year. But the outliers - the ones who have not been through the entire program, who have not been through the static analysis, dynamic testing and all of the normal things that we do - the most extreme outlier had about $75,000 in a month in bounties. So you can see there is a dramatic difference in the actual spending on bug bounty, depending on whether a team is inside or outside the program. I'm really hoping to show these figures; let's give it a year or two more, when we have more data in, and then we can try more experiments, Daniela, if you want: throw someone in a bit early and see what happens.
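The comparison Espen describes - a stable median bounty spend for teams that went through the program versus extreme outliers among early-enrolled teams - can be sketched as a toy calculation. All figures below are illustrative placeholders, not Visma's actual data:

```python
from statistics import median

# Hypothetical yearly bug bounty payouts (USD) per team
in_program = [1800, 2500, 2400, 2600, 3100]  # completed the full security program
early_enrolled = [2500, 9000, 75000]         # skipped static/dynamic analysis steps

print(median(in_program))      # 2500: spend clusters near the normal median
print(median(early_enrolled))  # 9000: skewed upward by extreme outliers
print(max(early_enrolled) / median(in_program))  # 30.0: outlier vs. normal median
```

The point of the sketch is only that the median is robust for the mature group while an unprepared team can dominate the total cost.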

Robby Peralta:

So if I understood you correctly, you said that the bug bounty cost for a system that has gone through static analysis and all these steps you put in place was a lot lower than the bug bounty cost for a system that didn't go through all those steps. Is that what you just said?

Espen Johansen:

Yeah, we still have limited data, I would say, but at least from our data so far, we can see that there is an average cost for a team in bug bounty that is quite static, around $2,500 per year. But when you launch new services that haven't been through the entire program - it might be because of some kind of political decision, we want to enroll someone really fast to supercharge them, which is a good thing we do for some teams - then we are prepared to pick up the cost. The cost for these teams is significantly higher in the first months than for the ones who have been through this maturing phase. We don't have hard evidence right now, but we will get more evidence over the years to come, I would guess.

Daniela Cruzes:

When we talk about costs, we mean that the number of bugs, and the severity of the bugs, found in these teams are much higher than in the teams that have gone through the program and tried to find these security problems before going to production or into the bug bounty program.

Espen Johansen:

Remember, Daniela, we had some talks earlier - we had some challengers to this shift-left orientation, who said: just drop all the static analysis, dynamic testing and all that stuff, and just hurl everything at bug bounty immediately, because the actual cost for the risk might be lower. And my observation so far is: hell no, it will be much higher. The qualifying things that we do in the beginning - the security self-assessment, the static analysis and dynamic testing, the cyber threat intelligence, the penetration testing, all that stuff - prepare them for the dynamic battlefield they are about to enter. So I still strongly believe the data supports us that the SDLC should be oriented towards shift left. But I haven't experimented enough to be conclusive yet. Daniela, it would be fun to see if we could just take a couple of teams directly into the bug bounty, scrap all the other activities, just pour them in and see what happens. But I'm not sure if I have the money to do that.

Daniela Cruzes:

And it's not only the immediate effect of this. I think what Visma also gains by exposing the teams to so many security activities and arenas is that, thinking long term, they will always perform well, right? The more they know, the fewer mistakes they are going to make. And if you just expose them to a bug bounty, how much do they learn from that, right? And how much do they learn about preventing these things from happening? I don't know.

Robby Peralta:

So let's say you're an IT manager in charge of security, or you're head of a product, and you don't have a bug bounty program, you're not there yet. And you can only choose one or two things to do. Would it be static analysis? Or would it be training your developers? What would your first focus be? Would you start all the way left and say: okay, I'm not going to scan any of our code, we're just going to train our developers to write secure code? Or would you actually put in static analysis and dynamic analysis and go for tools? If you had to choose just one or two things.

Espen Johansen:

I can at least consider some of that. In the use case you're describing, Robby, you're discussing someone who is going to the cloud - let's say Azure or Amazon - and they are a slim, small company with four to five developers or thereabouts. So why not just use the tech stack that is available cloud native? Both Amazon and Azure actually have static analysis and dynamic testing, they have all of these features built in. If they develop their code in GitHub, they have all the bells and whistles there already. I don't see why they shouldn't use it, because most of it is really, really low cost. But it also depends on the market. Some customers demand that you do an external third-party validation, and they have quite strict requirements for what that is. Whenever some of my teams ask me this question - and I get it several times a week - it is: I need to do a pen test, because some customer needs a report that shows we have done this, so who should I contact? I've learned that every country has its own best practice. There are a couple of companies in Norway, mnemonic being one of them, that I recommend, and there are a couple of companies in Finland, Sweden, Denmark, the Netherlands, Lithuania and all the others; every country seems to have firms that give credibility. But the one piece of advice I give everybody is: never, under any circumstance, let them only test the surface. Give them credentials, let them crawl on the inside. Because if you just do a surface test, the only thing you're testing in the cloud is the outer hard shell of Amazon and Azure. Why bother? These guys know what they're doing. What you actually want to test is the inside - the code, your own logic.

And that means you have to open up that perspective. A really free thing that you can do is just write the words responsible disclosure on your webpage. Just do that. Feel free to copy the wording that we have used at Visma; it's out there, just search for responsible disclosure and Visma, you can find our policy - copy it and rename it, I don't mind. But please replace our PGP key, and don't use my email address and such. That will give you at least some external validation, and you can get some young, aspiring people who would like to train their skills and test their abilities; they will offer their time and give you lots of good feedback. And if you choose some pen testers, if you want to be validated in whatever you do, let them in - don't just let them test the outside.
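The free step Espen recommends - publishing a responsible disclosure statement - is nowadays often done with a `security.txt` file (RFC 9116) served at `/.well-known/security.txt`. The sketch below uses placeholder addresses and URLs only; replace every value with your own, in particular the PGP key, as Espen warns:

```
# Placeholder responsible-disclosure file in RFC 9116 format,
# served at https://example.com/.well-known/security.txt
# Replace all values below; publish your own PGP key, not someone else's.
Contact: mailto:security@example.com
Expires: 2026-12-31T23:00:00.000Z
Encryption: https://example.com/pgp-key.txt
Policy: https://example.com/responsible-disclosure
Preferred-Languages: en
```

The `Contact` and `Expires` fields are the only required ones; the rest point researchers at your policy and key.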

Daniela Cruzes:

Yeah, to me it goes back again to the question you had about what is good enough security. I think a lot of that has to be based on the risks that you think you are facing, and for each company it will be different. So we have to think: okay, what are the risks my product is facing, or what risks will it face in one year? And then, what is the most important thing I have to do? If the main risk you find is that you are going to have problems with GDPR, for example, and you have a lot of sensitive data, then if you don't have static analysis tools running to cover at least the OWASP Top 10 issues, you are going to have a big problem, and that is a big risk. But if you don't have much sensitive data, and your team already covers the OWASP Top 10 quite well - because the majority of them are educated enough about it to know not to introduce those issues - then maybe you want to focus on penetration testing instead. That is going to be better for your team, to learn where they are not good enough, right?

Espen Johansen:

Yeah, and I would add one more thing: the actual perceived threat against you. I believe that's a factor of your clients. So if you're developing an application to manage hairdressing appointments, and you're planning on selling it to normal hairdressing salons, you probably will not have the same kind of risk from hackers from American or Russian or Chinese intelligence as you would if your application is being developed to be used, for instance, in a regional app for tracking contagion, stuff like that. So depending on your use case, and the customers and the data you will have inside, you just have to figure out what's best for you. If you feel that you're really in harm's way, and you have to expect visits from some nation states or some rascals, as they call them, then you should do the entire stack, the full monty. But if it is only script kiddies that are going to take you on, then you can do less of it.

Daniela Cruzes:

And this also demystifies perfection: we have to be aware that the risk changes all the time. So it cannot be: oh, today I evaluated it and it was fine, we don't need to do anything. That doesn't mean that in three months this is not going to change. So this risk evaluation should be done all the time, you know.

Espen Johansen:

I completely agree, Daniela. And I think I mentioned this example before - this voyage of discovery that one of our teams had. It's in a different nation state than Norway, and I will not mention the product. But they had this notion that nobody was interested in their product, because all it did was travel bills; that was a very simple job. But then they found out that one of the clients was actually the head of a nation state - and a former head of that nation state - who used the application to do his or her travel bills while doing peace talks in the Middle East. They suddenly realized that they had a different set of threat actors attacking them, and suddenly all those weird logs with those really unexpected commands made sense. Because of course that would interfere with the interests of the Israelis, the Americans, the Russians, the Iranians, everybody. So you have to know who your customer is, and why they are using your system. That can be tricky. So yeah.

Robby Peralta:

And that's really hard. It's not like you get a text message: hey, by the way, you're interesting now for nation state threat actors. So do you just set a date and go through these things, or how do you manage that?

Espen Johansen:

It is actually quite tricky. But part of it is to have awareness campaigns among salespeople, and to bring that information back to the team. So it's basically letting the development teams, or the DevOps teams, be part of the sales process, exposing them to clients. We call it the Trust Center, or the level three talk. That is one of the methodologies we use: trying to get the development team engaged with the end client, getting to know that there are actual customers there - they have names, they are people, and they do things with your product. The other part of the solution is to have decent threat intelligence systems in place, just to understand what the usual suspects out there do, their current methodologies, what they are after, and then watch for big changes in that picture. I think the best example from recent history is the emergence of this malware package from Russian intelligence that created a new attack vector against Linux-based systems. Most people had been feeling quite safe using Linux-based systems for a while, and then suddenly this emerged. It was disclosed by an American governmental effort, I think, that carefully released the information, and that basically spawned a lot of activity among the ones who might be vulnerable. Just by knowing who the actors are and how they act, you can pick some of that up. But essentially: know who your customers are, talk to sales, become accustomed to doing that.

Daniela Cruzes:

Yeah, and for small companies - ones that don't have all the tools that Visma has available - we try to focus quite a lot on doing the threat landscape quite often. So just having this discussion again: what is our threat landscape now? How does it look? Who is interested in anything about us? It can be your data, it can be the reputation of your company, it can be that you are changing markets and there will be new threats related to those new markets, right? For all these discussions, you can just set up meetings and do at least that, and if you have security experts, or people who are more interested in security, they will be able to help at least a little bit with that.

Robby Peralta:

But hey, you two put your heads together and wrote a paper called Building an Ambidextrous Software Security Initiative. Who wants to explain that one to me?

Daniela Cruzes:

Yeah, actually, ambidexterity is just a fancy name for what this model is doing. Our main goal was to try to explain, to model, what Espen was doing in Visma, which I think was successful. So it's basically this top-down and bottom-up approach, and that's why it is ambidextrous. And we tried to model it in four E's, that is, enabling and...

Espen Johansen:

Ensuring,

Daniela Cruzes:

ensuring and

Espen Johansen:

Embedding, and empowering, yes. Please, Robby, you have to give us the benefit of the doubt here: it is empowering, enabling, embedding and ensuring. Come on.

Daniela Cruzes:

Espen has been talking much more about this than I have, so I sometimes forget one of the E's before I'm done. But the main idea is: how do you create activities that are top-down, but also bottom-up? The main thing with self-management again - something that has to be very intrinsic in the ambidexterity of the program - is that even though we want the teams to be empowered, we want the teams to know what to do, sometimes they also need some top-down approach. They also need to know: okay, what is the most important thing for us to start doing right now? And sometimes this has to be set top-down, because we have to tell them: this is what we think is going to give the best benefits for Visma. For example, in Visma you have almost 300 teams now, right? Let's say each one of those teams decided to use a totally different static analysis tool. Imagine how hard that would be to maintain, to do cost evaluation, or any type of evaluation of how you're doing. So there should be some balance between top-down and bottom-up. Because we can also not tell the teams: do it this way, and everything you do is done this way - because then where does the self-management of the teams go, right? Then they're just going to be following recipes, and that's not what we wanted. So in this process of creating a way to get to good enough security, we also wanted them to have the flexibility to say: okay, but in our team this is how we want to do it, this is what we think is most effective for security, and this is where we want to bend the rules you are setting for us a little bit. That is what we try to do as much as we can.

And of course, in Visma, for example, top-down management cannot know specifically what the problems are in each one of the products. The teams are the best ones to know: okay, these things we are doing are good, but not good enough, we need to do better, because for our product this is the best thing to do.

Robby Peralta:

Hmm. Can you give us some examples, Espen, of how you actually did that in practice? Some food for thought for the listeners.

Espen Johansen:

Yeah, so the four E's - empowering, enabling, embedding and ensuring - have different connotations to them. So the empowering part: when you are a software developer, you're part of a team, there are five or ten of you, and you develop this really cool app that is really important for the company. It is only the people on this team who will be able to fix any software problem; they are the only ones who can fix the bug themselves. So if I don't empower them, I can take away their power if I want to. I can put them in a program that is ISO 27001 certified, and I can make them hand over their product to some kind of operations team down the line. I can do all kinds of gateway activities, just to relieve them of power. But if I flip this all around and say: okay, you as a team, we acknowledge the fact that you're the only ones who can fix the problem when the shit hits the fan - that means I have to ensure that they have the methodologies in place. I have to enable them by giving them training, tools and methodology. I have to embed systems that ensure the empowering is happening. The embedding can, for instance, be having quality management systems, having an ISMS in place, that reflect that the power is actually with the developer. And when you do this in practice, I think one of the simplest things to empower them is to let them meet the actual source of power. A developer who meets the customer - that is ultimate power. Skip sales, skip marketing, skip everything; the developer meets the client. And one of the many good things in that interaction is that the client is able to give security requirements directly to the developer; it doesn't have to go through seven translation layers. And this really does miracles for the autonomous behavior, the autonomous functioning, of the team; they become more self-managed, they understand more things.

But all of this comes at some cost, I would say, because this self-management drives autonomous behavior patterns that pull them in different directions; they want to experiment a bit more at the lower levels. So you have to be really interested, you have to like the fact that they will be challenging you. It's kind of like raising kids: everybody will be different, and you have to acknowledge them for their differences and celebrate them, instead of churning everybody into the same mold. Because they deserve to be different. So I like this model, this beautiful combination of top-down and bottom-up; it really resonates, at least in Visma, especially since we acquire lots of companies - that's one of the things we do. So we have to treat them differently, and we have to acknowledge that they have survived for so many years. We don't want to change them, we don't want to merge them into some kind of porridge when we buy them. We want them to retain their independence. So it makes sense in our use case.

Robby Peralta:

I hope the customers are nicer to the developers than they are to the sales guys. But uh,

Espen Johansen:

Oh yes, they are. And some of the developers find this meeting quite scary the first time. But once they understand that this is actually about a transformation, it works. The customer always has the power; it is never the managing director or the chairman of the board. It is the customer who has the ultimate power in any private enterprise, and the ones who have connections with the customers have the actual power. So that is where I see the actual power transfer happening. But we also empower them in the normal job, which is basically how you set thresholds for what is good and what is bad. We are security experts, so we try to help them, assisting them in setting these thresholds. But the ultimate decisions always have to happen as far down as possible, as close to the core of the company as possible, which is the developers. And if everything has been done right, the DevOps team will also have the needed competency on networks and all the normal bits and bobs of the old IT security industry. So I would actually agree with Daniela: this Venus and Mars analogy is beautiful, because it pinpoints some of the things that we see. The classic parts of your industry, where it came from, I don't see that anywhere in my field, apart from some companies like mnemonic and a couple of others that are actually in that field today. The rest are in long-tail environments, still selling old firewalls like they did 20 years ago. Time to move on, guys.

Daniela Cruzes:

And one thing that came to my mind again when Espen was talking is that even though we have the four E's, there is something behind all of this without which it will not work, and that is trust. When Espen says, okay, the teams are the ones that decide, you have to trust that they are the best ones to take those decisions. And how can you trust that? Then you have to enable them, you have to give them all the training that they need, make sure that they have all the awareness that they need. They know what the risks are, they know what risks they are facing, and they will take the best decision when it needs to be taken. So trust is very important. And then transparency is something that we talk a lot about in Visma, that we should be transparent both top down and bottom up, from the teams to management and from management to the teams.

Espen Johansen:

That's been a surprisingly easy battle to fight, actually. I presumed that transparency would be an issue, but it wasn't. It looks like when you talk with intelligent people, they actually understand the reasoning behind it, and it really was the simplest of all the tasks to get them to be transparent. The more difficult part was to make them take responsibility and really own up to their own responsibilities. There are still some that kind of just want me to throw some kind of certificate on them and say that we're good enough now, can we go back to rest? But hell no, I will not do that. The element of compliance orientation in DevOps, people realise it doesn't work, this just wanting a certificate to show to the clients. It's just smoke and mirrors in my book. Certificates can add value if they're implemented with a cause, but on their own, certificates don't amount to much.

Robby Peralta:

The more you think you know, the less you actually know, or something like that. Yeah. Hey, last question, and yeah, I'm pretty mean today, but I'm gonna do it anyway. Why is there no regulation around software security? You know, there's GDPR for personal data, but in all the new security laws that are coming out, there's nothing that says you need to do this with software security. And I think I have an idea why, but I'm just gonna put that over to you two. Come on.

Daniela Cruzes:

Because we don't know for sure.

Espen Johansen:

Because I think it's basically just hard. Yeah, it's difficult. How do you regulate this stuff? Do you have control regimes in place to check if people have done this right? Basically, to transfer the responsibility from the ones who develop the software to the testers? Yeah, it's difficult.

Daniela Cruzes:

It's also about the thing of being self-managing teams, and this whole compliance-driven mindset that Espen was talking about. And that's one thing you asked about: why do you need to be ambidextrous with a security program, why are we putting so much focus on that? One of the reasons was because Espen and I discussed a lot that we didn't want to be compliance driven. So then why would we create regulations, so that the teams have to be compliance driven? That's not what we want. We want security to just be part of normal life. It's the way to do software; it's not optional. We just write software in a secure way. We think about security the whole time, from the beginning to the end. Like we talked about before, our dream was that security would not even be an issue or a discussion, are we doing security or are we not doing security? You just do software, and software has to be secure. There is no question.

Robby Peralta:

Secure by design now.

Espen Johansen:

I really love that statement, because I've been at it for a while now, and if you look at the Norwegian health law and all these different frameworks that have their own rule sets, there are always some glitches, and then people exploit them. Like in the Norwegian health sector code of conduct, I think it says that you have to have encryption for certain elements, and you have to have certain amounts of separation. Well, is encryption in itself enough, and what kind of encryption do you really need? It is not explicit all the time, and sometimes it's outdated. And who's going to maintain these kinds of standards, and are they sure that they are good enough? So I've learned, instead of forcing people to read PCI DSS or some of the other standards as dictates, to look at them and see them as inspiration, because they are great pieces of work made by people who really care, and they've really put effort into it. See them as inspirational, but don't see them as dictates; you don't have to do all that stuff, because some of it is just moronic. So say you are doing a web application that's going to be hosted on Amazon or Azure, and you're thinking about the three states of the data: it's either in process, in transit, or at rest. How do you ensure that it's encrypted in transit? Well, you figure that out based on your customers. How do you ensure that it's encrypted and really well secured while it is at rest? Well, that's easy; most of the cloud services have very good encryption services available for you. But if you want to do the advanced stuff and go for encryption while it's in process, you have to start to think about things like homomorphic encryption, or microservices deployed in hardware security modules and so on. So you really need to challenge the mindset of the developers, because they are bloody smart.
They are really, really smart, the best in their class, if they're just given the opportunity to deviate a bit from the standard. Just give them a steady hand. But if you only give them PCI DSS, they will do just what it states in PCI DSS and nothing more.
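As a side note for readers, the "three states of data" Espen walks through (in transit, at rest, in process) can be made concrete with a small sketch. The snippet below is a minimal, hypothetical Python illustration of the "in transit" case only: it is not from the episode, and `strict_client_context` is an illustrative name. It shows one way a team might enforce "encrypted in transit" by refusing legacy TLS versions and requiring certificate validation, rather than just ticking a "we use encryption" compliance box.

```python
import ssl

# Minimal sketch (not from the episode): enforce "encrypted in transit"
# by setting a modern TLS floor and keeping certificate validation on.
def strict_client_context() -> ssl.SSLContext:
    ctx = ssl.create_default_context()            # CERT_REQUIRED by default
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse SSLv3 / TLS 1.0 / 1.1
    return ctx

ctx = strict_client_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # server certificates are verified
```

Data at rest would typically lean on the cloud provider's managed encryption services Espen mentions, while in-process protection is where the advanced options like hardware security modules come in.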

Daniela Cruzes:

Maybe they'll find ways to fake that they are doing it.

Espen Johansen:

I've seen so many examples of how to fake the standards; they're so easy to fake. So it's always fun to go in and query them. And we don't want to embarrass them, because they're just doing their jobs. But by letting them be the responsible ones, by empowering them, you're challenging them mentally. And they're like kids: if you give them enough leeway, they might really amaze you with their creativity, and they might develop into something really beautiful that you haven't foreseen yourself.

Robby Peralta:

You're gonna get a bunch of people trying to apply to your company now, Espen. You're gonna get a flood of LinkedIn messages afterwards asking if you need people on your team.

Espen Johansen:

We always do.

Robby Peralta:

You know what, thank you so much for your time. Daniela, I'm expecting 144 more papers from you in the next 12 years. And you are definitely my trusted advisors on this topic moving forward, so I wouldn't be surprised if you get invited back, or invited to speak at an event of ours or something like that. So thank you again for your time.

Espen Johansen:

Enjoyed it. All right.

Robby Peralta:

Take care, talk soon. Well, that's all for today, folks. Thank you for tuning in to the mnemonic security podcast. If you have any concepts or ideas that you'd like us to discuss on future episodes, please feel free to send us an email at podcast@mnemonic.no. Thank you for listening, and we'll see you next time.

When did Visma start with DevOps?
What is "good enough" software security?
Bug bounty - cost of fixing bugs in a system
Where to start with software security?
Why is there no regulation around software security?