
There Is A Disconnect Between Developers And Security Teams


Immersive Labs is helping organizations measure and improve cybersecurity skills across technical and non-technical teams with its Cyber Workforce Optimization platform. In this interview, Sean Wright, Lead Application Security SME at Immersive Labs, discusses how IT security has changed over the last couple of years with the emergence of cloud-native computing. He talks about what organizations are doing wrong and how they should approach security. The discussion was based on a study conducted by the company to better understand the IT security landscape.

Guest: Sean Wright, Lead Application Security SME, Immersive Labs
Company: Immersive Labs
Show: Let’s Talk


Here is a transcript of the discussion, edited by Jack Wallen:

How the security landscape has changed over time

If you look at the way we typically developed applications, two aspects have significantly changed since the days when we built everything in-house. With the advent of open-source software, developers now rely far more heavily on third-party and open-source libraries. I like to view applications as different pieces of code that are glued together to form the application a developer wants. The other thing that has significantly changed is the way applications are developed and released. In the past, we focused on big monolithic releases; you'd have your quarterly release or your annual release. Now, with things like CI/CD, releases sometimes happen every day, at the click of a button. I think those two things have significantly changed the way we approach development.
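To make those two shifts concrete, here is a minimal sketch (my own illustration, not Immersive Labs' tooling) of the kind of dependency-audit step a CI/CD pipeline could run on every push: it scans the open-source libraries an application is "glued together" from and blocks the release if a pinned version appears in a hypothetical advisory list.

```python
# Minimal sketch of a CI dependency-audit step. The advisory data and file names
# are hypothetical placeholders, not real advisories or a real tool.

# Hypothetical advisory data: package name -> versions with known issues.
KNOWN_VULNERABLE = {
    "examplelib": {"1.0.1", "1.0.2"},
    "fakeframework": {"2.3.0"},
}

def parse_requirements(path: str) -> dict[str, str]:
    """Read 'name==version' lines from a pip-style requirements file."""
    pins = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line and not line.startswith("#") and "==" in line:
                name, version = line.split("==", 1)
                pins[name.lower()] = version
    return pins

def audit(pins: dict[str, str]) -> list[str]:
    """Return human-readable findings for any vulnerable pinned versions."""
    return [
        f"{name}=={version} has a known advisory"
        for name, version in pins.items()
        if version in KNOWN_VULNERABLE.get(name, set())
    ]

if __name__ == "__main__":
    problems = audit(parse_requirements("requirements.txt"))
    for p in problems:
        print("FAIL:", p)
    # A pipeline would exit non-zero here so the daily release is blocked.
    raise SystemExit(1 if problems else 0)
```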

What is Immersive Labs all about?

Immersive Labs is about giving people the ability to gain hands-on knowledge and awareness. If you look at the past, the way we traditionally approached training was very cumbersome; we used to do it as PowerPoint slides or presentations. If you look at the way software is developed now, it's continuous, it's lightning-fast. There could be times when you're not even going to be able to get a week off to go do a full week's training. So one of the things about Immersive Labs is that we offer on-demand training. It's not just a set of slides or multiple-choice questions; we're actually getting our users to learn through doing, to find the problem and solve the problem. One of the ways I find I learn best is working through the problem, seeing it for myself, and then fixing it.

Disconnect between AppSec developers and cybersecurity teams

I think it comes down to something I've said before…understanding. If you look at security teams, we are under pressure to make sure that systems are released securely, constantly worried about the company appearing in the media for the wrong reasons. On the flip side, you have development teams constantly under pressure to deliver, so having a security team constantly putting up extra barriers just hinders their progress. It's about trying to see it from both sides, understanding what each team is trying to do, and then working together. Ultimately, both teams are working for the same company, so they should be trying to achieve the same goal for the company. And I think that lack of understanding is the biggest hindrance.

Things getting better or worse?

It's hard when you constantly see things in the media. But if you actually look at where application security is going, and especially at a lot of the newer frameworks, things like cross-site scripting are really difficult to introduce with some of the later frameworks. To me, that is a positive sign. It shows that, hey, we've actually identified these issues that have been around for decades (I mean, cross-site scripting has been around for decades) and we're actually doing something about it. And yes, there's friction. But I've also seen, as you mentioned, things like DevSecOps, and that we're now starting to get the relevant teams engaged. Not entirely, but at least it's a step in the right direction.
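As a quick illustration of why newer frameworks make cross-site scripting harder to introduce, here is a minimal sketch (my example, not one from the interview) contrasting hand-rolled string concatenation with the escape-by-default behaviour modern template engines apply; the standard-library html.escape stands in for that framework behaviour.

```python
# Older, hand-rolled templating concatenates untrusted input straight into HTML,
# while modern frameworks escape output by default so the payload becomes inert text.
from html import escape

user_input = '<script>alert("xss")</script>'  # attacker-controlled value

# Legacy style: the script tag is rendered as markup and would execute in a browser.
unsafe_page = "<p>Hello, " + user_input + "</p>"

# Framework-style default: output encoding neutralises the payload.
safe_page = "<p>Hello, " + escape(user_input) + "</p>"

print(unsafe_page)  # <p>Hello, <script>alert("xss")</script></p>
print(safe_page)    # <p>Hello, &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;</p>
```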

How to change outlook towards security?

I kind of view security like a seatbelt. When you're driving along on the road, it's generally not very useful until you're involved in an accident; then you really want to make sure that you have that seatbelt. Security is no different: until someone attacks you, it's not going to provide any value. But when that does happen, it's going to be the most valuable thing that you could have at that point in time.

Can you also share some insights on the technologies that are out there? Polymorphing is one of many examples, where every instance is unique. If you increase the work for bad actors, you also decrease the incentive they get from compromising systems. Talk about some of the techniques we can use to mitigate these attacks.

As you correctly mentioned, increasing the work that attackers have to do means that they'll either give up or go elsewhere. I really think that a lot of the things we could do to prevent those attacks are well known. Look at many of the recent attacks! I think the Parkland attack was a compromised password. We've seen that time and time again: a lapse in really well-known security hygiene has led to the attack. Things like source code control and access control are all termed security fundamentals. If we had those in place, and they were really nailed down, I view that as how we go a long way toward preventing the attack. One thing may fail, but then the attacker has to overcome another barrier and another barrier, and eventually they just give up and go elsewhere.
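As a rough sketch of that "barrier after barrier" idea (my illustration, with hypothetical placeholder checks, not something described in the interview), access can be granted only when several independent fundamentals agree, so a single stolen password is not enough on its own:

```python
# Minimal defence-in-depth sketch: each check is an independent barrier, so one
# compromised layer (e.g. a leaked password) does not grant access by itself.
# All check functions are hypothetical placeholders, not a real auth library.

def password_ok(user: str, password: str) -> bool:
    """Placeholder for verifying the password against a salted hash store."""
    return (user, password) == ("alice", "correct horse battery staple")

def mfa_ok(user: str, otp_code: str) -> bool:
    """Placeholder for verifying a time-based one-time password."""
    return otp_code == "123456"

def source_allowed(ip_address: str) -> bool:
    """Placeholder for an allowlist or anomaly check on the request origin."""
    return ip_address.startswith("10.")

def authorize(user: str, password: str, otp_code: str, ip_address: str) -> bool:
    """Grant access only if every layer agrees; any single failure blocks entry."""
    return all((
        password_ok(user, password),
        mfa_ok(user, otp_code),
        source_allowed(ip_address),
    ))

# A leaked password on its own still fails at the next barriers:
print(authorize("alice", "correct horse battery staple", "000000", "203.0.113.7"))  # False
```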

There are two facets: one is technology and one is people. You already alluded to that, so let's talk about the cultural, social, and people aspects of security. We tend to focus too much on the power of technology and ignore the human aspect, which is actually more important.

I think one of our biggest assets in terms of security is people. Think about it: who configures the applications, who monitors the applications, who runs the applications, who develops the code, who picks up that USB drive in the parking lot and plugs it into a computer? They are all humans. And we humans, unfortunately, have one big Achilles' heel…we make mistakes, every single one of us. As you mentioned previously, we have to be correct 100% of the time, and attackers just need us to be incorrect once. So humans play probably the biggest role in security, in my opinion.

As companies go through their own internal transformations, roles are also evolving and changing, and the focus is changing. In the early days, we used to say that no matter which business you are in, you have to be a software company. Then it became: no matter what you do, you should have a cloud strategy. And I think soon we'll say that no matter what you do, you should have a cybersecurity strategy and practice. Do you have any playbook, not just for cybersecurity companies but for companies in general, so that they can prepare their teams internally and culturally to be ready, so that we can at least take care of the human aspect?

I think it starts with awareness. You have to relate it to how it's going to affect them on a personal level; this is why people always enjoy personal security awareness courses, because the material affects them individually, in their own personal lives. If they take that on board and improve their personal security, they will carry it through to their professional security. Ultimately, it's all about trying to improve that security culture. That's like 90% of the way, and the rest will just fall into place. You'll have people willing to take on additional training, awareness, and tooling, and maybe even a bit of pain. But cracking that culture is the most important thing.

It also creates stress, because with these attacks you unnecessarily have to stay up at night and on weekends. Imagine the stress teams go through during peak seasons, like the holidays, when everybody's panicking. So these things can be taken care of with, as you said, awareness and communication. Any other key takeaway from the report that we did not cover in this discussion?

I think one of the things that, at least for me, was the most surprising was the disconnect between management and their reports. It's clear from the report that management isn't aware of some of the issues. I trust their reports a bit more because they're closer to the source, so they have a better handle on some of the issues. And I guess that comes back to security culture from the top down: making sure you've tested and know what's going on, as well as reporting back up the chain and having the confidence to report back up the chain.

One more thing: one of the challenges is that, because of security issues, teams are sometimes forced to either delay a deployment or cancel it. Are there cases where they are so pressured that they end up releasing vulnerable or compromised code? Is there anything that should be done to contain that? Is it a serious problem?

I think there are two reasons for this. One is the pressure to release. One comment hit the nail on the head: when will development teams be thanked for dropping a release because they were fixing a major security issue? Never. The other angle is awareness of the actual impact of the vulnerabilities. Take the way we demonstrate cross-site scripting, with the classic <script>alert(...)</script> payload. It's a little pop-up box. So to me, as a developer who doesn't know any better, it looks like I'm just going to annoy my user with pop-up boxes; I don't understand the full impact and how it could lead to credential loss and session hijacking. So I think those two things are really important: creating awareness around the vulnerability, and ensuring teams have the appropriate priority and backing to fix these vulnerabilities.
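To make that "full impact" point concrete, here is a small sketch (my example, using Flask, which the interview does not mention): the same injection point that pops a harmless alert box can also run script that reads the user's session cookie, which is why escaping output and marking session cookies HttpOnly matter far more than the pop-up suggests.

```python
# Minimal sketch: escape untrusted output so injected <script> tags become inert
# text, and set the session cookie HttpOnly so it stays out of reach of any
# script that does slip through (limiting session hijacking). Flask and the
# route/cookie names here are illustrative choices, not from the interview.
from flask import Flask, request, make_response
from markupsafe import escape

app = Flask(__name__)

@app.route("/greet")
def greet():
    name = request.args.get("name", "friend")
    # Escaping neutralises payloads such as <script>alert(1)</script>.
    resp = make_response(f"<p>Hello, {escape(name)}</p>")
    # HttpOnly keeps the cookie invisible to document.cookie; Secure limits it to HTTPS.
    resp.set_cookie("session_id", "demo-value", httponly=True, secure=True)
    return resp

if __name__ == "__main__":
    app.run(debug=False)
```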