Unspoken Security
Unspoken Security is a raw and gritty podcast for security professionals who are looking to understand the most important issues related to making the world a safer place, including intelligence-driven security, risks and threats in the digital and physical world, and discussions related to corporate culture, leadership, and how world events impact all of us on and off our keyboards.
In each episode, host AJ Nash engages with a range of industry experts to dissect current trends, share practical insights, and address the blunt truths surrounding all aspects of the security industry.
Unspoken Security
Challenging Assumptions at the Intersection of Cyber & Physical Security (Part 1)
In this episode of Unspoken Security, host A.J. Nash welcomes Ana Aslanishvili and Shawn Abelson from Pine Risk Management. Together, they dive into the often-overlooked intersection of cyber and physical security. With a combined experience of 30 years, Ana and Shawn share their insights on the importance of integrating these two realms to fortify organizational defenses against evolving threats.
The conversation highlights the critical distinctions between penetration testing and red teaming. Ana and Shawn explain how red teaming goes beyond traditional pen testing by adopting an adversary's perspective, aiming to challenge and improve the existing security measures. This approach not only tests the effectiveness of physical and cyber security controls but also enhances the overall resilience of organizations against sophisticated attacks.
The episode sheds light on the synergy between intelligence and security practices. By leveraging threat intelligence, Ana and Shawn illustrate how organizations can anticipate and mitigate potential security breaches. Their expertise underscores the necessity of a holistic security strategy that encompasses both cyber and physical aspects, urging businesses to reassess and strengthen their security posture.
Unspoken Security Ep 12: Challenging Assumptions at the Intersection of Cyber and Physical Security (Part 1)
AJ: [00:00:00] Hello, and welcome to another episode of Unspoken Security, brought to you by ZeroFox, the only unified external cybersecurity platform. I'm your host, AJ Nash. For those who don't know me personally, or are first time listeners, I'm a traditional intelligence guy. I spent nearly 20 years in the intelligence community, both within the U.S. Air Force, and then as a defense contractor, most of that time was spent at NSA. I've been in the private sector for, I don't know, eight, eight and a half years now, primarily building or helping other people build effective intelligence driven security practices. I'm passionate about intelligence, security, public speaking, mentoring, and teaching.
I'm also deeply committed to servant leadership, which is why I completed my master's degree in organizational leadership at Gonzaga University. Go Zags. The goal of this podcast is to bring all of those elements together with some incredible guests and have authentic, unfiltered conversations, even debates, about a wide range of challenging topics. This is not gonna be your typical polished podcast. You might hear or see my dog, although to be honest, [00:01:00] she's actually not in here today, surprisingly. Uh, people might swear here. Uh, I can't speak for the guests today. I think they seem pretty clean cut, but we'll see what we can get them to do. Uh, we may argue or debate.
That's all okay. Listen, think of this podcast as the conversation you might overhear at a bar after a long day at one of the larger cybersecurity conferences, like all the ones we're about to get ready to go to this year. These are the conversations we usually have when nobody's listening. Today, we have a special new twist on the show, as we don't just have one guest.
We have two: the dynamic duo of Ana Aslanishvili and Shawn Abelson from Pine Risk Management. They're experts in security risk management, physical pen testing, social engineering, and a whole lot more, with about 30 years of combined experience. They're also both graduates of the University of Minnesota, which is the flagship university of my home state.
And I wasn't good enough to go there. So, uh, Ana and Shawn, is there anything I left out in terms of your bio that you want to add before we start jumping into the conversation?
Ana: I think you nailed it.
AJ: Honored, like, [00:02:00] did I pronounce your name correctly the first time?
Ana: Actually, shockingly, yes.
AJ: What I'm not going to do it again. I'm sure it's the one and only time. So I'm glad it was recorded. Shawn, yours is a hell of a lot easier. Uh, so listen, the title of today's episode is, uh, challenging assumptions at the intersection of cyber and physical security.
One of my longer titles, I guess, but I thought it was really important. I know we talked about this a little bit in the prep. I think it's a really interesting topic, you know, challenging these assumptions, and specifically that intersection of cyber and physical security, because a lot of organizations I've worked with, uh, or visited, talked to, consulted with, dealt with in any way, they seem to have these areas segregated, right?
Physical is over here. Cyber is over here. Never the two shall meet. Uh, they're, they're different people from different cultures and backgrounds. Uh, so the idea of, you know, how they intersect, and talking about those assumptions, I think is really, really interesting. So, you know, with the two of you guys, I know you've got a great background in this space.
Um, and we talk a lot about, in the, uh, the physical security case, you know, we talk about, [00:03:00] uh, pen testing, right? But physical pen testing, not that technical piece, and red teaming. But I'm curious, what are the foundations of effective red teaming, and for that matter, what's the difference between penetration testing, pen testing, and red teaming, in your minds?
Ana: I can take this one. Um, first of all, thanks for having us. I'm excited to be the first duet on the podcast and just to have a chance to talk to you. Um, as far as the foundations of red teaming go, you know, one of the things that is frequently forgotten is that the red team really is the blue team with a twist.
Um, and what I mean by that is that we all have the same mission. The blue team is there to defend the company, set it up the best they can to protect everything that matters, and the red team has the exact same mission. We just go about it a little bit differently, and people tend to get a little nervous about it, but that's pretty normal, you know.
We come in with an adversarial [00:04:00] hat, and that's what sets us apart. And so what I mean by that, really, is that instead of focusing on one particular aspect of testing, a particular control, or focusing on access controls or, you know, surveillance systems or something else, we are standing in the shoes of an adversary and trying to picture, from their perspective, why are they targeting this place?
How would they go about it? And then, from, you know, our realistic perspective, we try to apply what we know about this adversary and see if the organization's defenses hold up against that. So, whereas our tactics are different, our mission is very much the same, and that's to protect the company.
And so, you know, as we're increasingly facing off with more creative and more capable adversaries, I'm obviously biased, but a red team tends to be the best security, um, team that helps stay [00:05:00] ahead of adversarial tactics. What did I miss out on, Shawn?
AJ: That's really interesting. Shawn, what do you want to add, man?
Shawn: No, I think you hit the nail on the head with each of those points. Um, especially, uh, the difference between red teaming and pen testing. In my mind, you're going to ask a hundred red teamers and get 150 different opinions on it. But, uh, on the physical side especially, pen testing is when you try to break into a building, and red teaming is when you do it
with an adversarial perspective, adopting an adversary's motivations, capabilities, and tactics. A good red team, before they say, "I'm going to break into this building," they zoom out and say: who are the adversaries? What's the threat intel for that? What are their tactics? What's the threat surface?
Uh, and then they determine: is the best way to get access to your most valuable critical information, trade secrets, intellectual property, breaking into the [00:06:00] building? Is it one of a dozen social engineering or other ways to get access to that? And so, in my mind, pen testing is more control focused.
You're testing your access control, your security guards, uh, controls against tailgating, things like that. And in red teaming, you zoom out, and those are part of the tactics a red team uses, if an adversary would use them.
AJ: So it's interesting. And listen, for both of you, I really appreciate, uh, you answering it this way, right? 'Cause these terms are often used interchangeably, and from what you're saying, they really shouldn't be. It sounds like, and I don't want to put words in your mouth, so you tell me, pen testing can be on its own, right?
Anybody can pen test. Just check: does the door lock work? Does the mechanism work right? Uh, whereas, from what you're saying, red teaming requires staying within a boundary of, you know, what we know of the adversary and following their rules of engagement, right? So we're emulating a specific adversary or group. But do you think pen testing is still sort of a subset of that?
Like, if you're going to emulate a [00:07:00] specific organization: hey, we know they act in this way, and this is how they tend to like to break into, you know, buildings, and so we're going to follow that tactic, but then we're still going to test. Obviously, we're doing the penetration testing in the way in which that adversary would be.
So are you saying that pen testing could be a subset of red teaming, essentially?
Shawn: I would be hard pressed to say that, necessarily. I think it is very related. I think most red teams do pen testing as part of it, but pen testing can also stand on its own. Um, if you look at your average red team engagement, you are running pen tests against different controls. It's just a lot more holistic.
So from a scoping perspective, pen tests are pretty tightly scoped. In a red team, your scope is: what would an adversary do, and how can we emulate that, uh, fairly realistically and ethically, against an [00:08:00] organization? And so, yes, I guess I will agree that pen testing is a subset, uh, but it also has the ability to absolutely stand on its own.
Um, there's pen testing teams, there's red teams, uh, and they tend to be kind of separate approaches to doing the same thing, which is testing whether the security controls actually work the way people think they do, challenging assumptions, and then helping the blue team fix any gaps they find, or, equally as important, helping validate the controls that are in place to confirm that they are in fact working.
It's not only about uncovering weaknesses; it's also about assuring that things work as expected.
AJ: Okay. So it sounds like you could... oh, go ahead, Ana. I'm sorry. Go ahead.
Ana: The biggest thing in my mind is that there's a time and a place for each. And so, whereas they're not, you know, mutually interchangeable, they're both very effective tactics. It just depends on what you're trying to get out of the assessment.
If it's specific [00:09:00] information and performance data on one control or a subset of controls, then, you know, one of these solutions might be better for you, um, than just doing a holistic assessment, because it really depends on the readiness of your organization, too.
AJ: Got it. Okay. That makes sense. Right. I mean, it sounds like within red teaming, pen testing may, or almost always will, exist. It sounds like at some point you can do testing without red teaming, though. You can simply just be doing penetration testing. Um, which makes sense, right. So they're cousins, they're related.
They don't always go together. They're certainly not interchangeable, from what you're saying. And red teaming seems to require a different, I won't say more, 'cause I don't want to insult pen testers, but a different kind of research at least, because you're saying it's framed in trying to emulate an adversary.
And, Shawn, I think you mentioned, you know, that there's a connection here to threat intel. And obviously that's going to get my attention, 'cause that's my background, right, as intel. So let's talk a little bit about how that fits together. Right. So how do you guys use intel when you're building [00:10:00] up your plans, your intentions, uh, for red teaming, you know, for working on your whole background and your whole research for doing a red team exercise against a client's facilities, or people, or however you want to set that up?
Shawn: Yeah, uh, excellent question. So for pen testing, like you just said, uh, I would pen test before I red team, in terms of: you are testing the controls. And then when you get to red teaming, it's more involved. It takes more time. It takes more money. It takes more effort. And a lot of that is because of the adversarial component.
And in order to be effective at that, it starts with, involves, and ends with useful threat intelligence. And so, for a red team, when they're prioritizing what they're going to do, what the attack, um, looks like, the attack vector, the attack surface, the threat actor, all of those should come from threat intel for the organization. I [00:11:00] use
threat intel at least four times during the course of an assessment. The first is prioritizing who the adversary is, if you're trying to determine the most likely and impactful adversary. The second is using threat intel to determine what the tactics are. What are the likely tactics? What capabilities have they shown in the past, and what do you suspect
them capable of? And that is what you're emulating. You're not going into a free-for-all where you have to get in at any cost. You want to know: will this person, who I know is targeting my organization, be able to get in, based on what we know about them and what they're good at?
Um, number three is determining what the target is. Their motivation helps to determine if they're trying to embarrass your company, recruit your employees to be insider threats, or steal intellectual property, uh, to give it to a company in their local country, if it's a nation-state actor. And so figuring out what their motivation is and [00:12:00] what they might actually target is the third.
And the fourth is really just tactics. It's the tactics of threat intel, but turned against your own company. And that's where you get some fun perspectives. You take all the intel gathering, reconnaissance, surveillance, scans, whatever goes into really good threat intel, and, you did a good episode on intel requirements and all of that,
you turn it against the company that you are trying to protect, because you want to see that company through the eyes of your adversary. You want to see where the strengths appear to be, the deterrence. You want to see where the weaknesses might be, where they might target. And then you use threat intel to actually go out and do the assessment, and see if an adversary can string a chain of vulnerabilities together in order to actually get into your organization, whether it's cyber or whether it's getting through multiple layers of physical security.
So in any good [00:13:00] full-scale, realistic red team engagement, threat intel is the crux of that.
AJ: Obviously, I do have more great things to say about threat intel.
Ana: Well, you know, when I tell people what my job is, um, you know, if I had a penny for every time someone's asked me if I'm the person dropping in from a helicopter. And, unfortunately, in the years that I've done this, uh, I have yet to find a threat actor whose tactics are: drop on top of the target building from a helicopter and then infiltrate. You know, that's saved for movies, and it doesn't actually work that way. What works very well, however, is intelligence. And there's threat intel for the adversary, and then there's target intel, in terms of what life looks like at what you're trying to protect, right?
And so, more often than not, it's not sexy uniforms and jumping out of helicopters. It's sitting in a surveillance van [00:14:00] for hours and hours and hours, and regretting your career choice, and, you know, you get a couple hours of action and then you're sitting in that car again, taking notes. But that is ultimately what makes you successful in the long term.
But without intelligence, I don't think red teams are...
Shawn: ...anything. Yeah.
AJ: That's definitely not a recruiting case for red teaming, right? And, I mean, I'm not gonna lie, I thought we were gonna do like a Mission Impossible discussion here, and helicopters. And you're saying, no, there's none of that. No leather outfits and, and, uh, lasers. What's that?
Ana: I have outfits.
AJ: Why don't you have a bunch of wigs?
I know we talked about that before the show, so I'm sure you have a lot of disguises, which maybe we'll get a chance to get into. But it doesn't sound like there's a lot of, like, you know, the whole laser beams you have to duck around, trying to steal the big diamond. It didn't sound like there's any of that, you know, with Mission Impossible music playing in the background.
It's mostly just sitting in vans, uh, observing and taking a lot of notes. It sounds like.
Shawn: Yeah, and guns.
AJ: Oh, guns. Well, now you've got my attention.
Shawn: [00:15:00] Guns to avoid,
AJ: Trying to avoid guns, yes. I think that's a good choice.
Shawn: Uh, yeah. It's hard. I mean, we've done a lot of assessments, um, at financial institutions with armed guards or off-duty law enforcement, or directly targeting police stations, being hired by cities or police stations, and obviously those are interesting targets, and you just function differently. I've been lucky, or unlucky, enough to have a police helicopter called on me once, though it was canceled before it arrived, but...
Ana: See, at least someone gets a
Shawn: helicopter.
AJ: A helicopter. You just gotta figure out a way to make it happen, Ana. And you just got to really piss people off, apparently. So, yeah, now out of curiosity, I know this is off the topic from where we are, and we'll get back on the right questions, but do you carry, like, documentation in the event things go really badly?
Do you have, like, a note from who hired you, to say, hey, please don't shoot me. I'm not actually a bad guy trying to break into the police station. Your chief hired me, or whoever the hell hired me. Do you have documentation, [00:16:00] or... Yes? You're both nodding, which is good. Yes.
Shawn: I would argue, based on what I've seen, that we're some of the most risk-averse red teams. We'll still go out and do the full assessment, but we'll do safety briefings with the team and with the client beforehand. We need to have, and I think everyone will have, a version of a letter of authorization, a get-out-of-jail-free card, which is signed by the client, who you've confirmed owns the property, has control over whether you're allowed in there, etc. And there's multiple other methods, depending on the target, depending on the type of engagement. Sometimes law enforcement, like, if you're actually climbing fences and doing things the public might see, uh, and you have a good relationship with the cops locally, we've given them a heads-up that if they get a call, they might want to check with us, or at least be aware that people might have a letter that says to call the owner of the property.
And so there's [00:17:00] lots of safety measures that we put in place. Um, almost all of which are proactive, but a few of which are, uh, based on lessons learned over the years, to keep ourselves and the team safe. So, yeah.
AJ: So you do validate that whoever hired you actually owns the property. And, like, I can't hire you to red team a bank and then go rob the place, I'm guessing. Like, I can't kind of contract you for that, right? I mean, that's good. I don't plan on it, for anybody who's listening, but I was just curious, just making sure.
Ana: Just going back to that intelligence, we're going to...
Shawn: Yes.
AJ: I, yeah, I don't own a large bank, so we're not going to go there.
Shawn: And competitive intel, like, there's none of that. Like, if you're a red teamer, you're good at identifying, gathering information, getting into places. But, uh, there's a very straight line: red teaming is just of your own, you have to have permission, and you're testing to improve your own security. Anything besides that, like the competitive intel side, quickly becomes a crime if you don't have permission to actually break into a property.
So making sure there's all the due diligence needed ahead of time is incredibly important.
Ana: I know I didn't make a, you know, case for selling my profession very well earlier, but
AJ: Keeping the competition down. I got it, Ana. It makes sense.
Ana: I know, I can...
Shawn: ...stop.
Ana: But, um, one of the ways that you can keep the fun in the job is by having all these stops ahead of time.
So you're not finding out the hard way, and not only putting yourself and your team that's on the ground in danger, but also exposing your client to reputational risk, financial loss, right? Anything that would get them the opposite of what they're looking for. And so just doing that, um, mandatory, you know, due diligence, if you will, ahead of time.
And both of us have done it both ways and [00:19:00] clearly have landed on a preferred way, um, to run these things. It's what makes the exercise fun, and also effective, and, uh, leads to the outcomes that we're hoping for. Not that we have preconceived notions about how these things will go, but no one getting shot is kind of a baseline that we start from.
AJ: That's, uh... it seems neither of you has been shot, I'm going to guess, right? Or even shot at, for that matter.
Shawn: You know, I've had firearms, uh, pulled on me very early in my career, and there's lessons that you learn once, uh, and that's one of them. It was safe and controlled, uh, and very quickly de-escalated, but, um, there's multiple changes I've made since to make sure myself, or anyone on my teams, have never faced anything like that.
Um, 'cause it's not a fun feeling. It's not fun when something that you've spent a lot of time controlling the boundaries of, to make sure it doesn't go [00:20:00] from zero to a hundred, that someone doesn't see you breaking into a building and immediately go to kind of a full breach scenario where they need to pull weapons. And so, yeah, I've definitely not been shot, or shot at, but I've been not too far from that, and I think most red teamers will agree that that is way too far.
So, making sure there's a lot of scenario control, and that anyone that detects you, uh, is your best friend right off the bat. And that's part of social engineering. You want to turn that potential escalation into someone that actually, uh, takes you on a tour through the building, shows you the server room.
Um, we've done that, I've done that, uh, 50 times, compared to the one time where, uh, there was no opportunity to de-escalate, uh, until we were confronted. So it really is, uh, important to focus on that, and I have, uh, and [00:21:00] I definitely try to make sure other red teams that we work with have opportunities to learn those lessons by, like, publishing stuff and getting it out there.
You don't want anyone else to learn that the hard way.
AJ: That's, yeah, good advice. Obviously, I don't want to see anybody getting shot. Uh, so it's good to know that you have a lot of controls. Like you said, you've learned, you've been doing this a long time. You mentioned de-escalating, you mentioned making them your friend. I mean, there's really interesting strategies there. You've got to have ice water in your veins,
I'm sure, when somebody blows up your original plan, to be able to pivot very quickly to: oh, I'm glad you're here. Hey, I need your help. And, you know, turn it into a positive. I'm sure that's an art as much as a science, I imagine. Uh, you know, I'm curious, a couple of things came up as you guys were talking. You know, one, you'd mentioned earlier that, you know, the red team is really kind of just the flip side of the blue team. Uh, how does the blue team
feel about that? Do you think, like, is that an accepted norm? Do you have blue teams that are hostile towards red teams, that think, you know, poorly of red teams? Are they happy to hear that there's a red team exercise going on? Do they even find out in advance that there's a red team [00:22:00] exercise going on?
You know, do they see you as the flip side of the coin and as teammates in this, or are there situations where that's not how it's perceived or how it goes?
Shawn: Um,
Ana: I don't know that a universal answer to that exists. In an ideal state, you have a relationship that you've built, where that idea that we have the same mission, we're on the same team, is something that isn't just lip service, and it's something that people actually know and trust and believe, because you've done it, uh, you know, over a certain amount of time and you've built something better together. Uh, it doesn't always work that way, and it's, you know, easy for things like this to kind of start moving in different directions pretty quickly.
But, you know, from the shoes of the blue team, they are always on the defensive, and there's so many things, right? There's so many fixes. There's physical, infosec, cyber, like, all of these different requirements. They have bug fixes and [00:23:00] vulnerabilities that they know about. Pen tests happening all the time.
Right? And so to prioritize this work, um, really has to take that one-team mentality. But at the same time, um, an effective red team shows what matters, right? So if you're looking at a queue of a thousand tasks, we are, if deployed correctly, a tool to prioritize those thousand tasks, and maybe look at, you know, higher-criticality vulnerabilities, or look at some of the systemic things where, if we fix this approach, or if we fix this overall kind of umbrella situation, vulnerability, finding, whatever it is, then some of these will go away, or, you know, become less risky, just because now the bigger issue is fixed. And so, uh, ideally we work together, and that's when both of our teams can be most effective. Um, I think, at the same time, it's [00:24:00] pretty easy to become defensive, and that really will 100 percent depend on the culture of the company, um, and how failure is treated, because in our world, you know, we fail, we try again, we fail, we try again.
The principle there, and the moral, is you don't fail the same way twice, and you learn from it. Right. And so we're huge on lessons learned. And that's what we do. That's what we've done with the teams that we've built and the programs that we've built. That's what we do in the company that we have started, because without those lessons learned, you're just doing the same scenario over and over again,
and kind of finding out, oh, well, here it's different, and here it's like this. But if you do a lessons-learned that applies, then you can apply it to a greater kind of scope of problems and learn in different ways, and we want to share that knowledge. That being said, you know, we are a very [00:25:00] practical, hands-on, operationalized way to laser-focus some of the work that the blue team does. Yeah, we'd like to work together.
Shawn: Like, an ideal scenario is a red team is conducted, and then there's kind of a purple team engagement after, where both... well, the blue team, which I don't know if we've defined.
I think it's generally well known, but the blue team is everyone on the security team that's not the red team. So all the defenders, all the people trying to protect. And so, ideally, there's kind of a purple team engagement that you have after, where you focus more on learning, less on reporting.
Um, there's an opportunity, instead of just dropping a lengthy report, to actually walk through each of the things you did, and they suddenly become a lot more clear and compelling. Um, you can do what we've done. So there's the blue team, and the blue team is always struggling for budget. And so one thing we've [00:26:00] done is bring people from the financial office, from the CFO's office, the people that actually give the security team the budget.
Bring them out on some parts of a red team assessment and say, hey, we know that the security team is asking for more money for this. We're going to test it in real life. This is all real. This is not controlled. You are going to see. And maybe, if we're feeling kind of crazy, we'll actually have them join the red team, be guest red teamers, and try to break into a building.
And if they suddenly see firsthand the gaps that exist, that changes multiple perspectives, whether it's a blue teamer, a guest red teamer, or, uh, someone in the accounting department that does budgeting. Either way, doing the purple team engagement, uh, focusing on education, awareness, empowerment, and telling a compelling story about what the vulnerabilities are,
um, those, instead of the adversarial approach... we like to say we're not real adversaries, we [00:27:00] just play one. And so making sure that you internalize that and communicate that to the blue team leads to much better relationships with those teams. And so, it is a constant... like, the reason we have a lot of ideas and work here is because, oftentimes, people are not open to being tested.
No one wants to be tested, uh, by something that's kind of out of their control. And red teaming is independent, should be independent, in how they approach it. And so, um, it takes some upfront listening, really listening, um, but you can absolutely get to a more collaborative place.
AJ: Do you guys get introduced to them
Shawn: What was that?
AJ: Do you get introduced to them? Like, when you're brought in on a project, is it something where leadership does this and doesn't tell the blue team at all? Do you get an introduction? You know, do they talk about rules of engagement, or is it just sprung on them?
Or is there a mix? I'm going to gamble it's probably the latter. But do you guys ever get a chance to talk to them in advance, build [00:28:00] that relationship, before you go try to break down everything they're doing and make them better?
Shawn: The ideal scenario for me, as a consultant, is kind of a lengthy, like, an ongoing relationship, where you test, you show them in person, they have opportunities to fix things, and you go back and test. And so you're talking to them as people try to figure out, hey, how do I fix this specific vulnerability?
They might reach out when they're trying to find the right hardware, software, etc. Um, and that is an ideal scenario, not just a one-off, but an ongoing relationship. We got an opportunity to build an internal red team, the biggest physical one in Silicon Valley for a while, which was really fun and exciting.
And in that case, you're sitting next to the blue team. Um, you're surrounded by them, frankly. Um, you're, like, five against 300, but, uh, and that's perfect. Like, that's great, because they can walk over to your desk and say, hey, we were looking at this risk that was reported. [00:29:00] It came from your team. Can you show us? Like, let's walk over to a door and show us what this is, so we can be confident, when we're fixing it globally with standards and things, that we know how to fix it. And, like, that is an ideal scenario, either an internal red team or kind of an ongoing relationship, where you do the assessment and then you support the blue team as they try to fix it.
So, um, yes, very rarely will we not get to meet them, uh, where it's just, like, a CFO, or a CFO's office that wants to see how the security money is being spent, will bring us in to assess it and then say, awesome, thanks for the report, we'll take it from here. Um, that almost never happens. It's usually kind of collaborative.
It turns from a red team into a purple team effort after.
Ana: I mean, I have a caveat to that. Um, I agree that it's dynamic work, and the most fruitful it can be is [00:30:00] when everyone's on the same page for what's happening. That being said, as far as the actual assessment goes, for data-gathering purposes there is some healthy level of isolation that needs to happen to keep the performance evaluation realistic.
The goal with this is authenticity of the test, and, you know, we mentioned in passing the importance of independence for our teams. And that's really where it comes in. We should have the freedom to threat model. We should have the freedom to test to the extent that we think an adversary might pursue something.
Obviously with constraints, ethics in place, everything, uh, reasonable, right? But it should be done in a way that it's realistic in its outcomes. And so if you're doing a test at, let's [00:31:00] say, a data center or a, um, you know, a third-party site, for example, and the guards know that there's a test coming tomorrow, you're going to get very different, uh, staffing, first of all, and you're going to get very different treatment as someone who's trying to infiltrate than you would if they know that, you know, every quarter they get tested, or have no expectation of being tested at all.
So, you know, you get different data sets. And just having those different data sets, and also having all of those data sets, for a mature organization makes a really, really big difference.
Because if you have that relationship on a leadership level, they might have a good idea of what some of these trends are, or what some of the kind of boots-on-the-ground are experiencing in terms of, you know, bottlenecks or issues, but there's always a little bit of a disconnect. And so we kind of stress that we want to understand the pain points of the people that actually do the work, and not just the people that make decisions, [00:32:00] because I think,
ultimately, to be effective, both of those points are critical to know, but without each other that's not a complete picture. And the goal for us is to build the picture out as much as possible, because we do kind of this very narrow niche of work that no one else tends to do, or no one else has a full overlap with.
So I would just caveat this really long
Shawn: point. Fully agreed. OPSEC is essential in terms of making sure the test is pure, accurate, realistic.
AJ: No, I mean, that may, and that makes sense. Listen, both, both of your answers made sense there. I appreciate you jumping in on it with the, the clarification, you know, there's, there's a line there, as you said. Um, you know, there has to be, right? If you, if everybody knew when it was coming, Hey, if tomorrow's testing day, yeah, you're gonna get your most senior security people on their highest guard, you know, very vigilant.
Um, that's not, that's not legitimate, right? Uh, you know, you're not gonna get the rookies, you're not gonna get the people who are, uh, taking extra smoke breaks or, you know, just not paying attention. Not to insult people, right? We're not always a hundred percent highly [00:33:00] vigilant, right? They're on their phones, whatever, right?
We're not a hundred percent vigilant all the time. But if you told me tomorrow's the day, yeah, you know, I'm gonna be at my best, right? And I was in the military. If you told me tomorrow was inspection day for my uniform, I'm going to be ironing my uniform tonight and cleaning everything up and being perfect.
I'm going to get a haircut today. I'm going to do all the things I'm supposed to do. If you don't tell me, hopefully I'm within regs. I mean, I'm supposed to be; that's my job if I'm in the military, right? But I'm probably not going to be perfect, right? And that's the goal. So, uh, jumping a bit ahead here. So you talked a bit about this.
There's a lot of cat-and-mouse game in this. There's, there's hopefully some teamwork, as you said, but there's points when there's not, uh, you know, by design, 'cause your job is to get a realistic, you know, uh, assessment, right? So, where do you draw the line on ethics on this? What's ethical? What's unethical in terms of tactics and strategies?
So, you know, I will say, in my world we talk a lot about some of this stuff. For phishing testing, I'll use as an example: you know, what's ethical? What's not? Um, I tend to be, for what it's worth, and I know people have all sorts of [00:34:00] opinions, I'm curious to get yours, I've pretty much said, listen, adversaries don't follow rules,
so I'm probably not going to either. There's an interesting distinction there: what's kind and what's fair are two very different things. Uh, adversaries don't give a damn about fair. Um, so there aren't topics that'll be off limits. Um, so I kind of fall into that same boat, with the caveat that I would, in training,
make sure to constantly train people on things that would never come in email. You're never going to get a pink slip in email, all right? So if somebody sends you a phishing campaign that's talking about, you know, layoffs and you're getting fired, you know it has to be phishing, 'cause we would never do that as a company.
'Cause that's a horrible way to notify people. You wouldn't do that, right? So you do it in training, but then I would use layoffs as a phishing campaign. I don't consider that to be cruel, because I've trained people: we would never do that. We're not a cruel company, right? Uh, but what are your thoughts on ethics?
What's ethical? What's unethical? Where do you draw that line when you talk about your strategies?
Shawn: Yeah, this is subject to endless debate, from my experience. So I, [00:35:00] uh, I don't have the same opinion in terms of how you would approach the phishing scenario specifically. And that's good, right? Anytime I disagree with someone, I like to call it out. It's like, that's good, because there's more than one perspective and we're not just groupthinking each other.
Um,
AJ: Now we get to fight. That's what I said would happen in the opening, right? Now we get to duke it out. All right. You tell me why I'm wrong and I'll tell you why you're
Shawn: is great. I mean, I
AJ: Go ahead. I'm
Shawn: is true. We're, we're probably the closest guests to you so far. And so, uh,
AJ: We're actually not far apart. We could have done this in the same room today. They're only like 20 minutes from my house. So if this gets ugly, you know, if one of them leaves the screen, you better watch: she might show up behind me, right? So yeah, we'll see if my house gets red teamed here shortly, or pen tested. But anyway, yeah, go on.
So you think I'm wrong, which is cool. Like you said, I agree. It's different opinions. So what are your thoughts here?
Shawn: Yeah, I think, ultimately, what's the goal? The goal is that people are not clicking on links that you don't want them [00:36:00] to. Uh, there may be a higher percentage, like, realistically, you can intuit what is going to happen. If you send out a non-cruel, as I would describe it, phishing attempt, where it's just, hey, your performance review is available, click here.
People are going to be interested in that. And that is a good phish. And then when they click it, they probably have to do another training, and that training will show them examples of, hey, here are the more likely ones. We're not willing to send these to you, because there's emotional things that come along with this: bonuses, layoffs, some of the more critical ones. The people that are going to click on the "hey, you got laid off" email are the same ones that are going to click on the "your performance review is available."
And so you still get the same point. You still get to train them on that email. And so my thinking, and I'll back up: I agree, adversaries do not play by the same rules. [00:37:00] They're often, like, inherently unethical, which is what makes them adversaries. Like, they'll use the money, ideology, coercion, ego, whatever it is, to get access to what they want to get access to.
Um, and so my rule is, if you're doing something unethical or that makes someone feel horrible, then I am not being creative enough in terms of getting to my goals. You can always find a different way, right? The brutal way, uh, is often the easy way, from what I've experienced, at least on the physical side.
I'm not talking about phishing as much anymore. Um, but the brutal way is the easy way. And so there's usually ways to reach your goal without causing significant harm or discomfort, things that you have to walk back. So, for me, like, every rule exists for a reason. So over the years I've established the don'ts based on seeing other people do things or hearing horror stories.
I'll say, from the phishing [00:38:00] side especially, our goal of a good relationship with the workforce as a security team is essential. And people that think they got a bonus, then don't get a bonus, and then suddenly have to do extra training... um, that is the opposite of a bonus, doing security awareness and compliance. Um, and I say that as someone that has helped make those trainings. And so, to have them on your team, listen to you, take you seriously, uh, it feels counterproductive to me to isolate them by sending those types of emails out, when you could convince them through less emotional, like, less visceral type emails.
And so on the physical side, my "don't ever do" type rules are, like, don't target specific individuals. Meaning, uh, if you're hired and they ask you, like, hey, can you see if this person's vulnerable to this type of thing? [00:39:00] That's a no-go, right? That goes hand in hand with no political hit jobs. In security, like, there's always turf and politics, budget, and a lot of, um, things along those lines. And so any wind that you might catch of political hit jobs, or targeting a person, is a no-go.
Um, and this is a big one for me: don't get anyone fired. Um, testing isn't to try to rat someone out and get them fired. It's to find ways where something doesn't work. If you get past the security guard because they weren't paying attention, yeah, this happens every time we do an assessment.
If you can get past the security guard who's texting, or even vulnerable to social engineering, um, it's not that security guard's problem. Like, usually it's a system problem. Usually you're paying someone minimum wage to protect a multi-billion-dollar company, or [00:40:00] you have a single point of failure, which is a human point of failure, when in fact it should be multiple layers of security that need to fail for someone to get into your building.
And so, ultimately, um, make sure you don't give names of individual employees or security guards, unless they did a good job, and then you want to give them credit in as many places as you can. Um, and then obviously on my side, I guess, not obviously, but from my experience: no bribery or coercion, right?
You can always pay someone to get into a building, especially if there's 500 security guards, instead of actually, potentially, committing crimes, or entrapping someone, not legally, but just ethically. Uh, you can assume, if someone's getting paid minimum wage, you'll be able to spend a couple grand, part of what you got paid to red team, to get into the building very easily.
And so, to me, that feels [00:41:00] lazy. And you're able to call that out and say, hey, this is something you could probably do, because there's a single point of failure, but instead of paying them, I social engineered, or I tried several additional routes. Um, and so there's always ways around that. Um, the
AJ: So do you just include that, like, in an after-action report as an assumption? Like, we didn't do this, but we just have a general assumption: listen, for enough money, I can buy my way into any facility. So we just have that as a general assumption? Because, I mean, you're saying it's a thing you think would work, but you just don't do it because it's lazy, which I totally agree with.
I admire that; there's certain things that are just lazy. And I've got to rethink my phishing thing just based on what you said. If you're saying that the same profile of a person would answer this one as that, that might be a better way to pivot, as opposed to running up the stress on them. So that's a really good point.
But for a report, do you just have that as an after-action where you say, we made the assumption that if we gave a security guard $5,000, there's a good chance one of them would have said yes, because you only pay them X number of dollars, whatever? Do you even report that, or just leave it out altogether?
Shawn: So, [00:42:00] yes, but not as straightforward. We're not going to automatically put that into a report. We are assumption challengers as a job. We're not going to just make an assumption and throw it in a report. But if Ana is able to talk her way relatively easily into a building, um, and that person doesn't have, uh, technical controls, where in order to get a guest badge they need to badge someone in, they need to put in certain reasons, or get
supervisor permission, et cetera. If there are no technical controls and you have a single point of failure, we describe the fact that you're vulnerable: your security guards are both a strength and a vulnerability, and here's how you can mitigate the vulnerability and exploit the strength. As we talk about the vulnerability, we'll call that out.
We'll say, this is not within our typical scope, but it is within, like, depending on the adversary... some adversaries will do that, some will not. But depending on the threat [00:43:00] intel and the adversary, we might say: this is a tactic we did not use, but the adversary is known to use it, and we believe it would likely succeed because you have a single point of failure. And so that's usually how we would communicate it, uh, with a client, if it's relevant in the situation. Anything to add? Any thoughts on your side?
Ana: I mean, from my side as a social engineer, who typically goes for that type of tactic first, um, what I'll say is that adrenaline is a very powerful teacher. And so you want to reserve that for lessons that you really want to stick. And the fact that your, you know, company fooled you into something is probably less of a desired outcome than
approaching it from a learning perspective for that person, in a sort of [00:44:00] one-on-one situation, right? And so I kind of always joke about this, but I know the name of every receptionist and security guard that I've gotten past over the years, because I always feel really, really, really bad.
Because inherently, what people want to do is help each other out, and I know what buttons to push to make you want to help me. But that doesn't mean you're bad at your job. Uh, maybe it just means you're a good human, which says something about me, but that's separate. What we want to do, though, is stress the fact that this was a failure point, but a failure point, again, that we can learn from.
If you do that, if you educate those people and kind of turn that around and be like, hey, I took advantage of you, and this is what we learned, these are some of the things that also failed you, then that person, who now feels a little bit less ashamed, a little less guilty, is now your [00:45:00] best asset.
They've gone from someone who drones on through the annual security awareness training, like the rest of us
AJ: No, no. I pay attention to every bit of annual security
Ana: Yes, sir, all of us before.
AJ: Yes. I am very interested annually in training.
Ana: Either way, um, they now have a little bit of the personal experience that makes things a little spicier than the annual training tends to. And so they can be your best advocate, and they can be your best proliferation tool for what the best practice actually is. 'Cause
AJ: That's a good point. So you're saying somebody has gone through it and learned; like, that's not a negative, right? We shouldn't hold it against them. You don't look at them with less esteem. Like, they didn't know until they were trained, and now they know. So they're actually a good asset, because they won't fall for it again.
Ana: The critical point being: learn. So I will stress that; I don't think I can ever stress that enough. But as long as you learn from that and you never make that [00:46:00] mistake again, that is the best job I can do as a red teamer. It's like, yes, okay, maybe I was able to hit my objective or not; either way, it's something that the organization gains from
Shawn: it.
If you fire that person and replace them with someone that has not learned that lesson, you're back to square one. But, like, we've both gotten reach-outs from people that we've social engineered, and then immediately, so, like, before you even leave the building, the minute the assessment's over... I have a rule that's like, be honest, even when you're being dishonest. The second that the operation is over, go tell the person who you are, what you did.
Tell them you're not putting their name in the report, except for this one thing that they did, which was really good, which is they challenged you up front. And then I've gotten reach-outs after, like, hey, did you just send someone to try to talk their way into the building? Like, no, but you should call someone, because that's a real bad person, not a fake one.
And so those people [00:47:00] tend to end up being your biggest allies, your biggest assets, after they've gone through it, if you take the time to debrief them, talk to them, ask how they felt. Like, the best and most compelling stories I've put in reports, uh, have been from the security officers at the facilities, or the cleaning people, who just had this sense that something wasn't right.
They might've still let us in, but as soon as we debriefed afterward and talked to them, they're like, here is why I thought something wasn't right, and here's why you were able to talk your way past. And that gives, like, instead of just being like, "I talked my way in," that gives specific information to the blue team, the security awareness team, whoever it is, on how to train and how to have easier reporting.
It's like, "I knew something wasn't right, but I didn't know who to call." That is an easy thing for companies to solve. Um, but without talking and debriefing with those people, being honest and joking and making sure they don't go home feeling crappy [00:48:00] about their day, let alone fired... yeah, let alone fired.
And so, uh, I'll give credit: Jayson E. Street talks a lot about the educational component to red teaming. It's gone beyond just uncovering vulnerabilities. That part is equally as important now.
AJ: So, uh, I want to touch on something. I got a third question we got to get to, but I want to track on this. So, how do you test the untestable, right? There's some things that you just can't... I mean, I look at it and go, well, how am I going to test a, you know, an assassination attempt, or a bombing, or something like that?
You're clearly not going to do that. I mean, you're obviously not going to manufacture a fake bomb either. I mean, that's obviously way more stress than you'd want. So how do you do that? But also, another thing I wanted to touch on, and these will go together, I think, is we had talked before we went on and started recording all this.
I know we had talked about, uh, you had said that red teamers are assumption hunters. Uh, which I think is a really big point. I want to make sure we hit on that, 'cause when you talked about it privately, I thought, [00:49:00] that's really cool. And I think it ties to this. Like, you're obviously not gonna be able to do the bomb things.
So there's some assumptions, but how do you test the untestable? And then talk to me about what it is to be an assumption hunter, 'cause of course the whole show is about, you know, the assumptions. And then we'll get into the third question we have still.
Shawn: Yeah. Uh, so testing the untestable, that is ideally where red teams shine, right? That's the creativity part. And so you're obviously not going to try to kidnap or assassinate a CEO, but you sit down and you try to figure it out, and you red team through it: like, who's the adversary? What are they going to do?
Like, they need motive, opportunity, capability. And so figuring out what opportunity might they have: well, they're probably going to do surveillance and look for an opportunity when the person's alone. And then they are going to gather... like, most adversaries, especially in the physical space, go through the same life cycle.
Um, there's opportunistic thieves, and [00:50:00] then there's people that plan things out, which are going to cause serious damage to the company. And so if you're talking about that, you do the surveillance just as they would, right? If you want access, you go out and do that. You figure out, okay, they're probably going to find someone when they're out on a run alone, and they leave their security detail behind.
So you figure out what time they do that, you figure out where they're going to be, and then you park a car there. And if they run past you, you don't tell them that you're there, you don't do anything. You just say, hey, you asked me to do this; I took every step up until, you know, causing anybody harm, or even alarm.
And then I did it. I mean, same with, um, bombs. There's plenty. Like, if you're doing mail screening, there's great companies and vendors that have very realistic, um, devices, or you can build them yourself, if you want to get put on a list. Uh, you can build that and then go through and [00:51:00] mail it.
Ideally, if you're mailing it to yourself, you're going to the loading dock and you're putting it in the incoming packages. You're not sending it off with UPS. But, um, at that point, it goes through the full cycle, and you watch and you see if they catch it. Um, and so you can always test these difficult areas.
You just have to be extra creative. Um, and it's absolutely possible to figure out how to do that. Um, and then the assumption hunter, that was an excellent question. I will keep it short and maybe hand it to you, but ultimately, any security gap is an assumption. You assume an integrator installed something correctly.
You assume a security measure exists, or that it works. Or, very commonly, you assume another team is taking care of it, and they assume you're taking care of it. So, like, we end up matchmaking at companies a lot of the time, saying, hey, these two or three security teams, you should all talk to each other.
'Cause you all think you're securing the thing, you all think the other one's securing it, and nobody is. Um, yep. And so it's really just kind of the Spider-Man meme, uh, in [00:52:00] person. And we do that more often than you could imagine. And so, uh, yeah, ultimately we look for any assumptions that leaders, security managers have.
We document them, and then we try to test and challenge them. Sometimes we validate them, which means they become facts, and you can make much more confident decisions with facts. And sometimes we refute them, and they become gaps, and they can decide. Maybe they're spending money on a security measure that doesn't work, and they can either decide, let's save the money, accept the risk, and move on, and suddenly you have more security budget, or they can figure out how to mitigate it.
And so, um, yeah, ultimately everything we do in red teaming boils down to identifying, testing, and challenging assumptions. Anything on your side to add? No, I agree.
Ana: Um, that's
AJ: it all up, Shawn.[00:53:00]
Shawn: Sorry, I get excited about
AJ: I'm going to ask, I'm asking on the third question
Shawn: Please.
AJ: It's okay. Now, what do you got to add to this one? Ana first, I don't want to cut you off.
Ana: No, I think, um, I think that's a big part of it, right? And, like, part of the assumptions is kind of going back to that comparison we made earlier between pen testing and red teaming, and the fact that, you know, one of them is a little bit more, like, controls-focused, or centered around controls, than the other one. It's because there's an assumption that, hey, you know, if we're trying to protect from unauthorized access, we got to test our turnstiles. And it's like, maybe you got to test your loading dock.
And so there's two kind of points that I want to make to this. One is, no path to do this is necessarily, um, you know, too complicated or too easy. And again, you're going to want to rely on your intelligence to keep things realistic here, but never [00:54:00] underestimate your adversary, right? So that's one big one.
Never underestimate your adversary. Another side of it is: if something can be done simply, don't over-engineer it. And I think that one is a little bit harder to wrap your head around, right? But, you know, if you can just have someone hold the door, don't start picking the lock after they've shut it behind you, right? And, you know, this is coming from someone who kinda knows how to pick locks if I have to, but I'm the one that will go in the lobby and usually get a temporary badge, or at least a tour, out of trying to talk my way into a building. Um, I don't like to put in the physical, uh, hard labor. So, um, so choose the easier ways.
And I understand, uh, you know, some people won't think that speaking your way into, um, or chatting your way into a building is necessarily choosing the easy way. But sometimes it can be, and it again depends on your adversary and depends on what [00:55:00] you're testing. But, um, having a gamut of options, and making sure that you're not thinking, oh, it's nation-state, therefore it must be this most sophisticated thing, or it's, you know, insider threat, and it's this and this... statistically, probably it is, but, you know, don't overlook some of the solutions that may be in front of your face. We're here to kind of ask those questions, to guide in our interviews toward, um, you know, more probable and likely, uh, paths, so that we can give information that, again, fills that picture out, builds that puzzle out,
so whoever is in charge, whoever is making these decisions, can make them from a more informed place.
AJ: I... yeah, I'm just picturing. Sorry, I daydreamed for a minute, because I was picturing how you're social engineering your way in. Like, is it, do you have the pregnancy outfit? Do you do the too-many-packages, can-you-help-me? Like you said, you
Ana: What have you heard?
AJ: Uh, you know, the crying always works. I... listen, [00:56:00] you hit a good point: people generally want to be nice, right, and want to help each other. And those are scenarios. Pregnant woman: almost everybody opens the door, right?
Uh, you know, somebody just frazzled and exhausted: oh my God, I'm just late for a meeting, and crying. Oh, let me help you. You know, there's somebody with too many packages, or doughnuts, right? That's always good. I'm bringing these for the team. Oh, well, allow me, uh, as long as I can have one. Sure you can. Uh, so I'm just picturing all that and going, yeah, that seems a lot easier than picking locks, I guess.
Now I know people who like picking locks who would disagree, but,
Ana: Exactly, yeah, there's a healthy debate on that too. But, you know, I've kind of done all of those scenarios. The pregnancy belly is uncomfortable, um, which kind of gives you a preview of what an actual pregnancy is like, but...
Shawn: You can hide tools in there as well, which is a big bonus, I hear.
Ana: I use it as a pouch.
I use it as a pouch because then I can just reach in and get any tools that I need, because no one's gonna start, you know, checking whether it's a real belly or not. So if some metal is [00:57:00] protruding, it's a little weird, but, you know, what are you going to do, ask the pregnant lady questions? So I've done that.
I've done the super friendly route. I've done the new-employee route: I'm so excited! Oh, no, I'm here on the wrong day? You're kidding me. Well, can I at least take a look at my desk?
AJ: Nice.
Ana: I've resorted to very many less-than-kind ways to get my way, but
AJ: Ana's a
Ana: One thing I would advise any aspiring social engineers is be wary of faking any medical emergencies.
Um,
AJ: they'll call an ambulance, won't they?
Ana: The insulin one was kind of iffy there for a moment, but crutches always work. So,
AJ: Yeah, another good one. Yeah, help with
Ana: Ask me about your social engineering needs.
AJ: Crutches, a good one. So Ana's a professional liar, I gotcha. Good to know. So
Shawn: I will.
AJ: he's just not saying as much
Ana: I'm just mission dedicated, okay? You can look at it how you want to look at
Shawn: it, but
AJ: [00:58:00] nothing wrong. Listen, I don't say that as a negative. Those are awesome stories. That's all good, in my opinion. You're making people better. That's the point, right? So there's nothing wrong with that. And you didn't say a bunch of things that were off limits; those seem like, again, people wanting to help people.
Shawn: and, and the opposite, the opposite of helping, like it's the, the goal should never be, or the outcome should never be like people on crutches are suddenly going to get less help or anyone on crutches is, are inherently suspicious. I mean, uh, there's, there's a second half of this where there are. Processes and protocols like, Hey, let me hold the door for you.
Awesome. You're in the lobby that everyone can come in. I still need to get the same badge or the same information. Like you can still be helpful also talk to people and politely ask them questions. And so, uh, like good security isn't like, yes, no. Like you're either suspicious or you're not. It's Hey, let me like, it's half customer service, half security.
And you're still helping the woman who's pregnant, or the person that's pregnant. You're still helping the [00:59:00] person with crutches. You're doing all of that. But you're still following protocol, so that it doesn't somehow lower the bar for various people when something slightly abnormal, let alone very abnormal, comes up.
And so helping your security officers have very simple SOPs that they still follow when these different situations pop up is essential. Simplicity, like you said on the adversary route, and I'll say on the security side, like simplicity for people that have to deal with a million scenarios a day, whether it's cyber or physical, um, is going to be key, so they can follow the protocols and help the various types of people that might come in.
Sorry, I'm making things harder for you, but, um, that's good. I love it. I'm here for it. That's the goal. So yeah,
AJ: goal. Yeah.
Shawn: The second half, too, is making sure that, uh, yeah, you still have very effective security, helpful security, good customer service, but their goal, their job [01:00:00] is to allow good people in and keep malicious people or adversaries, real or fake, out of the building.
Ana: This feels personal all of a sudden.
AJ: I'd, I'd take it personally. I know that he looked right at you when he
Ana: Yeah.
AJ: for those who are only listening, that was quite a pointed look here, a little side-eye going on there. So, uh, one more question. Uh, how do organizations, you know, get and measure the value of red teaming, right?
How do you decide, how do you decide to spend the money? How do you decide if it's working? Like, how are people measuring the value of your services? You know, what kind of metrics are there? Like, how does that all, how does that all work out?
Ana: That's a great question. Um, I think one of the things that you have to keep in mind is that you're already spending the money. As an organization that tries to protect anything, you're already spending the money. What we are here to do is validate, refute, confirm, [01:01:00] overturn that what you're spending your money on is actually what you should be spending money on.
And as a little subset of that, that the measures you think are in place and performing a certain way are actually there, are actually performing the way that you think they're performing. And not only that, but also kind of orchestrated with the rest of your security systems. And so that's where, again, the red team is kind of set aside, um, from the pen testing component of physical security assessments, right?
Where it looks at the holistic setup. So it looks at everything within a system, um, and whether or not it's coordinated, or kind of goes for that Swiss cheese model. Um, but you don't want to just know, oh, the doors are working or the turnstiles are working, but you can go around them, right? Like, if no one's thought about that question, if no one's thought to actually test them that way, and everyone just walks up with their badge and it works, then yes, the control is actually working.
[01:02:00] No qualms with the control. But if there's a planter, um, on the other side of it, and you can just easily, you know, distract the receptionist or the guard over there and sneak past it, then are your, you know, tens or hundreds or millions of dollars of investment actually working the way that you think they are?
And the answer is not always, and we're here to give you that answer in a way that's data-driven, a little bit more scientific, and differs dramatically from the vendor who sold you this control, right? And that's not to, um, talk badly about vendors, but they come from a specific level of expertise on that control.
They don't necessarily have oversight of how it is integrated with the rest of your existing controls and how they work together at all. And
AJ: That's a good point. I mean, something can work as designed and not be effective, right? Like you said, the turnstile: it works as designed. The badge works, the turnstile turns, whatever. As far as a vendor, you'd say, hey, it worked exactly how we [01:03:00] said it would work. It needs a badge, turnstile won't move without a badge.
It turns, you know, it reads the badge, it works. So it's, it's working exactly as designed. But as you just pointed out, if you can get around it, and if there's a planter or whatever, it's not effective then. It's, it's working as designed, but it's not an effective security measure. Uh, and those are two different things.
As a vendor selling the turnstile, it's not their, it's not their problem if you decide to put it in a place with a planter next to it and a guard that doesn't pay attention, so you can get around it, right? They've done their job. They sold the tool and it does what they said it would do. It's your job, as it turns out in this case, to show that that's still not an effective security measure unless you do these other things.
So it's working properly, but it's not doing what it's supposed to, which is a really, a really interesting point that I don't think a lot of people think about. I hadn't thought about it until you just said it.
Ana: And that is the question that is the backbone of the existence of teams like mine, right? We are there to ask those questions and then push the buttons or, as I've seen this guy do, slide under the turnstiles to see if they detect infrared at a certain level or anything else, right? But if no one's testing it, [01:04:00] your audit team might get as close to it as, as you can think, and they'll come in with a checklist of certain things.
And again, those things might very well be in place and be working as prescribed, but they're not the team that will note the gap, or even know to look for it. And again, that's not the audit team's fault or anyone else's fault. It's just that mindset of, I'm going to get in by any means necessary. And I just have a very different perspective, and that perspective informs asking questions where others just nod their heads.
And they're like, okay, access is protected because there is a turnstile that works well.
AJ: I want to see Shawn, uh,
Ana: half of that question. So I'm going to pass it over to
AJ: to see Shawn slide under turnstiles. I want to see video of that. What are your thoughts, Shawn? I know
Ana: we can put it in show notes
AJ: Yeah, yeah, I definitely want to see that one. I'm going to share that one out, people. Shawn, what are your thoughts on metrics? I know you've been doing this a while.
You know, you and I had breakfast and talked about this not long ago and you said that you've had an evolution in [01:05:00] this thought process. Yeah.
Shawn: Yeah. Uh, so I originally just started with kind of a brute-force, like a myopic view: the more vulnerabilities a red team can report, the better they are. Um, and that may be true depending on the organization, but that falls solidly on the pen testing side. And it also is, is very zoomed in. So when I was helping pull together a first physical red team at a large, like, a tech company, our metrics, uh, which I thought at the time were, were evolved, and, and they're still progressive compared to where I started, it was: what changes, what positive changes and mitigations actually occur after the team goes out. And so that forces the red team to actually work, like, have compelling reports, work with the blue team to fix things, um, et cetera.
And that's great for internal red teams. If you go back a year later and retest and all the vulnerabilities are gone and you have to find new ones, like that is effective. [01:06:00] And if you go back and nothing has changed, the company is wasting money red teaming because there's no action taken. And I say that as a red teamer.
Um, and, and more recently, uh, and mostly through experience, honestly, like, there's three different categories that I would put the benefit in. Like, one is just risk discovery. So that's what I talked about. Like, a red team uses threat intel, identifies vulnerabilities, and they look for the threats targeting your assets.
So all those combined, and you identify, like, the unknown unknown risks. It's one of the only teams to do that. The second one is shifts in perspective, which is just making better decisions. Uh, that's kind of purple teaming, helping the blue team think differently. And then the third is kind of the one that I've really focused my time on, which is education and awareness.
So gamifying security. Like, any team can technically do this; the red team just seems like the one that's [01:07:00] often the most empowered to be hands-on in the field. Um, figuring out, like, the various efforts where you can demonstrate to or teach security officers, or typically, like, security managers, or even the budget team.
I talked about bringing out some of those folks, like bringing them out to actually see what vulnerabilities exist, how they can be exploited, the education component. And then if you want to teach everybody how to, like, have better security, um, we've built Where's Waldo programs, where, like, if you're a company that requires everyone to have a badge on, and it needs to be a company badge, and you can get yelled at if you don't have it, swap out an employee's badge with a badge that just says Waldo with a photo of Waldo, and send out a mass email that says, hey, an employee is walking around with this. If you identify them, let them know, and they'll give you some free company swag.
If you gamify it, suddenly everyone's looking for badges. If you, [01:08:00] if you have sensitive prototypes, then you can't take pictures. Send someone in to take a bunch of pictures, like, over the course of a couple days, obviously. Um, you're not trying to do any of the covert stuff. And as soon as someone reports them, or is like, hey, what are you doing here?
Give them all sorts of swag and appreciation. And suddenly, slowly, that changes culture. People are suddenly excited, like, oh hey, there's a red team that's out doing fun things. Or even a blue team that's decided to go in the field. It doesn't need to be a red team to gamify security awareness. And suddenly, from what I've experienced, like, we'll be getting pings all the time.
Like, hey, we just had an employee that, like, really wants to get the swag, and they saw someone jump the fence and run into the building. Was that you guys? And can you send them swag? I was like, no, that wasn't us, please call the police. Um, but, like, and maybe that's actually happened before, but, but it changes culture.
Like, people get excited when they're like, oh, [01:09:00] yeah, like, no one's going to think a real adversary is going to do stuff. But when they know the red team's out, when they know there's this gamified kind of system for security awareness, everyone wants fun swag or fun rewards. And so suddenly you see your suspicious activity reports, your employee engagement looking at different things, go up.
And so that section of red teaming, it's, it's just generally, like, awareness, education, et cetera. Um, I've seen more return on investment for companies. You're also increasing security just through the awareness component as [01:10:00] well.
Ana: Yeah, the only thing I would add, and this is going to sound straight off a motivational poster in a security department, is that security is everyone's responsibility, but not everyone's trained in it.
And so you can't expect non-security people to know security things, and these, you know, suspicious indicators or, like, specific behaviors that maybe you and I have mastered, but anyone who has a busy day, who's getting a thousand things done, wouldn't even know to keep an eye out for. And so this reduces your threat surface by, you know, whatever percentage, just because the entire demographic, all of your employees, all of a sudden they're keeping an eye out for something, as opposed to just your security guards who, you know, there's obviously limitations within the industry for that, right? But it's also just human, and [01:11:00] this just gets more human eyes on things that other humans do that may or may not be for good. But either way, it's creating awareness; it's introducing certain trends that are more common than others.
And it's also building an opportunity for anyone to have the right resources to address a situation that's a little bit out of the norm in a friendly but firm way. And I think that's what's missing. You know, people are, most people anyway, are inherently conflict-averse. They don't want to be like, hey, you, you know, you have the welder badge.
What are you doing here? Right? No one wants to kind of antagonize anyone else. But if you're, like, friendly, but know your protocols, or know what to say, or have a prompt for the next step. And sometimes it's as simple as literally knowing the number for the security department so they can send somebody.
That's a really easy fix [01:12:00] that you wouldn't even know to put in place if it wasn't for this effort to kind of build the whole company's culture up. And so it's a simple but really engaging way to get everyone on the same page, instead of just this very, uh, select demographic that is high in numbers but doesn't make all the difference, try as we might.
AJ: Well, I think, I mean, that's,
Ana: for the security.
AJ: I mean, I think the big point, right, you made it: yeah, security's everybody's responsibility. We hear it all the time, but not everybody is trained, right? Not everybody's knowledgeable. Not everybody thinks about this the same way inherently. Even if you go to your annual training, you check the boxes and nobody pays attention, let's be honest, right?
That doesn't sink in then. And then they go back to their job. And if their job is, you know, they're the receptionist, or they're in accounting, or they're at the loading dock, or whatever it is, right, they're doing what they do every day. And they're very, very good at their job. This isn't ingrained.
This isn't a culture, as you guys have said a couple of times about, you know, creating this culture of security. And I love the idea [01:13:00] of, you know, the gamification, and making it a positive, and, and making people excited about it, right? They, they want this to happen. They want to participate in it. It's not a bad thing.
It's everybody gets to be better, and you pick up some swag or, I don't know, lunch, or whatever the hell they're giving away for these things, but whatever makes people happy and excited. But I think you really hit on it: that, you know, "security is everybody's responsibility" is easy to say, but you really got to build that culture where it's everybody's thought process, too.
Or, you know, it's just, it's just words, right? I mean, you can say it's everybody's responsibility, but that's unfair if you're only going to follow up with these, you know, weak annual trainings that people click through. It's the same thing every year, and they don't really pay attention, and then they go back to their jobs.
So, all right, listen, we got to close up the show. Everybody knows at this point, right? This is my favorite part. I'm not gonna lie. The name of the show is Unspoken Security. So with that in mind, you got to tell me something you've never told anybody, something that's been unspoken to this point. Now I've got two of you today, which is going to make it extra fun for me.
So I was going to pick on Ana first, but because you just finished that last question, you get a minute to see what Shawn's [01:14:00] going to reveal. Cause I, I'm curious if, if you've even told each other these things. So Shawn, tell, tell everybody listening, all of America and the world. We are around the world, millions of people.
No, I wish that was true, but it's not yet. It's not true yet. But, uh, anyway, the eight people that listen to the show: tell me, tell, tell them something unspoken, right?
Shawn: Yeah, so I started my red teaming, hacking, whatever you want to call it, career thinking I was a little journalist. And this is when I was young, and when, you know, cell phones were, you know, you pulled the antenna out and everything. And suddenly there were new cell phones that maybe had a little screen on the outside, and you didn't have to pull the antenna out.
And I thought that technology was so cool. And the first one that had a camera, I, I hacked it so it could take, like, six pictures a second, and suddenly you had, like, four-second videos you could take and publish. And so, like, I got very into phones for [01:15:00] many years of my nerdy young life. Um, and I started publishing, like, hey, here's the new tech that's coming out, and I would gather all the different little pieces from the internet.
And, um, eventually I found my way into systems and employee portals that maybe I shouldn't have been in. And so I was thinking I was just being a really good young journalist, publishing this exciting information about this brand new phone technology. Like, Google was considering coming out with some type of operating system, and I was one of the first to publish on that, because I read about it somewhere internally. And, and, uh, many years later, I look back and I was like, no, I was, I was just hacking into shit I shouldn't have been in. And, and, uh,
AJ: You were committing felonies is what you were doing. You were, you were stealing IP and releasing it to the world. Well done, Shawn. Good work.
Shawn: So it, it all ended. Yeah, all hunky-dory, until one of the big companies' digital forensics teams showed up at my doorstep and told me to stop. Um, which I did. And, uh, that, that's where we are today. We're not getting into the rest of the story. All good.
AJ: I'm sure the statute of limitations has run out on all this. Don't worry, if anybody's listening and carries a badge, just leave him alone. It was long ago now, and he's a good guy. But so, yeah, as a, as a kid, you were accidentally breaking in and stealing IP and sharing it with the world. That's a... good job.
Well, well, well done, Shawn, your,
Shawn: I just, I was young, dumb, and I loved phones.
AJ: Listen, it's a great story, but what are you gonna do? You were a hacker, and you were doing stuff, and you did it innocently, which a lot of people have. People have gotten in trouble for things where they didn't intend to break the law, or they didn't think about it in that way, right? And I mean, that's, that's the mindset, right?
You, you, you were into something. It was, it was your passion. You were geeky about it and it didn't occur to you that you were crossing lines. Like that's cool. Probably set you up for your career now, as long as you stay out of [01:17:00] pinstripes.
Shawn: Couldn't agree more. I definitely peaked in my hacking, uh, decades ago. So, that's okay.
AJ: I doubt that's true.
Shawn: Yeah, yeah. Starts and ends. But
AJ: All right, Ana, what's your secret? Nobody's heard this before, including Shawn, from what I understand.
Ana: Shawn, um, it's far less exciting, so I kind of wish I had gone first. But for those eight listeners that have stuck around this far, I just want to thank you for being here on this nerd journey. I love it. I say it lovingly. People have taken offense, but, you know, we are one. Um, so my story, um, goes way back to the days of the Soviet Union. But, uh, I, you know, I often will introduce myself as a pioneering woman in red teaming, and partially it's because, uh, there's not a ton of us women doing this type of work, and partially because it's a pun and a callback to where I come from. But, you know, I'm a pioneering woman in the world of red [01:18:00] teaming.
I'm a program builder and I'm a program manager. But what people don't know, including present company, uh, is that my first project that I ran was staging a riot in seventh grade where, um, my favorite teacher
AJ: this is awesome.
Shawn: That tracks. Yep.
AJ: This is better than Shawn's already. You staged a riot? And where were you at this time? I want you to finish the story, but I want people to know, cause I, I know where you grew up.
Ana: Yeah, I, I was born and raised in the Republic of Georgia, in the, the shambling aftermath of the Soviet Union falling apart. And so, you know, there was civil war and unrest, a military coup this week and a civil coup the following. And so there was a lot of unrest. And, um, we kind of, before we started recording, I know we joked about some acquired trauma. But, um, my favorite teacher had left.
And I [01:19:00] found this unfair, uh, because I didn't have a lot to hold on to. And so I just, I remember distinctly making posters and recruiting people to participate. And, um, I got into a lot of trouble. And so that just refined my tactics for the future. But I'm gonna just leave it at that. Cause that was my first, uh, more or less security related project that I ran.
AJ: You staged a riot in the former Soviet Republic of Georgia.
Ana: It wasn't like a weekly occurrence. It's really not that big of a deal, but
AJ: this?
Ana: it was, um, I don't know, actually it started with like 20, some people, but it got very quickly out of control. And my mom did have some choice words for me after she spent some time at the principal's office. And, um, yeah,
AJ: Oh,
Ana: I learned my lesson, but I don't know if that's the moral that they wanted me to take away from it.
Shawn: Yeah.
AJ: Right.
Ana: [01:20:00] Better backstopping, you know, kind of removing myself as the leader of the movement. But yeah,
AJ: how to motivate people, apparently, to a cause, and how to appeal to their emotions, which I'm sure still plays in your favor now. Um,
Ana: I've just always, you know, I've always advocated for education. That's what I can say.
AJ: There you go. That's right. It was, it was all about education. It was all for the children. Like, just say it's for the children. People love that.
Ana: This is a fresh take. This is a routine perspective.
Shawn: I'm glad you landed on the right side of the law. Who said,
Ana: you know, this explains my career so much, right? It's just like, okay, I've always been teetering on that edge, and the dark side pulls me over every once in a while. But at least I've made a career out of it.
Awesome.
AJ: whether they're red team, whether they're blue team, whether they're white hat, black hat, whatever it might be, however people want to label them, there's a mentality there. You know, it's about doing things that are a little outside of the boundaries and trying to, you know, see how far you can go.
Listen, I'm notorious for this: [01:21:00] any sporting event, any major venue, I like to go through the doors that say employees only, you know, the back. I've been known for it; I'll do it at movie theaters too. Uh, I used to work in a stadium. I worked at the Metro, the old Metrodome. I worked there.
So I got used to all the ins and outs of stadiums, and I realized most of those doors, you can just walk right through. Um, and so I've done it. I escaped out of a mall once, uh, I didn't feel like walking the whole mall. It's like, I'm just going to go out this way and kind of walk through, like, an employee area.
And sometimes it works out well, sometimes it doesn't. Uh, last time I was in Vegas, I made the mistake of going the wrong direction out of a casino, through a back door, and I didn't feel like wasting time. And I ended up, like, trapped in a frickin' alley, who knows where the hell I was, and the Uber couldn't find me.
It was a nightmare scenario. So, um, a lot of people in this industry just do that: they break rules and go different directions, not out of malice, but just out of curiosity, and, you know, hey, where does this go? What can we do here? What's the worst thing that can happen? You know, somebody gets you, and you go, oh, I got lost, I didn't mean to, you know.
Now I'm old enough and I've got a few gray hairs, I can actually pretend, just, like, be seen that way, [01:22:00] I suppose. Oh, I, I was looking for the bathroom, you know, whatever, you can make up a story, right? And most security guys will be like, I'll walk you back to where you're supposed to go. It's not like I'm showing up in the locker room and, you know, getting autographs.
Um, so everybody's got one of those, man. And I think your story is a hell of a lot better than mine. And, and despite what you think, Ana, I think your story is a hell of a lot better than Shawn's. Uh, so, like, they're both really cool stories. So listen, I want to thank you guys. I can't thank you enough for being on the show.
Um, you know, you guys do amazing work. I think what you guys do is really interesting stuff. Uh, you know, most people are envious of the career you guys have, myself included; breaking into places for a living is pretty fricking cool. Um, before, you know, before we close it out, just, you know, I can give you all the thank yous, but is there anything you want to say?
Do you want to plug your company at all to the people? Where can they find you? You totally can do that. I want to make sure, you know, people know you guys are awesome now. So how do they hire you? You know, who are you, where can they find you? And then, then I'll just close out the show and be done.
Shawn: Uh, we're both incredibly passionate about red [01:23:00] teaming. And so if you're an aspiring red teamer, or a company that wants to build your own internal team, or a cyber team that wants to do physical, or any, anything in between, and you just want to chat, um, you can email either of us. Uh, you can just do info at pinerisk.com, um, where we try to publish and open-source. And if you want a red teaming template, like, we're, we're in it to professionalize red teaming. We want more people doing it. Uh, doesn't have to be us out doing it. We'll send templates and we'll support you as much as we possibly can.
So feel free to reach out. We're happy to, to help. I'm happy to talk to anyone earlier in their career. Um, and yeah, that's it. Thank you very much, AJ, for having us. I really appreciate
Ana: it. I will just add a quick caveat that if there is someone who'd like me to drop on a building out of a helicopter, I'm here for it.
Ana at Pine Risk, and I'm available for that anytime. So, just, you know, we're [01:24:00] here to solve problems and have chats like this. And thank you for having us. It was great, uh, telling you these stories and our experience.
AJ: Oh, totally. I'll have you guys back. You can tell us more war stories. So I noticed that Shawn's at info at pinerisk.com, very corporate. Ana wants to get dropped out of airplanes, so hers is just ana at pinerisk.com.
Shawn: Yeah. That's
AJ: Shawn is actually also shawn at pinerisk.com, by the way. But
Shawn: sickness. I'm not going to do the helicopter thing, but I'll be, I'll be there.
AJ: I, I'll tell you guys, like, check them out on LinkedIn, uh, Pine Risk. Yeah, it's pretty awesome, and these guys are great, and you'll see the rest of the resumes and all the things they've done. Um, you know, I, I recommend, you know, working with them, cause they're just cool people. So again, I just want to thank you guys one last time for coming on the show.
Really appreciate it. Everybody who's been listening and watching, thank you as well. You know, do all the things to help us, you know, likes and reviews and downloads, and all the... I don't know, click all the buttons to do all the things, right? Just keep, keep doing this and give the feedback, right?
You don't like shit? Let me know. You do like it? Let me know. You got people you think you want on the [01:25:00] show? You want to get rid of me and get me off the show? Let me know. Uh, just, you know, whatever it is, just reach out. You can find me on LinkedIn. You can find me through ZeroFox, AJ at ZeroFox.com. You can DDoS me if you want.
I don't care. So anyway, again, appreciate your time. Appreciate everybody listening and watching. Uh, until next time, you know, that that's it for this episode of Unspoken Security.