Red Team Development and Operations: A Practical Guide


How do we solve this dilemma?



Joe Vest and James Tubberville, Red Team Development and Operations
We can solve it through Red Teaming-based exercises. Red Teaming captures the threat perspective.
Inspired by military philosophy, many industries have discovered the virtue of "Red Teaming" a defensive capability: its effectiveness grows when it is tested under conditions that approximate a real battlefield. Merely studying a threat's tactics is far less useful than experiencing them. Simulated threats build real confidence and muscle memory in network defenders, arming them with better situational awareness of tooling and tactics as well as lessons learned from simulated failure.
Red Teaming may be referred to as threat emulation, threat simulation, adversary emulation, adversary simulation, or some other phrase that expresses a threat-based approach to security testing.
Before we jump too deep into the concepts of red teaming, we must level-set our definitions. A common lexicon is critical to keeping everyone on the same page and maintaining a shared, unbiased base of understanding. The authors of this book have seen misunderstood terms cause severe complications and missed expectations. Concepts will be defined and explained throughout this book.
We begin by defining red teaming.
Red Teaming is the process of using Tactics, Techniques, and Procedures (TTPs) to emulate a real-world threat with the goals of training and measuring the effectiveness of the people, processes, and technology used to defend an environment.
Assumptions, bias, misunderstandings, and disbelief have a significant impact on the security operations of an environment. Red Teams provide formidable, honest assessments of internal practices and security controls by challenging assumptions, disregarding norms, and exposing atrophy and bias. An unbiased analysis using Red Teaming measures the gap between "what is" and "what should be." The application of red teaming provides unbiased ground truth and a deep understanding of security operations as a whole.
Red Teaming epitomizes the practice of attacking problems from an adversarial point of view. This mindset challenges an idea to help prove its worth, identify weaknesses, or identify areas to improve.
Complex systems are designed, developed, and implemented by skilled, trusted professionals. These individuals are well respected in their field and are highly capable of building functional systems. Although these systems are highly functional and capable, the ideas, concepts, and thinking behind them can sometimes be "boxed in," leading to incorrect assumptions about how a system actually operates. People build systems, and people make assumptions about capability, functionality, and security. These assumptions leave flaws of which a threat may take advantage.
Red Teaming provides a means to challenge and test conventional wisdom and thought. A few standard methods to apply Red Teaming scenarios are: