Washington DC, 13 APR 2023
If an anthropologist or sociologist were to describe the US Department of Defense, they would describe a planet,1 populated by multiple competing self-governing nations, each with its own culture, language, customs, traditions, sacred lands, and folk costumes. The history of planet DOD is one of constantly warring nations struggling in a rulerless system, aka a state of anarchy. Eventually this power vacuum was filled by the creation of a Sovereign to rule planet DOD.2 The nations slowly and begrudgingly ceded power, money, and control to the King. While much less than before, there continues to be strife among the DOD nations to this very day.
Learning from great imperial powers, the King sent forth his minions to engage in extensive missionary work among the warring nations and tribes, undermining their pagan ways with new sacred texts, and imposing a common language over local dialects and accents, thereby forging traces of a common identity.
The DoD Dictionary of Military and Associated Terms is the foundational text upon which the holiest of holies, the Joint Publication Series, rests. Together these bibles comprise the new law of the Kingdom. All disputes are resolved by reference to the sacred texts.
It was not until 2013 that an encyclical signed by Pope Martin Dempsey, and I quote, “clarified the distinction between the term ‘red cell’ and ‘red team.’”
Hallowed be thy name, Joint Intelligence 2-0. Until that burst of enlightenment, ignorance, confusion, and chaos stalked the lands. Despite papal intervention, “Red Team” is still a disputed concept in the doctrine of the church. This is in part because it attracts disputatious seers who many believe just get in the way of action.
It is a brave church that encourages and indeed facilitates the questioning of its doctrine. Although, it must be noted, it was a long time coming and kept to strict limits - an example where creation of an organization ≠ innovation. This is in part because the Red Team is cloistered in a roped-off side chapel with very limited seating, and no visitors are allowed.
The Team is Red for a reason. For they are a group of heretics who advocate for the Devil. In doing so, they question the scriptures and oftentimes issue their own interpretations of holy writ. Despite being church-approved devils, and thus quite tame by comparison to those shooting arrows from castles on K Street (let alone Beijing or Moscow), their self-possession frequently rankles the invisible powers of authority. Alas, more often than not, the Devils come to learn that no one is a prophet in their own land.
Yet occasionally, the Gods of War listen. For some are wise enough to know that thinking afresh is much less humiliating than a defeat from a foreseeable surprise.
Here endeth the lesson. Thanks be to JP-2.
RT and Strategic Thinking
We here are interested in Red Teams and their impact on strategic thinking and plans. Their number one task is to prevent surprise. They deliver this by providing a vital review function, a quality control check on policy.
The DoD Dictionary of Military and Associated Terms (2021) declares a Red Team shall be
Red Team — An organizational element comprised of trained and educated members that provide an independent capability to fully explore alternatives in plans and operations in the context of the operational environment and from the perspective of adversaries and others. (JP 2-0)
Joint Publication 2-0 Joint Intelligence 2013 elaborates yet further
Red Teams and Red Cells. Command red teams are organizational elements comprised of trained, educated, and practiced experts that provide the JFC [Joint Force Commander] an independent capability to conduct critical reviews and analysis, explore plans and operations, and analyze adversary capabilities from an alternative perspective. Red teams assist joint operation planning by validating assumptions about the adversary, as well as participating in the wargaming of friendly and adversary COAs.
In contrast, J-2 red cells perform threat emulation.
Other uses of “Red Team” or “Red Cell” Terminology
War Games
Some of the confusion arises from war game terminology, where game play is Blue Cell (the good guys) v Red Cell (the baddies). Decisions are made about the progress of the war by the White Cell - the umpires.3
Cyber
Red cells, teams, or hats refer to penetration testers, so they fit the threat emulation function which, by holy writ, is a red cell activity.
SOF
Red Cell was made famous by the late, great, legendary SEAL Team VI (ST6) founder, Rich Marcinko. Working from the premise that “it takes one to know one,” ST6 used its knowledge of terrorism to become in-house terrorists. Their mission? To conduct real-world penetration testing of US military bases and forces to test their defensive posture against terrorist attacks. It never went well for the defenders. ST6 penetrated nuclear power plants, nuclear submarines, and a range of other military ‘hard targets’ as well as civilian critical infrastructure. They also took hostages and took role play to a whole new level. Some Admirals did not like getting bitch slapped as part of a drill. The best part was that their antics were filmed, so their targets could not deny their defenses were successfully probed. Presented for your entertainment is an hour-long documentary. [Watch it later… keep reading!! ;)]
While it’s not the focus of this article, there is a vital lesson for innovation in ST6’s red cell story. It was very effective at finding and exposing vulnerabilities that needed plugging. In the interests of realism, and because they were basically pirates (every military needs them), the Team got a bit carried away. Their point was that terrorists don’t play by the rules, so neither would they. In MILab’s estimation, that’s fine up to the point of physical assault on people. As you will see, they crossed that line. The unfortunate consequence was that the program was terminated, and valuable realistic testing went away because it was too counter-cultural.
The lesson here is clear: Innovation ruffles feathers. If it doesn’t, you are not doing it right.
Alternatively, as an Admiral, would you rather get bitch slapped or lose a war?
Red Team
MILab wants to highlight the utility of Red Teams as an innovation tool for strategic thinking. It is a tool, not a substitute. Post-9/11, many military commands had, or created, a Red Team. Almost all have since disappeared. This is either an indication that senior leaders think their staffs are sufficiently reflective and comfortable with incisive, robust self-criticism to the point of always being able to anticipate surprise, or it should be a significant concern.
It is vital to note that nothing about Red Teams is settled knowledge.4
They are often assumed to be an intelligence function.5 This is a relic of recent history and J-code silo bias.6 Red Team methodology should be used in any planning activity - strategic plans, logistics, ops and intel, can all benefit from critical review.7 Indeed, the method should be applied in all planning teams, not just Red ones.
Red Teams should be like a gym or a 'test range of the mind' where thinking muscle memory is built around what John Cleese and Robin Skynner call “Open Mode thinking” (OMT).8 That is, a way of thinking where there is freedom to be creative, to think the unthinkable, to propose, challenge, and deliberate ideas, without self-censorship or fear of peer or higher condemnation.9
It is a way of thinking. It is the ‘how’ of critical thinking to get to the toughest question - the ‘why’.10
Typically, the highest payoff for Red Team methodology comes where issues take on deep, multifaceted complexity, where easy solutions are elusive, and the choice is limited to least-worst options. These high-risk environments offer multiple dark corners from which surprise can drop like a wrecking ball at any moment.
Perfect clairvoyance is not the objective. The objective is ideational agility: the ability to think from multiple perspectives, comfort with volatility, uncertainty, complexity, and ambiguity, and a willingness to entertain concepts and analyze solutions on their merits, free from the restrictions of, inter alia, positional authority, careerism, and self-interest.
Merging official definitions, MILab tentatively defines Red Teams as
“An independent capability to conduct critical reviews and analysis of plans and CONOPs, and to fully explore alternatives in support of command decision making with the primary intent of avoiding surprise”
Avoiding Surprise
The first question that should spring to mind is “are you telling me our strategic analysts don’t think critically and fully explore alternatives?”
This is a complicated question. They absolutely should. MILab was established in part because, in our observation, command staffs often do not know ‘how’ to think. This is a leadership issue as much as a staff issue. All commanders and planning team leaders should be taught the best way to create "Open Mode Thinking" in order to squeeze the best out of their teams. This applies to any team. There is not much point in leaving serious, rigorous thought to quality control checks at the end of the process. It should apply all along. Once a decision is made, that is the transition point to the “Closed Mode” for implementation.
Professional Military Education (PME) - known as the Command and Staff, and War, Colleges - was established to teach critical thinking skills as well as staff planning processes.11 These may appear to be contradictory activities, but they are just other labels for open and closed thinking modes. Poorly led and/or misunderstood, military planning models can turn into an unimaginative color-by-numbers, rote assembly line. There has been a big push over the last decade to introduce "design" thinking into the world of military plans in an attempt to avoid reductionism. It has certainly helped planners appreciate the VUCA world and the usual lack of black or white options.
PME is a huge topic in its own right; suffice to observe for now that its performance is generally good. The problem lies with the receiving staffs, which too often have an insular culture that rejects the "new ideas" taught in "the school house".12 The refrain "that is not how we do it here" is everywhere, making officers wonder why they bothered with PME (beyond the fact it is mandatory). On arrival at a staff, they apply the "Open Mode Thinking" (OMT) they learned in PME only to be shot down by "Closed Mode Thinking" (CMT). As we will discover when we dive into the Cleese and Skynner system, a mix of both is needed, and Red Teams are a vehicle to get there. A major advance would occur if all staffs adopted the system, so that the Red Team would quality control the products of advanced thinkers.
Biases in Planning
Psychology also has some insights into this question. There are a range of cognitive biases and logical fallacies that can inadvertently lead analysts and generals astray. Bias will always be with us. The best we can do is attempt to be conscious of them.
So in the most basic sense, Red Teams are a quality control applied to strategic planning.
There are literally hundreds of biases.13 Because strategy demands prioritization, MILab is not so much interested in an exhaustive listing, but will draw on our experience on command staffs and in war college graduate seminars to touch on some of the most pervasive of the species.
Military Biases
“The only phrase I’ve ever disliked is, ‘Why, we’ve always done it that way.’” RDML Grace Hopper, USN, demolishes the “appeal to tradition” bias, one of the most prolific and damaging biases in strategic thinking. This example was given in the technology ≠ innovation article. Equally, possibilism, outlined in the solution to innovation article, is a rejection of both optimism and pessimism biases (and associated catastrophic thinking).
Appeal to tradition - “that’s how we did it last year”
Status quo - preference against any change
Authority - trust of, and swayed by, seniors
Metrics and fake science - obsession with measurement and misapplication of engineering processes to human political activity and thought
Anchoring - over-reliance on the first piece of information that comes to hand when making a decision
Confirmation bias - we tend to find and remember information that confirms our perceptions or beliefs (as opposed to facts)
Anecdote - using an isolated experience as compelling and worthwhile evidence
“Outsized operator bias” - this is often a powerful subset of an anecdotal bias where the individual’s specific tactical experience is allowed to affect the analysis of larger strategic or operational problems.14
Currency bias - experts censor themselves or others because they have not seen the latest intel or have not been on the ground in 90 days (seemingly the average acceptable period of currency). This can be a real problem sometimes, but more often than not the issue is an inability to put things in context, not currency.
Availability heuristic - reliance on immediate examples that come to mind when making judgments
Framing - drawing different conclusions from the same information depending on how it’s presented (this can be useful if used knowingly)
False consensus - We believe more people agree with us than is the case
Group think - Due to a desire for conformity and harmony, we make irrational decisions to minimize conflict
Correlation and causation - Deducing cause based on correlation
Law of triviality - disproportionate weight to trivial issues, avoiding more complex issues
Input bias - bureaucracies are much better at measuring what they spend and the actions they take (input) versus the effectiveness of that spending and those actions (output). Finding the line of effectiveness is the challenge.
Paralysis by analysis - whenever the evidence points to an unwelcome outcome, more study is called for in substitution for a decision. Also known as kicking the can down the road. This is a Pentagon classic.
In military decision making, the power of these biases is heightened because of the unique context of information availability (denied or overabundant), the speed at which decisions are required, the character of wicked problems, and the inability of human thought and action to conform to easy measurement and prediction.
War is a human and social activity. Not everything that is important is measurable, and much that is measurable is unimportant. Fear, honor, and interests cannot be measured and predicted like the trajectory of a missile. The metrics and fake science bias is the most prevalent and dangerous, and is born from an overemphasis on engineering and an underemphasis on the humanities.15 Many of the other biases on this list spring from this one. See the methodology section of the introduction to MILab’s mission statement.
So having a quality control system to check assumptions, conduct critical reviews, and make sure alternatives are fully explored is extremely worthwhile. Depending on the skills of the original planning team and the Red Team, the activity might be marginally duplicative, but that’s a small price to pay in high-consequence, nationally sensitive decision making. Indeed, after robust analytical quality control, duplication in this context suggests confirmation.
In the next article, the leader of the US Special Operations Command Red Team, who was in charge for over a decade, discusses his Red Team’s tools of the trade.
The biggest employer in the world is Walmart with 2.3m people. This is dwarfed by the US DOD at 3.5m. The number includes active duty, National Guard, IC, government civilians, and a staggering 641,428 DOD government contractors.
Sovereignty passed incrementally via various events, from the 1947 National Security Act [creating the CIA, NSC, JCS, and USAF, and fusing the services into the Department of Defense] to the 1986 Goldwater-Nichols Act [Joint Forces] and other activities. Prior to WWII, the Secretary of the Department of War (Army) was co-equal to the Secretary of the Navy, and both answered independently to the President. From 1947-49, with the addition of the Air Force, the three service secretaries continued to report individually to the President until they were placed under the Secretary of Defense by an amendment of the Act in 1949.
War games can be table top, using computers or board games, or using fielded forces, or a mix - senior leaders in a command bunker directing their forces in 'battle’. In the latter games, the white cell are the invincible peeps with white arm bands wandering the ‘battlefield’ yelling “No Johnny, you are dead, lie down”.
The DOD only created a formal definition in 2013. The fact that discussions about Red Teams get so disputatious is evidence that they work! They are staffed by analysts whose job it is to probe assumptions, question definitions, and provide alternative views.
The intelligence failures behind 9/11 led to the Intelligence Reform and Terrorism Prevention Act of 2004 (IRTPA), which directed the Intelligence Community (IC) to
SEC. 1017. ALTERNATIVE ANALYSIS OF INTELLIGENCE BY THE INTELLIGENCE COMMUNITY.
(a) IN GENERAL.—Not later than 180 days after the effective date of this Act, the Director of National Intelligence shall establish a process and assign an individual or entity the responsibility for ensuring that, as appropriate, elements of the intelligence community conduct alternative analysis (commonly referred to as ‘‘red-team analysis’’) of the information and conclusions in intelligence products.
It is notable that the CIA Red Team was involved in drafting this language.
J code refers to the staff structure of “Joint” US military commands. These codes are mirrored in each HQ; in commands that are not joint, the prefix will be G for ground, A for air, and N for naval. So the N5 is a naval strategy shop, the A5 an air one, and so on.
J1 - Administration
J2 - Intelligence
J3 - Operations
J4 - Logistics
J5 - Strategic Planning
J6 - Command, Control, Communications
J7 - Force Development
J8 - Resources and Assessment
J Code Silo Bias is the tendency to silo activities according to function, relieving other functions of certain activities - like using Red Team techniques.
The Input/Output Bias is big here too. Bureaucracies are much better at measuring what they spend and actions they take (input) versus the effectiveness of spending and actions (output). This is often because so many other variables impact outcomes. Finding the line of effectiveness is the challenge.
So can high-impact/consequence tactical ops. ST6 and most aviation units conduct gloves-off reviews of their plans and after-action reports. For example, senior enlisted leaders have no issue with telling a junior officer their concepts will get the guys killed. This is exactly how it should be in all units, all the way up the chain. But authority bias, groupthink, and careerism almost always have a self-censoring effect the higher you go.
There will be a stand alone post on their thinking in the near future. John Cleese, Creativity in Management: A Speech to an International Audience, Grosvenor House, London, 23rd JAN 1991. See also, Creativity: A Short and Cheerful Guide, Penguin, 2020.
As a war college professor, MILab spent a considerable period of time at the start of the academic year preparing field grade officers for intense but structured and listening-based debate. By definition, not an easy task. First, partisan politics and cable news talking points had to be stripped away. Second, a culture of tolerance and patience had to be introduced - pretty much the exact opposite of point 1 (which is important re national political culture). Third, the noisy and frequent had to be moderated and the shy and quiet brought to life. The only way to do this was by creating a safe space. NOW, before you lose it, I do not mean it in the woke sense. I abhor restriction of debate except when it invokes threats of physical violence or rank bigotry. I cultivated a place where no one was laughed at, and, alternatively, “the stupid question” was valued. By that, I mean when someone asked a question that seemed basic to others, likely because their experience or education was in a different field than most, I would comment on it in a way that suggested it was one of those questions that seems obvious but, once the surface was scratched, might yield surprising insights. Like a child asking “why is the sky blue?”, this was often the outcome. The point being that highly successful, ego-driven military officers had to drop their guard and be intellectually vulnerable.
The best example was a B-52 officer who asked why we didn’t nuke bin Laden in Tora Bora. The class erupted in laughter. After it died down, I asked them why that was so funny. They looked at me like I was a lunatic. But because the professor was asking, they paused for a second and wondered if they had missed something. I asked the officer to elaborate on what he had in mind. He went on to lay out a very good case for using a small tactical nuclear weapon, with tactical to strategic reasoning in its favor. By the end of the session, we had explored everything from how a bomb could be dropped to negate fallout, to how its delivery system threatened no one, to the fact that there was no possibility of nuclear retaliation. It would have killed bin Laden and sent a message to anyone else not to even think about another 9/11. It would have terrified terrorists and their state sponsors, and the past 20 years might have been very different. The point is not to rehash the case, but to show how a comment that made a group laugh because it seemed crazy to them could be turned into a long, thoughtful, serious debate. We all left the room that day seeing things in a whole new way. That would not have been possible had the group not been encouraged to stop quick reactive thinking and engage in slow reflective thinking, trusting that their debate partners would not humiliate them for putting forward unorthodox ideas.
Japanese planners never debated whether they should attack Pearl Harbor. Their debate was limited to ‘how’ and ‘when’. Perhaps had Yamamoto said before the attack, ‘we will wake a sleeping giant; let’s instead attack the places we need for our resources, and not give the Americans an excuse for breaking their powerful isolationism’, things might have gone differently? Or, alternatively, they might have invaded and occupied Hawaii, pushing the Navy back to the mainland.
MILab is comprised of educators and serving officers who collectively have over 50 years of PME teaching experience.
Frequently, MILab had its best and brightest reach out years after school to ask, “why did you teach us to be open minded? This is not what the ‘XYZ’ staff wanted at all.” “The school house” is used derisively in this context to imply an ivory tower that can’t possibly understand the pressures of war that staffs shoulder every day. The anti-intellectualism of “those who can’t do, teach” is part of this mindset and forgets that senior uniformed officers comprise the bulk of PME educators. If the services are not sending their best officers to be instructors, then that’s an important thing to remedy. The only way to get movement on this is to incentivize the best to spend a year or two passing on their wisdom - with performance reports rating officers as both students and teachers. Remember, Alfred Thayer Mahan had his greatest influence upon the USN, at least initially, through his period on the staff at the NWC. He provided an open thinking culture that created the interwar war games that delivered innovations like carrier warfare and amphibious EABO (Pete Ellis was Mahan’s student).
Here are some useful starting points for inquiry.
List of 50 Cognitive biases.
List of 30 logical fallacies.
List of 15 common data fallacies.
Operator bias - whereby the member of a planning team who has real world operational experience is given disproportionate deference compared to those who are “book-trained” or inexperienced. With thanks to “Mike” in the comments section for this recommendation.
Not “social sciences”. This is an incredibly misleading term. America was built on capitalism using STEM - Science, Technology, Engineering and Mathematics - to build railways, cities, mass industrialization, space flight, and so on. The culture of the US is therefore understandably biased toward scientific solutions. However, humans are different from atoms in that, at least some, can think for themselves. Human problems, particularly those of disputed politics, are not soluble by STEM techniques.
War is a political activity. It must make strategic and ethical sense in relationship to attaining a satisfactory political outcome. The job of senior military commanders is to correlate the use of organized violence to achieving political outcomes. These calculations cannot be fed into a computer to get the correct solution for the problem.
Senior leaders will often reluctantly make decisions that address political objectives but have the impact of limiting violence on the battlefield. Such generals are often maligned by subordinates as “politicians”, one of the dirtiest terms of abuse anyone can hurl in a fight.
This is the operational-to-political arc of decision making, where immediate operational imperatives give way to bigger political issues. A great example is General McChrystal's restrictions on air strikes in Afghanistan. Typically, when decisions like these are taken, senior leaders may be criticized by the troops for “becoming a politician” or forcing the troops to fight with “one arm tied behind their backs”, a common refrain during the latter parts of the Vietnam War and especially thereafter.