What Were You Thinking?
by Ted Thomas and Robert J. Rielly
Ted Thomas is director of the Department of Command and Leadership in the U.S. Army Command
and General Staff College at Fort Leavenworth, Kansas. Thomas graduated from the United
States Military Academy and served in various command and staff positions before retiring.
He received a master’s from the University of Illinois, and a Ph.D. from Missouri University of
Science and Technology.
Robert J. Rielly is a retired Army lieutenant colonel currently serving as an associate
professor in the Department of Command and Leadership, U.S. Army Command and General Staff
College, Fort Leavenworth, Kansas.
Biases and Rational Decision Making
In general, we expect people to think and act rationally. Market theories, negotiations, and other human endeavors are based on people reacting and thinking in sane, rational ways. They rest on the assumption that we are logical and can make good decisions. But are people really that
rational? Dan Ariely, a noted scholar, wrote a book on how we are all “Predictably Irrational.”1
Numerous authors have pointed out how psychological traps, cognitive biases, and world views
cloud our thinking and lead us to irrational choices. Decision making is the realm of the leader. Leaders make decisions, and our assumption is that those decisions are good and rational. However, in the rush to decide, we forget that psychological traps and biases affect leaders just as
they do the rest of us. This article will use the Bay of Pigs invasion as a case study to examine how
these human characteristics often cause us to act in counterproductive ways and what a leader can
do to offset them.
Bay of Pigs Invasion
The 1961 Bay of Pigs invasion provides a fertile example of poor thinking and decision-making.
In 1959 Fidel Castro completed his overthrow of the corrupt Batista government in Cuba. In the
spring of 1960 Castro formally aligned himself with the Soviet Union, establishing a communist
regime. Many of those in Batista’s regime and those who did not want to live in a communist country
left Cuba for the United States.2 In the era of the Cold War, the U.S. did not relish the idea of having
a communist country 90 miles off its coast, much less a nation closely allied with the Soviet Union.
The U.S. began making plans to overthrow Castro during the Eisenhower administration in 1960. President Eisenhower, a former Supreme Allied Commander, five-star general, and hero of World War II, directed the CIA to start planning covert operations to bring down
Castro. Kennedy did not know the planning was going on before the election and had even heavily criticized the Eisenhower administration for its passivity.3 Two days after being sworn in as President, John F. Kennedy was briefed by Richard Bissell, a CIA planner and chief architect of the plan to invade Cuba. Kennedy described Bissell "as the only CIA man he knew well enough to trust."4
Possessing a certain amount of hubris after winning the election, the Kennedy administration proceeded with the strategy. The plan envisioned recruiting and training approximately 1,400 Cuban exiles to conduct an amphibious landing in Cuba and overthrow Castro's regime. Should the invasion fail, the exiles were supposed to escape into the Escambray Mountains and link up with guerrillas there to continue an insurgency against the communist government.5
Since it was supposed to be a secret operation, few people were briefed; even the Joint Chiefs of Staff (JCS) were only marginally read in on the plan. When asked their opinion, the chiefs said it had a "fair chance" of success, which President Kennedy interpreted as a "good chance." In the post-mortem following the failed invasion, the JCS were asked what they had meant and said they thought the operation was three times more likely to fail than to succeed, roughly a 25 percent chance of success. That is not the way President Kennedy interpreted "fair chance."6
As a result of the Bay of Pigs invasion, the Kennedy administration was diplomatically embarrassed, the CIA was discredited, and several of its leaders were fired. The failure also handed a major victory to the Cuban revolution, and to Fidel Castro in particular, while forcing Castro deeper into the Soviet Bloc for support and survival.
This incident set the stage for the showdown
between the United States and the Soviet Union
in the Cuban Missile Crisis, bringing the world
to the edge of nuclear war.7
The question is, how could so many smart
people make so many irrational decisions?
Kennedy’s cabinet was stacked with intellectuals
and experts who had years of government and
corporate experience or who were Harvard
professors and subject matter experts.8 Irving Janis's book Groupthink attributes much of the failure of the operation to groupthink. He defines
groupthink as “a mode of thinking that people
engage in when they are deeply involved in a
cohesive in-group, when the members’ strivings
for unanimity override their motivation to
realistically appraise alternative courses of
action.”9 Groupthink was certainly a major factor
in the poor decision making and lack of critical
thinking evidenced at the Bay of Pigs fiasco.
However, there are other just as insidious threats
to rational decision making evident in this case.
Cognitive Biases
Cognitive biases or hidden traps in thinking
often lead to poor decisions. “People sometimes
confuse cognitive biases with logical fallacies,
but the two are not the same. A logical fallacy
stems from an error in a logical argument,
while a cognitive bias is rooted in thought
processing errors often arising from problems
with memory, attention, attribution, and other
mental mistakes.”10 Logical fallacies come from
poor thinking while cognitive biases are a part
of being human. The problem with these biases
is they become part of how we think and are
therefore invisible to us, causing us to not see
them even as we fall into them.11 Research has
uncovered many cognitive biases. This article
will focus on six of the more common traps:
confirming evidence, sunk cost, framing, status
quo, anchoring, and overconfidence.
The confirming evidence trap leads us to seek out information that confirms our existing point of view and to avoid or discount information that contradicts it.12
President Kennedy wanted plausible deniability of U.S. involvement. Yet Pierre Salinger, the President's press secretary, referred to the plan as "the least covert military operation in history." Even the President read in the newspapers about secret training camps in Guatemala and efforts to recruit Cubans in Miami to fight in the exile forces. Despite the abundance of leaks, the administration did not treat the information as a problem. Instead, it ignored this evidence and clung to the notion that the absence of "direct involvement" by U.S. forces would somehow be enough to convince the world that the U.S. was not involved.13
The sunk cost trap leads us to make current decisions in ways that justify past decisions, regardless of whether the past decision has any bearing on the current issue. To change course might make us look like we made a bad prior decision, and we are often unwilling to admit we made a mistake.14 President Kennedy
and his advisors made a decision two days into
the presidency to back the invasion of Cuba
based on a persuasive briefing by a trusted
expert, Richard Bissell. As evidence started to mount against the decision, the administration did not want to look like it had made a mistake. Bissell, who had put so much emotional energy into planning the invasion, was not able to "see clearly or to judge soundly."15 So much effort and planning had already been sunk into the invasion that it moved inexorably forward.
How a problem is framed influences how
we approach the problem. People tend to accept
the way the problem is given to them without
looking at it from a different perspective or point
of view. For instance, people tend to weigh potential losses more heavily than potential gains, so the way a choice is framed can change how much risk they will accept.16 The CIA framed the Bay of
Pigs invasion in terms of the danger of having a
Soviet satellite 90 miles off the coast of Florida.
With Soviet influence virtually on our borders,
the gain was in terms of the safety and security of
the U.S., as well as the possibility that other Latin
American countries would not follow suit in
becoming communist.17 This strongly influenced
how the administration saw the problem. Had the decision been framed in terms of the consequences of failure and loss, the result might well have been different. The U.S. lost credibility and the trust of nations throughout the world, and it lost security on its borders by forcing a closer alliance between Cuba and the Soviet Union.18
The status quo trap is based on the fact that
people are averse to change and would prefer
the current situation over something new or
different.19 When Kennedy became president, the
planning for the invasion was already well under
way. Rather than change the plan, Kennedy
elected to stick with it and maintain the status
quo.
The anchoring trap leads us to give inordinate credence to the first information we receive and then to compare any new information against that original thought, idea, or data.20 Thus, the first information we receive
“anchors” our thoughts. The first briefing by
Bissell anchored the administration to the idea of
an invasion. Bissell himself had altered the plan from a small-scale covert operation to an invasion in November 1960; the President was not briefed on the invasion plan until two months later, in January 1961. The President and his advisors
never seriously considered other options such as
using diplomatic and economic leverage, a small-scale infiltration of exiles, or even major military
intervention with U.S. forces, because they were
anchored to the exile brigade beach assault and
invasion option.21
The overconfidence trap describes our tendency to be too self-assured about our ability to make decisions and forecast future consequences, which causes us to take greater risks.22 Experts are especially vulnerable to this trap, partly because their expertise convinces them they are right and partly because they want to maintain the appearance of being an expert.23 If they admit they don't know the answer, then they are obviously not much of an expert. After the election in 1960,
there was a sense of euphoria that nothing could stop the new administration from solving the nation's problems and challenges. Kennedy and his advisors were overly optimistic, which left them with little sense of vulnerability about their cause and their ability to win. They viewed the Bay of Pigs plan through the lens that democracy is good, communism is bad, and whatever the U.S. did would be vindicated by the non-communist nations of the world.24
Many of these traps are linked and feed off
each other. Overconfidence often starts with
anchoring. The search for confirming evidence often begins after a prior decision is made, when we look for evidence that confirms the sunk cost or the status quo. The status quo is often due to the sunk cost. Our framing of a problem may start with the anchoring of a suggestion or fact that may or may not be relevant. These six cognitive biases are only a few of the many that exist, but they are among the more prevalent. The real importance of
understanding these thinking traps and biases is
knowing how to deal with them.
Cognitive biases can be particularly common in the military, especially in planning and execution. Both commanders and their staffs can
be vulnerable to the anchoring trap with the first
piece of information they receive. They can view
all subsequent pieces of information through this
lens. In addition, when the commander makes
the decision and the staff begins preparing for
execution, we see confirmation bias when people
tend to ignore any information or intelligence
that contradicts the approved plan. Commanders
and their staffs can fall victim to the sunk cost
trap when they refuse to reframe a problem or
adjust a course of action or decision because of
the time, effort and resources already invested.
Finally, most leaders are not enthusiastic
about change, but change can be necessary.
Commanders and their staffs fall victim to
the status quo trap when they choose to keep doing the same thing despite evidence that it is not working. We often do more of the same and reinforce failure, hoping for a change in the outcome.
Ways to Address our Biases
There are many different ways to address
faulty thinking and cognitive traps. Just knowing
that these traps exist, and that we are all subject
to them, is the first step in overcoming them.
Leaders have to overcome these traps on two levels: first as individuals, and second as members of a collaborative group. At the individual level, a person not only needs to recognize that traps exist but also needs to be proactive in countering them.
Leaders have a responsibility to examine
their thinking and avoid cognitive biases to
the best of their ability. To avoid the anchoring
trap, good leaders purposely seek out those
with different opinions. Leaders should avoid speaking too early and giving their opinion; otherwise, they may anchor those they supervise to their own preconceptions. Leaders should also
think about the situation on their own before
consulting others’ opinions to avoid becoming
anchored themselves.25
To counter the sunk cost trap, leaders should examine how emotionally attached they are to the situation and recognize how that attachment will taint their decision making. They should seek out people who were uninvolved in current or past decisions and who carry no knowledge of the sunk costs. They should also build a climate where people embrace experimentation and failure, and where it is acceptable to own mistakes and fail forward.26
To counter the framing trap, they look at the problem through a different lens or point of view, reframe the question from multiple perspectives, and pose problems neutrally, favoring neither gains nor losses.27 They also examine their current procedures and processes to determine whether those are moving the organization toward its vision.
For the status quo trap, leaders need to
identify other options and compare them to the
status quo to determine if the status quo is the
best option to reach the objective. They should also ask whether the status quo would still be an option if it were not already in place.28
The principal ways to combat the confirming
evidence trap are to examine all information
equally with the same criteria and use red team
techniques (explained below) or designate a
trusted person to play devil’s advocate. Finally,
leaders should avoid asking leading questions to
get the answers they are looking for and instead
ask open-ended questions to explore the situation
and encourage debate.29
Finally, leaders should conduct pre-
mortems and post-mortems as a way to counter
overconfidence. A pre-mortem imagines the project, plan, or organization at a point in the future and asks how it could fail, while a post-mortem looks back after the fact to determine why it did fail. Decision makers should challenge their own judgment, especially when forecasting the results of actions, and they can provide data to support their predictions.30 Leaders drive the process that helps their organizations overcome biases, and that process starts with themselves.
Protecting against traps is not just an
individual responsibility, but also a group
responsibility. Combatting traps in a collaborative
group begins with climate. When leaders set the
proper climate in terms of policies, procedures
and systems to protect against biases, they will
make better collaborative decisions. A few
techniques and methods for leaders to improve
decision making in a collaborative group are red
teaming, diversity, questioning, and establishing
a safe to fail climate.
Red teaming involves establishing a team
to look at the issue from the adversary’s or
opponent’s view point. It is more than just
playing devil’s advocate. It seeks to get in the
mind of the adversary and think the way they
do. Red teams challenge assumptions, look
at “what-if” scenarios, and provide possible
answers to how the opponent would act and react
to different decisions and scenarios. A few of
its goals are to break through cognitive biases,
improve decision making, and avoid surprises.31
Red teaming helps avoid groupthink by taking people out of the group to look at the problem. It also addresses each of the six cognitive traps discussed above.
The red team challenges the evidence and looks
at disconfirming information. They are not
worried about sunk cost or the status quo. They
look at the problem from different points of view
and avoid the framing and anchoring traps. They
are trying to find ways the plan or decision could fail, thereby avoiding the overconfidence trap.
Diversity ensures there are differing opinions
in a group including minority views, dissenting
opinions, and disinterested parties who have not
made a judgment on the problem. Diversity can come from differences in nationality, religion, culture, race, gender, ethnicity, language, age, social status, experience, and political affiliation, to name a few. A diverse
set of viewpoints increases creativity and
innovation32 and helps overcome groupthink,
anchoring, sunk cost, and status quo traps.
Establishing a climate where questions are
encouraged and valued helps people to challenge
assumptions, predispositions, and paradigms
that lead to cognitive biases. Questioning helps
organizations survive and thrive in volatile and
quickly changing environments. Questioning
requires humility and a desire to learn, which
comes from genuinely listening. Understanding the foundations of critical thinking is a great place to start in developing a keener ability to ask the right questions and overcome biases. Questioning facts, assumptions, points of view, paradigms and mental models, purposes, and problems provides key lines of thinking for exposing all of the cognitive biases addressed here.33
Leaders who create a climate where it’s
safe to fail have an organization in which
people are willing to expose their thinking and
reasoning to the group. It means leaders are
eager for feedback to improve their thinking and
processes, especially when things go wrong. In
order to achieve a safe to fail environment, we
need a climate where it’s safe to think and safe
to challenge. A safe to think climate is one in
which people have time to read and think, to
be curious and gain new information. A safe to
challenge climate is one in which people are able
to challenge the organization’s idea of who it is
and what it does, to question its mental models
without fear or threat of reprisal. Safe to fail is
about allowing and taking risks to stay relevant34
and to avoid the cognitive traps of anchoring,
status quo, sunk cost, and framing.
Conclusion
The next major emergency that President
Kennedy faced was the Cuban Missile Crisis.
He learned from his previous fiasco. His
embarrassment and failure in the Bay of
Pigs certainly prevented him from becoming
overconfident in dealing with Soviet nuclear
weapons in Cuba. The administration
continuously examined what could go wrong
and projected what would be the cascading
effects from possible decisions they could make.
President Kennedy widened his circle of trusted
advisors, including people from outside his party
and with divergent views, to help in framing the problem and finding an answer. He created a
special group to come up with solutions and look at different alternatives, which helped prevent anchoring. Nuclear weapons in Cuba were a totally new problem for this administration, but rules of engagement and contingency plans had already been written and could have boxed him into a decision resulting in World War III. He did
not let the sunk cost of those plans and the status
quo they represented constrain his thinking and
decision-making. He learned not to trust the experts blindly, since experts are often narrow in their viewpoints. He also used different experts to counter each other's opinions and avoid the
danger of confirming evidence. In effect, he
learned to counteract his cognitive biases and
avoid groupthink to solve a very complicated problem and avert thermonuclear war.
Our decisions may not have consequences as catastrophic as thermonuclear war, but poor decision making due to faulty logic and cognitive biases can certainly lead to the demise of companies, programs, or people's careers. Our assumptions are heavily influenced by cognitive biases. Understanding our human tendency to fall into these traps is necessary for developing the self-awareness to avoid them. Knowing how to overcome these thinking traps and biases is an invaluable
tool for leaders to have and use. IAJ
NOTES
1 Ariely, Dan, Predictably Irrational, Harper Collins: New York, 2008, pg. xix.
2 Neustadt, Richard E. and Ernest R. May, Thinking in Time, The Free Press: New York, 1986, pg. 141.
3 Neustadt, pg. 141-142.
4 Ibid., pg. 145.
5 The Bay of Pigs Invasion, 2016 Featured Story, https://www.cia.gov/news-information/featured-story-archive/2016-featured-story-archive/the-bay-of-pigs-invasion.html, accessed 26 May 2017.
6 Neustadt, pg. 142.
7 Braudel, Fernand. Invasion at Bay of Pigs, http://www.historyofcuba.com/history/baypigs/pigs6.htm,
accessed 11 April 2017.
8 Janis, Irving, Groupthink 2nd Edition, Houghton Mifflin Co: Boston, 1982, pg. 16-17.
9 Ibid., pg. 9.
10 Cherry, Kendra, "What is a Cognitive Bias? Definition and Examples," Verywell, May 9, 2016, https://www.verywell.com/what-is-a-cognitive-bias-2794963, accessed 22 March 2017.
11 Hammond, John S, Keeney, Ralph L., and Raiffa, Howard. Smart Choices, Broadway Books: New
York, 1999, pg. 186.
12 Ibid., pg. 194.
13 Janis, pg. 20.
14 Hammond, pg. 192.
15 Janis, pg. 46.
16 Hammond, pg. 197-199.
17 Janis, pg. 30.
18 Janis, pg. 15.
19 Hammond, pg. 190.
20 Hammond, pg. 187.
21 Neustadt, pg. 142 and 146.
22 Hammond, pg. 162.
23 Lee, Samantha, and Lebowitz, Shana. “20 Cognitive biases that screw up your decisions,” August 26,
2015, http://www.businessinsider.com/cognitive-biases-that-affect-decisions-2015-8, accessed 22 March
2017.
24 Janis, pg. 35 and 37.
25 Brusman, Maynard, "The 8 Traps of Decision Making," Working Resources, http://www.workingresources.com/professionaleffectivenessarticles/the-8-traps-of-decision-making.html, accessed 23 March 2017.
26 Hammond, pg. 193.
27 Ibid., pg. 200.
28 Ibid., pg. 190-191.
29 Ibid., pg. 196.
30 Ibid., pg. 202.
31 Red Team Journal, "Red Teaming and Alternative Analysis," http://redteamjournal.com/about/red-teaming-and-alternative-analysis/, accessed 22 March 2017.
32 Abreu, Kim. “The Myriad Benefits of Diversity in the Workforce,” Entrepreneur, December 9, 2014,
https://www.entrepreneur.com/article/240550, accessed 22 March 2017.
33 Thomas, Ted and Thomas, James. “Developing a Culture of Questioning or Don’t Tell, do Ask,”
InterAgency Journal, Vol. 7, Issue 3, Fall 2016.
34 Power, Gus, Energized Work, "Safe to Fail," April 23, 2015, https://www.energizedwork.com/weblog/2015/04/safe-fail, accessed 22 March 2017.
Case Study #1 Assignment
Read the Bay of Pigs case study. Prepare a case study analysis and post it under the group discussion thread.
The case study analysis should have sections as follows:
In the first section, summarize three or four interesting facts about the case study.
In the second section, address at least two concepts from Chapter Four that relate to the case study and faulty decision making. Explain how you think the concepts relate to the case study.
In the final section, reflect on the changes JFK made when he faced the Cuban Missile Crisis. Discuss at least two ways managers might avoid groupthink and faulty decision making. Do you think avoiding groupthink could lead to more ethical decisions?
You should post your case study analysis under the group discussion thread by February 4 at 11:59PM. The analysis should demonstrate your understanding of relevant concepts. The analysis is your initial response to the assignment. Then, you will post short responses to two other group members by February 8. This will give you the opportunity to reflect on different ideas and approaches. You do not need to respond to every group member!
Remember, your posts should be respectful. I will be able to see all discussion posts. Please use your business writing skills.