The Role of Cognitive Errors in the Drug Policy Debate
David Hadorn, M.D. e-mail: David.Hadorn@vuw.ac.nz
From the International Journal of Drug Policy, Volume 7, 1996.
People often don't think clearly. So much is well known to everyone,
including drug policy reform advocates who frequently confront
the muddled thinking so often characteristic of prohibition advocates.
What is less well known is that an entire body of scientific literature
has accumulated concerning the cognitive errors that lead
to the development and maintenance of erroneous judgments and
beliefs. Knowledge of this literature is vital to an understanding
of how otherwise (seemingly) rational people can so persistently
resist the force of evidence, and for developing strategies to
increase one's persuasive power.
Among the twenty or so categories of cognitive error identified
so far, two are of particular relevance to the drug policy debate:
illusory correlation and belief perseveration. The
joint action of these errors explains much of the observed intransigence
of prohibitionist advocates in the face of evidence. Such intransigence
is generally ascribed simply to ignorance or malevolence. But
this is not the whole story.
Illusory Correlation
Among the key processes underlying belief formation is our innate
tendency to notice (apparent) correlations between two or more
factors or phenomena. Thus, we may come to believe that cannabis
smoking is correlated with dropping out of school because we observe
(or learn about) some pot-smoking teenagers who drop out of school.
Once correlations between attributes or events are noticed, we
often go on to develop causal explanations for these correlations.
The conceptual difficulties inherent in developing causal explanations
are themselves the subject of a vast literature and are eminently
relevant to the present debate. However, we restrict our attention
here (for the most part) to the process by which simple correlations
are inferred, irrespective of any subsequent causal attributions.
As it turns out, accurate assessment of correlation is difficult
enough, and illusory correlation is a ubiquitous problem.
Defined as "the tendency to see two things as occurring together
more often than they actually do"[1], illusory correlation
was first studied in the early 1960s in the setting of word association
tests, in which series of strategically designed pairs of words
were briefly presented to test subjects [2]. Invariably, subjects
would report that related words (e.g., lion, tiger) appeared together
much more often than they actually did.
This work was later extended to real-world applications, including
several studies concerning psychologists' interpretation of projective
tests. For example, based on a few paranoid people who drew large
eyes during development of the "Draw a Person" test,
psychologists believed for many years that drawing large eyes
was correlated with (and perhaps caused by) paranoia [3]. This
quaint notion has now been thoroughly debunked.
Illusory correlation occurs frequently because, despite its apparent
simplicity, the process of judging which things are inter-correlated
is fraught with hazard. In general, people tend to make far more
errors than correct judgments of correlation, especially when, as is
often the case, their expectations, hopes, and prior theories
interfere with objective coding and processing of data.
Even absent such biases, however, the process of judging whether
Attribute A (e.g., cannabis smoking) is really correlated
with Attribute B (e.g., dropping out of school) is a challenging
one. Successful judgments require construction (if only subconsciously)
of "2 x 2 tables" for each possible combination of attributesthe
four cells corresponding to (1) Attribute A present, Attribute
B present, (2) Attribute A present, Attribute B absent, (3) Attribute
A absent, Attribute B present, and (4) Attribute A absent, Attribute
B absent. For example, data in Cell #1 might correspond to
people who smoked cannabis (Attribute A) and later dropped out of
school (Attribute B); data in Cell #4 would then correspond to
people who did not smoke cannabis and did not drop out of
school.
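To make the judgment task concrete, here is a minimal sketch in Python (not part of the original article; the counts are entirely hypothetical and chosen only for illustration) showing how all four cells enter into an assessment of correlation. It compares the dropout rate among users and non-users and computes the phi coefficient, a standard measure of association for 2 x 2 tables.

    from math import sqrt

    # Hypothetical counts for the four cells described above (illustration only).
    a = 30    # Cell #1: smoked cannabis, dropped out
    b = 270   # Cell #2: smoked cannabis, stayed in school
    c = 100   # Cell #3: did not smoke,   dropped out
    d = 900   # Cell #4: did not smoke,   stayed in school

    dropout_rate_users = a / (a + b)      # 30 / 300   = 0.10
    dropout_rate_nonusers = c / (c + d)   # 100 / 1000 = 0.10

    # Phi coefficient: 0 means the two attributes are unrelated.
    phi = (a * d - b * c) / sqrt((a + b) * (c + d) * (a + c) * (b + d))

    print(dropout_rate_users, dropout_rate_nonusers, phi)
    # The 30 "Cell #1" cases look striking on their own, yet the dropout
    # rate is identical in both groups and phi is exactly 0: no correlation.

Only when the dropout rate among users exceeds the rate among non-users does the table support a correlation; no number of Cell #1 cases can establish that by itself.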
The major factor responsible for illusory correlation is the strong
tendency to focus almost exclusively on the cases found in Cell
#1 (present, present) in the above schema. Focusing on these cases,
which are usually more visible and impressive than cases in the
other three cells, is often worse than useless in terms of judging
correlation. Nor is considering even two or three of the four
cells adequate; all four cells are essential.
For example, as discussed by Jennings et al. [4], if asked to
test the theory that red-haired individuals are hot-tempered,
most people would attempt (simply) to recall the red-haired individuals
they have known and try to remember how many of those individuals
were hot-tempered (cells #1 and #2). More sophisticated "intuitive
psychologists" would attempt to recall the hot-tempered people
they have known and determine how many of these had red hair.
But it would occur to very few people that the proportion of even-tempered
blondes and brunettes is essential to the judgment task at hand.
Illusory correlation also underlies much of the prejudice manifested
throughout the world. ("That guy's a thief! Say, isn't he
Jewish?") By focusing on "present-present" cases
(e.g., cannabis users who drop out of school, Jewish thieves)
and ignoring the other three categories of people (e.g., cannabis
users who don't drop out of school, non-Jewish thieves), we get
ourselves, and others, into all kinds of trouble.
Belief Perseveration
Once beliefs are formed, the second major kind of cognitive error
relevant to the drug policy debate comes into play: belief perseveration.
As described by Ross and Anderson,
"It appears that beliefsfrom relatively narrow personal
impressions to broader social theoriesare remarkably resilient
in the face of empirical challenges that seem logically devastating.
Two paradigms illustrate this resilience. The first involves the
capacity of belief to survive and even be strengthened by new
data, which, from a normative standpoint, should lead to the moderation
of such beliefs. The second involves the survival of beliefs after
their original evidential bases have been negated" [5].
Drug policy reform advocates will immediately recognize these
phenomena in their opponents. Prohibition-minded individuals seem
remarkably resistant to altering their beliefs when confronted
with contrary evidence. (It would be well, of course, to search
for the presence of this problem also in ourselves, but with [almost]
all the evidence on our side, we can be relatively assured
that our beliefs are not persisting in spite of evidence.)
A study conducted by Lord et al. [6] illustrates the kind of research
used to analyze the problem of belief perseveration. These investigators
identified people with clear views (one way or the other) concerning
the effectiveness of capital punishment as a deterrent to crime.
In a counterbalanced design, subjects were presented with purportedly
authentic empirical studies which either supported or refuted
their position. Subjects consistently rated the studies supporting
their position as "more convincing" and "better
conducted" than the studies opposing their beliefs.
"In fact, many individual subjects who had read both the
results summary and the procedural details of the study that opposed
their belief ultimately became more convinced of the correctness
of that belief! No such effects occurred when the same results
and procedures were read by subjects whose initial views were
supported."
An even more serious challenge to one's beliefs than the introduction
of new evidence is the revelation that the original bases for
the beliefs were completely spurious. Even under these circumstances,
beliefs often persist. Several studies have been conducted to
explore this phenomenon, among the most illuminating of which
is a study by Anderson et al. [7], in which subjects were informed
that a functional relationship existed between how well firefighters
perform in their job and their scores on a test of risk preference
(i.e., risk avoiding versus risk seeking). Subjects were provided
with scenarios in which the job performance of certain firefighters
was presented along with their scores on the test.
The investigators found that presenting even a single pair of
cases (i.e., one successful and one unsuccessful firefighter with
appropriately discrepant scores) was sufficient for subjects to
develop beliefs about the functional relationship of performance
and test scores. Moreover, these beliefs
". . .survived the revelation that the cases in question
had been totally fictitious and the different subjects had, in
fact, received opposite pairings of riskiness scores and job outcomes.
Indeed, when comparisons were made between subjects who had been
debriefed and those who had not been, it appeared that over 50%
of the initial effect of the "case history" information
remained after debriefing."
Ross and Anderson explore a variety of cognitive mechanisms which
might underlie the unwarranted persistence of our beliefs and
social theories, including biased search, recollection, and assimilation
of information; erroneous formation of causal explanations; and behavioral
confirmation and "self-fulfilling" hypotheses, among
others [5]. All of these mechanisms have been well studied and
described, and none is subject to ready correction. The collective
power of these errors is formidable.
The authors conclude that "attributional biases and other
inferential shortcomings are apt not to be corrected but instead
to be compounded by subsequent experience and deliberations."
This is a discouraging situation for those who wish to change
the minds of others by reference to evidence.
Note that illusory correlation and belief perseveration are mutually
reinforcing phenomena. Once an apparent correlation is observed,
people are remarkably adept at developing theories about why the
association was observed. In the case of cannabis smoking and
dropping out of school, for example, the likely hypothesis would
be that cannabis produces an "amotivational syndrome"
or in some way impairs one's ability to function in school. This
hypothesis might seem reasonable to many people (especially given
the influence of government propaganda), and it is therefore added
to one's repertoire of beliefs. The belief then perseverates despite
the discrediting of its original evidential bases because, well,
the hypothesis still seems reasonable.
Discussion
Drug policy reform advocates must understand and appreciate the
power of illusory correlation and belief perseveration if they
are to tailor and focus their persuasive efforts to do the most
good. What might this entail?
Most evidently, perhaps, the phenomenon of belief perseveration
would tend to indicate that the goal of the debate is not
(primarily) to change the minds of prohibitionists, but rather
to influence those who are truly undecided. Arguing with prohibitionists
is often likely to be a waste of time.
On the other hand, beliefs sometimes do change. Ross and Anderson
[5] note that "even if logical or empirical challenges have
less impact than might be warranted by normative standards, they
may still get the job done." A particularly effective method
for changing people's beliefs, as noted by these investigators,
is the presentation of "vivid, concrete, first-hand experience".
In the drug policy arena, such experiences might come, for example,
through the revelation that a respected friend, family member,
or colleague uses cannabis (or another drug), or when such a person,
who is otherwise blameless, is arrested and jailed for drug use.
Revelatory experiences often occur in certain settings of powerful
emotional appeal. In this regard, Ross and Anderson note that
"the effectiveness of groups and leaders that accomplish
dramatic political or religious conversions offer inviting targets
of future research" [5]. The extent to which such a proselytizing
approach works, or has worked, in the drug policy debate is largely
unknown.
This question, and others, should be addressed as part of a study
of the "epidemiology of belief revision" with respect
to drug policy. Surprisingly little (if anything) is known about
how often people change their minds, one way or the other, on the
subject of drug policy, let alone why people change their
minds. This subject would thus seem to be a potentially fruitful
avenue for research.
One possible approach to such a study would be to identify people
who previously held prohibitionist views but who have been persuaded
by one thing or another to abandon those views. Identifying such
people would not be easy, however. Recruiting people through internet
newsgroups is one possible approach, although such a sample would
obviously be a convenience sample rather than a representative one.
Another implication of the foregoing analysis of cognitive errors
with respect to developing a strategy of argument in the drug
policy debate is that drug policy reform advocates should cite
the existence and nature of these errors as recurring themes in
their arguments. Real-life examples to illustrate this common
theme are not hard to come by, and indeed a critical part of this
strategy would entail being alert for prohibitionist arguments
based on "Cell 1 myopia." For example, in response to
the claim that using cannabis leads children to drop out of school,
one might ask "What percentage of kids who smoke pot don't
drop out of school? Are you sure this figure isn't even higher
than the figure for kids who never touch the stuff? And what proportion of kids
who don't smoke pot drop out? You don't know? Well, without knowing
the answers to these questions it is impossible to say
anything about the relationship between smoking pot and dropping
out, if any!"
Or words to that effect. This line of attack (or defense) is likely
to be particularly effective in cases, such as the one just alluded
to, where empirical data is not readily available to support or
refute a particular position. Where such facts do exist, bringing
them to light will often be more effective than pointing out the
cognitive errors inherent in the prohibitionists' arguments. Ideally,
one would like to do battle on both fronts, but this will not
always be feasible. In most cases, judgment will be required to
determine which approach is most likely to change one's opponent's
mind.
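Where real counts do exist, the same point can be checked formally rather than rhetorically. The following is a brief, hedged sketch (again with made-up numbers, and assuming the SciPy library is available) of how one might test the claimed association using a chi-square test of independence on the full 2 x 2 table.

    import numpy as np
    from scipy.stats import chi2_contingency

    # Hypothetical 2 x 2 table: rows are cannabis use (yes / no),
    # columns are dropped out / stayed in school.
    table = np.array([[30, 270],
                      [100, 900]])

    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
    # A p-value near 1 means these (fabricated) counts provide no evidence
    # that cannabis use and dropping out are associated, no matter how
    # many "Cell 1" cases there are.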
In summary, the role of cognitive errors in the drug policy debate
implies that reform advocates should strictly limit the time and
effort they devote to (usually vain) efforts to convince devoted
prohibitionists that they are in error. A more productive strategy
is to search for the truly undecided and focus one's attention
on them.
Research is needed into how frequently prohibitionists reverse
their positions, and what causes them to do so. In the meantime,
pointing explicitly to the existence and role of cognitive errors
in the drug policy debate is probably a worthwhile tactic, and
certainly all cases of "Cell 1 myopia" should be ruthlessly
exposed and debunked.
References
1. Chapman LJ, Chapman JP. Test results are what you think they are. In Kahneman D, Slovic P, Tversky A, eds. Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press, 1982.
2. Chapman LJ. Illusory correlation in observational report. Journal
of Verbal Learning and Verbal Behavior 1967; 6: 151-155.
3. Chapman LJ, Chapman JP. Illusory correlation as an obstacle
to the use of valid psychodiagnostic signs. Journal of Abnormal
Psychology 1969; 74: 271-280.
4. Jennings DL, Amabile TM, Ross L. Informal covariation assessment: Data-based versus theory-based judgments. In Kahneman D, Slovic P, Tversky A, eds. Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press, 1982.
5. Ross L, Anderson CA. Shortcomings in the attribution process: On the origins and maintenance of erroneous social assessments. In Kahneman D, Slovic P, Tversky A, eds. Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press, 1982.
6. Lord CG, Ross L, Lepper MR. Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology 1979; 37: 2098-2110.
7. Anderson CA, Lepper MR, Ross L. The perseverance of social theories: The role of explanation in the persistence of discredited information. Journal of Personality and Social Psychology 1980; 39: 1037-1049.