SciAm ran a column on beliefs this month. It made me think about how the public forms its understanding of Leucadia issues. Excerpts below.
We form our beliefs for a variety of subjective, emotional and
psychological reasons in the context of environments created by family,
friends, colleagues, culture and society at large. After forming our
beliefs, we then defend, justify and rationalize them with a host of
intellectual reasons, cogent arguments and rational explanations.
Beliefs come first; explanations for beliefs follow. In my new book The Believing Brain
(Holt, 2011), I call this process, wherein our perceptions about
reality are dependent on the beliefs that we hold about it,
belief-dependent realism. Reality exists independent of human minds,
but our understanding of it depends on the beliefs we hold at any given time.
Once we form beliefs and make commitments to them, we maintain and
reinforce them through a number of powerful cognitive biases that
distort our percepts to fit belief concepts. Among them are:
Anchoring Bias. Relying too heavily on one reference anchor or piece of information when making decisions.
Authority Bias. Valuing the opinions of an authority, especially in the evaluation of something we know little about.
Belief Bias. Evaluating the strength of an argument based on the believability of its conclusion.
Confirmation Bias. Seeking and finding confirming
evidence in support of already existing beliefs and ignoring or
reinterpreting disconfirming evidence.
On top of all these biases, there is the in-group bias, in which we
place more value on the beliefs of those whom we perceive to be fellow
members of our group and less on the beliefs of those from different
groups. This is a result of our evolved tribal brains leading us not
only to place such value judgments on beliefs but also to demonize the beliefs of other groups and
dismiss them as nonsense or evil, or both.
Belief-dependent realism is driven even deeper by a meta-bias called
the bias blind spot, or the tendency to recognize the power of
cognitive biases in other people but to be blind to their influence on
our own beliefs. Even scientists are not immune, subject to
experimenter-expectation bias, or the tendency for observers to notice,
select and publish data that agree with their expectations for the
outcome of an experiment and to ignore, discard or disbelieve data that do not.
This dependency on belief and its host of psychological biases is
why, in science, we have built-in self-correcting machinery. Strict
double-blind controls are required, in which neither the subjects nor
the experimenters know the conditions during data collection.
Collaboration with colleagues is vital. Results are vetted at
conferences and in peer-reviewed journals. Research is replicated in
other laboratories [maybe]. Disconfirming evidence and contradictory
interpretations of data are included in the analysis. If you don’t seek
data and arguments against your theory, someone else will, usually with
great glee and in a public forum. This is why skepticism is a sine qua
non of science, the only escape we have from the belief-dependent
realism trap created by our believing brains.
We should embrace processes in which we challenge one another's beliefs. Doing so can lead to better understanding and help shrink our blind spots.
Nobody gets it right all the time. Be wary of leaders who avoid challenges to their positions.
See also: Roots of disagreement.