You would think by now that we could say unequivocally what causes what. But the question of cause, which has haunted science and philosophy from their earliest days, still dogs our heels for numerous reasons.
Humans are evolutionarily predisposed to see patterns, and psychologically inclined to gather information that supports preexisting views, a trait known as confirmation bias. We confuse coincidence with correlation, and correlation with causality.
For A to cause B, we tend to say that, at a minimum, A must precede B, the two must covary (vary together), and no competing explanation can better explain the covariance of A and B. Taken alone, however, these three requirements cannot prove cause; they are, as philosophers say, necessary but not sufficient. In any case, not everyone agrees with them.
Speaking of philosophers, David Hume argued that causation doesn't exist in any provable sense [source: Cook]. Karl Popper and the Falsificationists maintained that we cannot prove a relationship, only disprove it, which explains why statistical analyses do not try to prove a correlation. Instead, they pull a double negative: they assume no relationship exists and then show that the observed data would be highly unlikely under that assumption, a process known as rejecting the null hypothesis [source: McLeod].
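To make that double negative concrete, here is a minimal sketch in Python (not from the original article) that tests whether two made-up variables are correlated. The data, the variable names and the conventional 0.05 cutoff are all assumptions chosen purely for illustration; a real analysis would depend on the study's design and methods.

```python
# A minimal sketch using NumPy and SciPy with invented, illustrative data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)

# Two hypothetical variables: hours of study and exam score, plus noise.
hours = rng.uniform(0, 10, size=50)
score = 60 + 3 * hours + rng.normal(0, 5, size=50)

# Null hypothesis: the two variables are uncorrelated (true correlation = 0).
r, p_value = stats.pearsonr(hours, score)
print(f"Pearson r = {r:.2f}, p-value = {p_value:.4f}")

# We never "prove" the correlation; we only ask how unlikely the observed
# data would be if the null hypothesis were true. A small p-value (commonly
# below 0.05) leads us to reject the null hypothesis -- the double negative
# described above. Even then, correlation is not causation.
if p_value < 0.05:
    print("Reject the null hypothesis: data this correlated are unlikely by chance.")
else:
    print("Fail to reject the null hypothesis.")
```

Note that a large p-value doesn't prove the variables are unrelated, either; it only means the data weren't surprising enough to rule out chance, which is exactly the limitation Popper's camp would point to.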
With such considerations in mind, scientists must carefully design and control their experiments to weed out bias, circular reasoning, self-fulfilling prophecies and hidden variables. They must respect the requirements and limitations of the methods used, draw from representative samples where possible, and not overstate their results.
Ready to read about 10 instances where that wasn't so easy?