Recent research has shown the brain inherently uses something like the scientific method. Although the study specifically assesses the response of the visual cortex to anticipated or novel stimuli, it makes sense that there’s a more general pattern-matching mechanism underlying basic cognition. A commentator even suggests there’s a link between the brain’s success rate at predictive analysis and overall intelligence.

I wonder about something different: humans are notoriously prone to confirmation bias. Perhaps the brain's tendency is to confirm its own predictions, whether because of the reward system or simply because confirmatory perceptions take less energy to process. This is a fine point: not only would the brain tend to make predictions, it would also tend to produce more false positives than false negatives. And when a prediction is proved invalid, insofar as the failure doesn't entail negative (or, more generally, emotionally significant) consequences, there would be less chance of subsequent re-evaluation and learning.

Confirmation bias is the bane of pseudo-science. Just because we make a prediction, it does not follow that we will apprehend subsequent related information with cold impartiality. As soon as we make a prediction, we have a vested interest in its being borne out. I believe this study provides evidence of that (though perhaps that's just my own confirmation bias!). Add the notion that an incorrect prediction may carry nowhere near the emotional weight of a perceived correct one: when you are right, you are pleased. When others think you're right, they may even cheer and praise you, which can be quite exhilarating. If the topic about which you are prognosticating is unlikely to have an immediate impact on your well-being, the consequences of being wrong may be trifling enough to be ignored and dismissed while you rest on the laurels of your perceived success.

I find it intriguing that the scientists have interpreted their results as supporting the notion that people's brains work like scientists. I propose it is far more likely that people's brains work like fortune-tellers: happy to make any number of unrealistic predictions as long as they're rewarded for doing so. A neurological underpinning that is even slightly biased toward dispensing reward signals in response to our predictions ensures we will tend to exhibit and reinforce our biased thinking. A truly scientific brain would be completely neutral when comparing future evidence to prior predictions.
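The claim that even a slight reward bias is enough to entrench overconfidence can be made concrete with a toy simulation. Everything here is illustrative and of my own invention, not drawn from the study: an agent tracks its own estimated hit rate, updating with full weight on confirmations but discounted weight on disconfirmations. The `disconfirm_weight` parameter is the assumed bias knob.

```python
import random

random.seed(0)

def run(true_rate, disconfirm_weight, trials=10_000):
    """Toy model of self-assessed predictive accuracy.

    Confirmations update the estimate at full weight; disconfirmations
    are discounted by disconfirm_weight (1.0 = impartial, <1.0 = biased).
    Purely illustrative numbers, not from any study.
    """
    estimate = 0.5  # agent's belief in its own accuracy
    lr = 0.05       # learning rate
    for _ in range(trials):
        correct = random.random() < true_rate
        if correct:
            estimate += lr * (1.0 - estimate)                # full-weight update
        else:
            estimate -= lr * disconfirm_weight * estimate    # discounted update
    return estimate

# Both agents are actually right only half the time.
impartial = run(true_rate=0.5, disconfirm_weight=1.0)   # settles near 0.5
biased = run(true_rate=0.5, disconfirm_weight=0.25)     # settles well above 0.5
```

The impartial agent's estimate hovers around its true 50% hit rate, while the biased agent converges toward the fixed point where 0.5·(1−e) = 0.5·0.25·e, i.e. e = 0.8: a modest discount on bad news produces a large, stable overestimate.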

How sweet the irony that the study's authors fell right into the trap they so cleverly exposed!