One of the main themes of scientific skepticism, at least one of my favorite themes, is that we cannot take the accuracy of our own perceptions for granted. We cannot trust what we remember about what we think we experienced – a principle I call neuropsychological humility. Human brains process information in complex ways, making assumptions and adjustments that are useful most of the time but that introduce multiple opportunities for misperception. This is partly why we need objective evidence as a check on our perceptions.
Neuroscientists continue to document the many ways in which our perceptions can be fooled. One category of such phenomena is so-called cross-modal interactions – one sensory modality influencing another. The basic concept here is that our brains are receiving multiple streams of information simultaneously, and they weave those streams into one seamless experience of reality. Therefore what we see influences what we hear, and what we hear influences what we see, which influences what we feel, and so on.
By bending or breaking the rules of these cross-modal interactions, researchers can trick the brain into a false experience. They do this somewhat like magicians, creating scenarios for which evolution would likely not have prepared us.
A recently published study in the journal Perception demonstrates one type of cross-modal interaction that most people are probably not aware of – that what we see affects what we feel. The article is Visual influence on haptic torque perception. “Haptic” refers to exploring the world immediately around you through touch. The researchers tested the ability of subjects to sense the direction in which a stick they were holding upright in one hand was weighted to one side or the other. A weight hanging on the left side of the stick would tend to pull or torque the stick to the left, which we would feel as a counterclockwise twisting of the wrist.
The researchers found, unsurprisingly, that subjects could feel the direction of torque even without seeing the weight on the stick. But then they used mirrors to flip the apparent side of the weight, so it appeared to be on the right even when it was on the left. This caused some subjects to feel the weight on the side where their visual input told them it was, even though it was really on the other side. This effect was fairly robust, although it was stronger for lighter weights – so there was a threshold beneath which visual input had a greater influence than haptic input.
This is a haptic illusion based upon a cross-modal effect. Interestingly, the illusion worked even when subjects were aware of exactly what was going on. Even when they knew the weight was really on the left and that mirrors were being used to make it appear as if it were on the right, they still felt the torque to the right.
This means that the effect is involuntary and not affected by awareness, which reflects an important underlying concept of neuroscience. Much of our brains’ processing of information, sensory input and otherwise, occurs below the level of awareness. The processing is automatic, and the results are just “presented” to our surface consciousness. Sometimes we can consciously influence this processing, and sometimes we cannot – we see the illusion no matter what we do.
I find this type of phenomenon fascinating – peeking behind the curtain of our own brains and seeing how the kluge works. These narrow, specific perceptual phenomena can be extrapolated to the workings of our brains in general – our thoughts, feelings, and memories (in addition to our perceptions) are all the result of complex evolved algorithms that contain multiple assumptions, estimations, and calculated trade-offs, and they mostly occur beneath our awareness. It takes a great deal of introspection and metacognition to wrest a small measure of cognitive control from these subconscious processes.
Or perhaps this is just another layer of illusion of control – but at least it’s one level up from just going with the flow of our default processing.