Brain-Sight: Can Touch Allow Us to “See” Better Than Sight?

(Editor’s note: This article is from a past issue of Brain World magazine. If you enjoy this article, please support us with a print or digital subscription!)

Which of the following procedures do you think would produce the most accurate representation of an object: tracing the object; looking at the object while drawing it; or, with your eyes closed, touching and feeling the object and then drawing it, without having ever seen it? Most educators and parents would insist that the quality of the three renditions would decline in the order in which they are presented.

When we listen to a song, we hear the melody, the beat, the lyrics, the instruments and the voice that make the music. When we look at the squiggly lines on a piece of paper, letters form words, words stretch into sentences, and sentences build paragraphs. Once read, these elements contribute to meaning collectively, not separately.

While it is customary to assert that we see with our eyes, touch with our hands, and hear with our ears, we live in a simultaneous universe where sensory events and their constituent elements have a natural tendency to overlap.

Inside the brain, complex layers of interconnected sensory networks merge seamlessly to produce a single experience. Horizontal lines, vertical lines, color, shape, size, motion, direction and more are fused together, leaving no perceptual holes in an experience. Just as more than 30 separate brain modules participate in constructing a single visual image, when one sensory system is activated, the other senses do not sit by as uninvolved spectators. Susan Kovalik and Karen Olsen have identified 19 human senses, which often combine to produce a perception. It would have been significantly disadvantageous for our senses to evolve completely disconnected from one another, each standing in a queue to deliver disjointed information to the brain. Instead, multiple inputs from divergent brain circuits are processed to generate a single unified experience. The various elements that make up a perception frequently involve pathways located in multiple brain regions, and the ability to create constructs in the mind’s eye involves far more complex brain processing than mere vision. Our perceptions are a collaboration of networks widely distributed throughout the brain.

As we navigate our way around planet Earth, 18 square feet of human skin envelops our bodies. It accounts for 12 to 15 percent of the weight of the average adult human body, making it the largest of all bodily organs. The skin is a tight-fitting elastic spacesuit, not only serving 24/7 as a reliable defensive barrier, but also doubling as a highly sensitive information-gathering data-recorder.
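To give those percentages a rough sense of scale, they can be converted into kilograms; the 70-kilogram adult body mass used below is an illustrative assumption, not a figure from the article.

```python
# Rough estimate of skin mass from the article's 12-15 percent range.
# The 70 kg adult body mass is an assumed illustrative value.
body_mass_kg = 70.0
low = 0.12 * body_mass_kg   # lower bound of the article's range
high = 0.15 * body_mass_kg  # upper bound of the article's range
print(f"Skin mass: roughly {low:.1f} to {high:.1f} kg")
```

On that assumption, the skin of an average adult would weigh roughly 8 to 11 kilograms.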

The protective functions of human skin are inarguable, but does our skin have a more subtle learning purpose? Recent experiments have shown that touch is as important as vision to learning and the subsequent retention of information. Haptics, a relatively new field of study, is revealing how the sense of touch affects the way we interact with the world. It also suggests that if educators engage more of the human senses in their classrooms, students might not only learn faster but also recall information more easily by combining otherwise separate sensory modalities.

While we are accustomed to saying that we see with our eyes, we actually see with specialized cells in the occipital lobe, located in the posterior region of the brain. As we know, blind individuals can learn to read, walk, talk and recognize objects and people without using the retinal-cortical pathways. Sighted individuals can likewise produce visual images in the brain through the sense of touch, using the mind rather than the eyes to visualize.

The lateral occipital cortex and the right lateral fusiform gyrus are known to be crucial in object recognition. However, input from more distant cortical areas, including the sensory motor cortex and the association cortex, provides additional information for constructing visualizations. New research suggests that the areas of the cerebral cortex that are activated when we merely look at illustrations or pictures of specific objects are also activated when we touch the same objects. Dr. Harriett Allen at the University of Birmingham has demonstrated that some areas in the lateral occipital cortex formerly thought to process vision alone can be activated by touch alone. There now appear to be multiple areas in the brain that underlie object recognition. They are highly interconnected, in such a fashion that damage inflicted on one area can impair the ability of other areas to recognize objects.

TOUCH

You are sound asleep. An insect whose weight is measured in milligrams makes a gentle but annoying path across your skin. Although a precise measurement of the pressure exerted by these miniature footsteps is next to impossible, it is sufficient to trigger a sensory alarm, and you suddenly awaken to find yourself hunting for the potentially dangerous trespasser.

Two layers of unassuming skin, barely a few millimeters thick, rest deceptively above a complex network of sensory detectors whose dial is always set to High Alert. Our fear of disease-carrying creatures is a well-earned aversion, rooted in a lengthy and horrifying history with deadly insects. Could it be that our genetic database maintains memory records of the role played by the tiny flea during the bubonic plague, or of the more contemporary link between mosquitoes and malaria? The senses are on guard even as we slumber.

Exquisitely tailored to fit our body, the skin shrouds the muscles, body tissues, bodily fluids and internal organs, keeping the external world where it belongs — beyond our personal periphery. Our internal systems, muscles, soft tissues and blood are shielded safely from injury and the countless dangers posed by toxic microscopic invaders.

Although the skin has the miraculous power to mend itself, even a tiny intrusion into the two- to three-millimeter-thin covering from the proboscis of a mosquito is reported immediately. Details of incursions across the paper-thin border are instantly transmitted to the executive center in the brain, which coordinates a typically aggressive response to the security breach. Any intrusion warrants our conscious attention or a timely defensive response, and sometimes the order is reversed, with our reaction preceding our conscious awareness.

Foreign objects, toxins, air, fluids and living organisms can seldom penetrate the boundary of the human body’s watertight seal. Even when unconscious, sensory systems are seldom completely offline; instead, they remain poised to capture any vital changes around us, even those which are seemingly minor. But what else can be learned by our highly sensitive skin receptors?

The sense of touch is a composite of three sensory qualities — temperature, pain and pressure — which can be experienced individually or in various combinations. Characteristically, the broad swaths of the human skin are classified as either hairy or glabrous (hairless). These categories are best represented by the backs and palms of our hands, respectively. Together, they put us in instantaneous contact with the outer world.

When this skin is pressed, poked, vibrated or stroked, specialized corpuscles respond in four stages of perception: detection, amplification, discrimination (among several stimuli), and adaptation (the reduction in response to a stimulus — e.g., we are only consciously aware of our clothes during the moments we put them on). Over five million touch receptors for experiencing light or heavy pressure, warmth or coldness, pain, etc., cover the body, sending essential information to the brain via a massive sensory expressway. However, the distribution of receptor cells is undemocratically concentrated in those parts of the body that are most involved in direct tactile perception, which partially explains why hands-on learning is so effective as an educational tool. Wherever the hands go, that is where the brain focuses its attention. For decades, these receptor fields were thought to be fixed and unchanging. We now know that cortical representations and sensory projections are rapidly reorganized following injury or surgical alteration to specific areas of the body.

When it comes to sensory acuity, the hand is to the human sense of touch what the fovea is to our sense of vision. Both house exceptionally sensitive receptive fields that quickly send the brain a wealth of sensory information with optimum levels of detail and discrimination, and the corresponding brain areas for touch and sight each dedicate a substantial amount of cortical real estate to these senses.

As the hands and fingers move across an object, receptor cells respond to the infinitesimal indentations created on the surface of the skin, giving us priceless data about the shape, texture, hardness and form of that particular object. Interestingly, reading braille does not require abnormally sensitive fingers. On an otherwise completely flat surface, the human fingertip can detect a raised dot 0.04 mm wide and only 0.006 mm high. A typical braille dot is nearly 170 times that height, rendering it an easy read for our fingers.
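Taking the article's figures at face value, a quick calculation shows the sensitivity margin involved. The braille dot height of about 1 mm used below is back-derived from the stated "nearly 170 times" ratio; it is an assumption for illustration, not a value from a braille specification.

```python
# Sensitivity margin implied by the figures above.
# The 0.006 mm detection threshold is from the article; the ~1.0 mm
# braille dot height is an assumed value inferred from the article's
# "nearly 170 times" ratio, not taken from a braille standard.
threshold_mm = 0.006
braille_dot_mm = 1.0
margin = braille_dot_mm / threshold_mm
print(f"A braille dot stands about {margin:.0f} times taller than the threshold")
```

In other words, a braille dot towers over the fingertip's detection threshold by more than two orders of magnitude, which is why braille reads so easily by touch.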
