Technology has always fascinated and frightened people. It is revered for its role in shaping society and improving lives, and feared for the same reasons.
As neuroscience evolves, and technology alongside it, both invasive and noninvasive techniques will be used to view the brain, treat illness, and even enhance cognition. In its October 2016 report, “The Digital Future of Brain Health,” the World Economic Forum said that, with technical advances, “health care is increasingly shifting from care of the sick to the prevention of sickness, and from volume-based care to value-based services. ‘Consumerization’ of health care is on the rise, as patients take increasingly active roles in their care experience.”
One effort to bridge neuroscience and technology is the brain-computer interface (BCI). A BCI is computer technology that interacts with the brain and other neural structures, decoding neural activity into commands for physical movement. In other words, BCIs translate thought into action.
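To make that idea concrete, here is a minimal Python sketch of the decoding step: a single invented signal feature (mu-band power, which tends to drop when movement is imagined) is turned into a command. The feature values and the threshold are made up for illustration; a real BCI works from recorded brain signals and far richer models.

```python
# Toy sketch of the BCI idea: turn a neural-signal feature into an action.
# The "mu-band power" numbers are invented; a real system would compute
# them from recorded EEG/ECoG, not from random draws.
import numpy as np

rng = np.random.default_rng(0)

def simulated_mu_power(intends_movement: bool) -> float:
    """Invented feature: imagining movement suppresses mu-band power."""
    baseline = 10.0
    suppression = 4.0 if intends_movement else 0.0
    return baseline - suppression + rng.normal(scale=1.0)

def decode(mu_power: float, threshold: float = 8.0) -> str:
    """Map the feature to an action: low power -> 'move cursor', else 'rest'."""
    return "move cursor" if mu_power < threshold else "rest"

for intends in (True, False, True):
    feature = simulated_mu_power(intends)
    print(f"intends_movement={intends!s:5}  mu_power={feature:5.2f}  -> {decode(feature)}")
```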
As our understanding of the human brain grows, the next generation of computers may be able to reason and think as humans do. Already, some algorithms can solve CAPTCHAs, the puzzles websites currently use to tell human visitors from internet bots. Others can recognize age and gender from faces, or classify images about as well as nonhuman primates tested in the lab.
Of course, IBM Deep Blue’s success against chess grandmaster Garry Kasparov and Watson’s ability to help doctors choose treatments for patients are only the beginning. As technology develops, man and machine will intertwine and, in some cases, become one.
Brain-computer interfaces allow people to communicate independently of muscle control, which is important for those who are paralyzed or have severe neuromuscular disorders. Neuroprostheses are nothing new, but more direct connections between brain, body, and machine are being developed. BCIs are already helping paralyzed people use their thoughts to do “simple” tasks like writing emails. In a 2014 study by Eric W. Sellers and colleagues, a patient with locked-in syndrome caused by a brainstem stroke was able to spell words and hold conversations with family members using a BCI system. Another patient, Rick Arnold, was paralyzed on the right side of his body after suffering several strokes. He had one wish: to hold his wife’s hand. Using BCI technology developed by neurosurgeon Eric Leuthardt, Arnold thinks about moving his hand, and that thought is translated into a command for a device attached to it. The device then moves his hand, and in doing so is helping to rewire his brain, years after his strokes. With the technology, Arnold can now hold his wife’s hand, just as he always wanted to.
The U.S. government’s Defense Advanced Research Projects Agency (DARPA) is also hard at work developing neurally controlled robotic arms that restore both motor and sensory abilities to amputees. The program, called “Revolutionizing Prosthetics,” has developed neurotechnology that gives wearers direct neural control over the prosthetic systems. The “LUKE” arm system, which can read nerve signals from the muscle left after an amputation, allows for greater strength, dexterity, and range of motion than traditional prosthetics. The program is also working on sensors that provide haptic feedback between the prosthetic and the wearer’s brain.
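As an illustration of the control idea only, the sketch below turns a synthetic muscle signal into a grip command by rectifying and smoothing it. The signal, window size, and scaling constant are all invented; this is not the LUKE arm’s actual control algorithm.

```python
# Illustrative sketch: residual-muscle activity -> grip command.
# The EMG trace below is synthetic noise, not recorded data.
import numpy as np

rng = np.random.default_rng(1)

def emg_envelope(samples: np.ndarray, window: int = 50) -> np.ndarray:
    """Rectify the raw signal and smooth it with a moving average."""
    rectified = np.abs(samples)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

# Synthetic EMG: quiet muscle, then a burst of activity (an intended grip).
quiet = rng.normal(scale=0.05, size=500)
burst = rng.normal(scale=0.6, size=500)
signal = np.concatenate([quiet, burst])

envelope = emg_envelope(signal)
grip_command = np.clip(envelope / 0.5, 0.0, 1.0)  # map envelope to a 0..1 grip level

print(f"mean grip command while quiet:  {grip_command[:500].mean():.2f}")
print(f"mean grip command during burst: {grip_command[500:].mean():.2f}")
```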
A growing subfield of computer science is machine learning, which centers on designing algorithms that learn from the data they are given rather than following fixed rules. This matters: as the algorithms become more sophisticated, they will move from reading emotions in images of faces to identifying disease markers and potential treatments for patients. With smartphones already an almost addictive part of daily life, an added wearable device could send data to health care providers, giving them a clearer and wider picture of a patient’s health and life.
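A minimal sketch of what “learning from data” means in practice: the program is never told the rule, only given labeled examples, and it fits a boundary between them. The two-feature “patient” data below is fabricated purely for illustration, and the scikit-learn model is just one common choice, not a clinical tool.

```python
# Minimal supervised-learning sketch on fabricated data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Made-up features (two measurements per person) and labels (0 = healthy, 1 = at risk).
healthy = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(100, 2))
at_risk = rng.normal(loc=[1.5, 1.5], scale=0.5, size=(100, 2))
X = np.vstack([healthy, at_risk])
y = np.array([0] * 100 + [1] * 100)

model = LogisticRegression().fit(X, y)   # "learning" = fitting the boundary from examples

new_case = np.array([[1.4, 1.2]])        # an unseen example
print("predicted label:", model.predict(new_case)[0])
print("estimated risk:  %.2f" % model.predict_proba(new_case)[0, 1])
```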
Researchers are also looking beyond wearables like the Apple Watch or the Fitbit for ways to gather even more information about the human body. “Neural dust” consists of microscopic sensors that use ultrasound to record electrical signals from neurons. So far, experiments have been limited to the peripheral nervous system (the nerves that run through our limbs), but with improvements the technology could be used in the brain and central nervous system. By recording signals from sensors all over the body, scientists could have a complete view of neural activity, and perhaps a better understanding of how to harness and enhance it.
Brain-computer interfaces are also being developed to treat blindness. When light enters the eye, it stimulates photoreceptors on the retina, which convert the information into electrical signals that are sent to the brain. Each image produces its own pattern of activity on the retina, and the retina’s electrical impulses reach the brain in coded form. Researchers have decoded the neural code of retinal cells and built “artificial retinas” that produce and send the same electrical patterns to the brain. These chips transfer information to the brain with almost the same clarity as “real” retinas do.
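The encoder idea can be sketched in a few lines: an image patch is weighted by a toy receptive field, converted to a firing rate, and turned into a train of electrical pulses. The filter shape and nonlinearity below are invented for illustration; real artificial-retina encoders are fit to recorded retinal responses, not chosen by hand.

```python
# Cartoon of a retinal "encoder": image patch -> firing rate -> pulse train.
import numpy as np

rng = np.random.default_rng(3)

def center_surround_filter(size: int = 7) -> np.ndarray:
    """Toy receptive field: excitatory center, inhibitory surround."""
    yy, xx = np.mgrid[:size, :size] - size // 2
    r2 = xx**2 + yy**2
    return np.exp(-r2 / 2.0) - 0.5 * np.exp(-r2 / 8.0)

def encode(patch: np.ndarray, duration_ms: int = 100) -> np.ndarray:
    """Weight the patch, squash to a firing rate, then sample 1-ms spike bins."""
    drive = float(np.sum(center_surround_filter(patch.shape[0]) * patch))
    rate_hz = 80.0 / (1.0 + np.exp(-drive))          # saturating nonlinearity
    return rng.random(duration_ms) < (rate_hz / 1000.0)

bright_spot = np.zeros((7, 7)); bright_spot[3, 3] = 1.0
uniform_gray = np.full((7, 7), 0.5)

print("spikes for bright spot:  ", int(encode(bright_spot).sum()))
print("spikes for uniform gray: ", int(encode(uniform_gray).sum()))
```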
These advances go beyond medicine. The “Brainternet” is a BCI project that works to turn a user’s brain into a node on the “internet of things,” the network of “smart” physical devices. A headset of electrodes records the wearer’s neural activity via electroencephalography (EEG), and a Raspberry Pi transmits the signals, allowing the user’s brain to “connect” to the internet and communicate with other users through their brainwaves.
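In outline, the pattern is simple enough to sketch: a small computer reads brainwave samples and pushes them to a web server. The read_eeg_sample() function, the channel count, and the URL below are placeholders for illustration, not the Brainternet project’s real interface.

```python
# Sketch of the Brainternet pattern: read EEG samples, post them to a server.
import time
import random
import requests

SERVER_URL = "http://example.com/brainwaves"   # placeholder endpoint

def read_eeg_sample() -> dict:
    """Stand-in for a real headset driver; returns fake microvolt readings."""
    return {
        "timestamp": time.time(),
        "channels": [random.gauss(0.0, 20.0) for _ in range(14)],  # 14 fake channels
    }

def stream(n_samples: int = 10, interval_s: float = 0.5) -> None:
    """Read samples at a fixed interval and upload each one as JSON."""
    for _ in range(n_samples):
        sample = read_eeg_sample()
        try:
            requests.post(SERVER_URL, json=sample, timeout=2)
        except requests.RequestException as exc:
            print("upload failed:", exc)
        time.sleep(interval_s)

if __name__ == "__main__":
    stream()
```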
Though not advanced enough yet for medical uses, this system illustrates how our thoughts, and perhaps human consciousness, could eventually no longer be locked within the confines of the human body. Perhaps the TV series “Black Mirror” has it right when it tells stories of human minds being uploaded into a networked virtual “nirvana” and human memories being uploaded for use in new bodies.
This article was first published in Brain World Magazine’s Fall 2018 issue.