The winter of 1778, at the midpoint of the American Revolution, was so cold that a wagon driver and his oxen froze solid in midstride near Boston. A ship that sank near Plymouth on Christmas Day carried 120 sailors; 66 of them froze stiff, and many of the survivors lost limbs, especially feet. The American army had been driven from New York, the most important city in the rebelling colonies, while the British held Rhode Island and Philadelphia.
The torch of liberty was but a dream. Chilled by weather and despair, those who wanted change had to find courage they believed was lost; imagination had to be kindled to defeat the cold of despair. Those who lost feet in the Plymouth wreck walked on crutches or crawled on their knees, yet rose to eminence in their communities. Refusing to be vanquished, the 13 colonies prevailed and formed an independent United States of America. From a time of darkness and confusion came the courage to create.
The winter of our present despair seems apparent any time we turn on our technology. This brave new frontier appears to have turned on us: the robots we marveled at have snatched away jobs, and social media now allows faceless bullying and hate speech. Violence once considered atrocious now seems almost normal, dulled by the persistent rotation of the 24-hour news cycle, movies, and video games. Lies from fake news sites are circulated by the clicks of the intellectually somnolent.
It is trite to write that winter precedes spring, but it is also true. Adversity is the parent of innovation. It is worth taking a step back to look at the speed of our new lives. Five thousand years ago, we first started chiseling out the letters that would lead to reading; now our eyes move rapidly over computers, tablets, smartphones, newspapers, and books, persistently consuming knowledge, and sizable amounts of trash, too.
In the first seven years of this century, we achieved as much technological change as in all of the 20th century, and the pace of innovation keeps accelerating. The pace of “penetration,” or the amount of time it takes for a new technology to be used by 50 million people, is unprecedented: radio took 38 years; the telephone, 20 years; television, 13 years; the World Wide Web, 4 years; the iPad, 2 years; and Google+, 88 days.
We learn faster, yet retain less. Jay N. Giedd, M.D., of the National Institutes of Health says that a propensity to multitask using a myriad of devices may promote “mile wide, inch deep” thinking and a resistance to the patience needed for scholarship. In a study published in the Proceedings of the National Academy of Sciences, more than 250 Stanford University students were asked about their use of digital media. Those who reported high concurrent usage of several types of media were less able to filter out distracting information in their environment, more likely to be distracted by irrelevant information in memory, and less efficient when they were required to quickly switch from one task to another.
“The way adolescents of today learn, play, and interact has changed more in the past 15 years than in the previous 570, since Gutenberg’s popularization of the printing press,” said Giedd. In 2010, U.S. adolescents spent an average of 8.5 hours per day interacting with digital devices, up from 6.5 hours in 2006. Thirty percent of the time they were simultaneously using more than one device, bringing daily total media-exposure time to 11.5 hours. The U.S. Census Bureau says there are 83.1 million people between the ages of 18 and 34, and the average millennial spends 18 hours a day using digital media.
Ninety percent of young adults use social media, up from 12 percent in 2005, the Pew Research Center reports. While teens tend to use Snapchat and Instagram, their parents are on Facebook. By the third quarter of 2016, Facebook had 1.79 billion users; with roughly 7.4 billion people in the world, that means about one person in four has a Facebook page. Facebook accounts for about a quarter of all the time spent on the internet. That’s you and me, not the kids, idly clicking and liking and loving, showing a wide-faced laugh or a little tear to empathize our days and lives away.
Kaiser Family Foundation research has shown that when teens do homework at the computer, two-thirds of the time they are also doing something else (for example, instant messaging, listening to music, texting, surfing the internet, or viewing social media). Adults exhibit similar conduct.
“Decades of research have found the brain is rapidly shifting between the tasks — and for each switch we pay a metabolic and time toll,” says Giedd. “A high-stakes example of the perils of multitasking is the use of cellphones while driving, which impairs performance to the same degree as driving while intoxicated.”
But increasing technology use has positive aspects, too, beyond the obvious ones. The U.S. is experiencing the lowest level of teenage pregnancies in the 69 years since national data were first collected (39.1 per 1,000 females aged 15 to 19). Abortion rates are dropping, and the age at which sexual activity begins is rising. Teens spend so much time interfacing with technology, whether phones or video games, that they spend less time physically interacting.
This has also had a marked impact on crime, which has dropped in all developed countries where video games are commonly available. NIH data show that from 1995 to 2008, even as sales of video games quadrupled, hours spent playing them doubled, and violent content increased, the rate of juvenile murders decreased 72 percent and the rate of violent juvenile crime decreased 49 percent, to a 30-year low.
In a paper titled “Children, Wired: For Better and for Worse,” researchers Daphne Bavelier, Shawn Green, and Matthew W. G. Dye say the effects of technology depend on what type is consumed, how much, and for how long: “Even products that seem on the surface to be extremely similar, for instance, the children’s television shows ‘Dora the Explorer’ and ‘Teletubbies,’ can lead to markedly different effects (for example, exposure to ‘Dora the Explorer’ is associated with an increase in vocabulary and expressive language skills in 2-year-olds, while exposure to ‘Teletubbies’ is associated with a decrease in both measures).”
What about addiction to technology? Bavelier, Green, and Dye note that about 2 percent of young people show signs of internet addiction. There are no comparable studies of adults, even though everyone reading this knows someone who anxiously checks their phone every few minutes, and some psychologists report it as a real problem among their clients. Nonetheless, gaming can lead to higher concentration levels and a greater capacity to notice detail. Laparoscopic surgeons who are habitual video game players, for example, tend to be better surgeons than peers who scorn gaming. Likewise, drone pilots are increasingly recruited from the ranks of gamers, not aviators.
While technology functions best when it is hardwired, our magnificent brains are infinitely pliable. All intelligence requires the capacity to evolve and transform, and although technology, a product of our superior brainpower, has raced ahead, our wise and ancient brains are taking their time to assess, accept, and discard. Perhaps the real revolution lies in patience.
This article is updated from its initial publication in Brain World Magazine’s Spring 2017 issue.