Falling Out of Focus: What Science Says About ADHD

(Editor’s note: This article is from the Winter 2017 issue of Brain World magazine. If you enjoy this article, consider a print or digital subscription!)


I typically wake up when my iPhone reads 8:42 a.m. A full minute later, I’m scanning my inbox on my phone, checking for new messages. I might get lucky and find a few worth responding to, and maybe I will write back right then and there, but, more likely, I’ll start a new draft of my reply, and hightail it to the trending news — it being an election year, I’ve probably done that more than ever before, checking election-forecast maps before I’m off to read, and subsequently write, about the most recent findings in science news.

Somewhere in between, I’ll probably cross over four different websites (minimally), and by midday have at least 12 tabs open on my HP Split touchscreen laptop. A few might even be a surprise when I click across the screen about an hour later, maybe halfway into writing my story. There’s of course always the risk that a good portion of the day will be lost to Facebook. It’s work, I convince myself. If I’m to blog, what’s a little promoting my newest post on an array of groups, and occasionally picking an argument on the world’s biggest social network?

It seems like quite a feat when my posts for the day are finally finished — as is the sense of real accomplishment that comes with getting anything done at all. From the moment I unlock my social media, it seems like the world comes rushing in — and there’s not much anyone can do to stop it. The constant barrage of information turns us all into multitaskers (some better at it than others) — but we’re probably all second-guessing how good we are and how good we can be.

In a world where a mix of news and entertainment comes at you fast, and there’s not much escape from, or difference between, the two, it seems almost inevitable that the average attention span in 2016 is a short eight seconds, reportedly even less than that of a typical goldfish. Attention deficit hyperactivity disorder (ADHD), a cognitive disorder that has only entered the public conversation in the last two decades, continues to be a rising concern not only among parents but also among adults themselves, and it has become one of the most frequently diagnosed (and, many argue, overdiagnosed) disorders in the United States.

Nearly 15 percent of American school-age children have been diagnosed with ADHD, a sharp increase from 20 years ago. In addition, an average of 5 percent of schoolchildren in nearly every culture have been diagnosed with ADHD. As of 2013, the World Health Organization estimated that the disorder affected 39 million people worldwide. Yet the disorder remains as controversial as ever among the general public and many in the medical community, with substantial numbers of people doubting the problem even exists.

IS IT REALLY ADHD?

The techniques of diagnosis remain controversial, with many mental health advocates feeling that the criteria for an ADHD diagnosis are too broad, one reason why there has been such a rapid increase in the number of cases. Do all of them really need help? Or have some of them simply picked up multitasking as second nature?

According to a 2010 study conducted at the University of Michigan, the likelihood of an ADHD diagnosis correlated with the age at which children began kindergarten. Those who started school a year earlier were five times more likely to be diagnosed with ADHD than peers who waited an additional year before attending. In many of these cases, the diagnosis may simply reflect ordinary childish immaturity being mistaken for a disorder.

Those who started a year later were also four times less likely to be medicated for ADHD symptoms. Although no large-scale study has examined the negative long-term effects of stimulant therapy, particularly in cases of misdiagnosis, the side effects of commonly used ADHD medications are well documented, among them hypertension and slowed growth.

A key component in determining whether an individual has ADHD is whether their mind wanders in at least two different settings. Everyone has a lapse in attention in class from time to time, but people who suffer from ADHD experience similar problems at home or in the workplace as well. Even activities that once gave them pleasure become difficult to engage in calmly; in other words, people with the disorder are typically uncomfortable being quiet or inert for extended periods of time.

In adults, forgetting appointments or failing to pay bills can be symptomatic of the disorder. It is at this stage that the disorder can become costly (many people with ADHD also fail to review paperwork accurately or forget to return calls) and harder to diagnose (fewer adults are aware of it as a problem affecting them), both reasons why sufferers may have difficulty holding on to their jobs. If left untreated, many of those diagnosed as children can continue to suffer from symptoms well into adulthood.

SO IS IT REAL?

A lot of people simply dismiss the complications of ADHD with, “These kids today.” Such thinking holds that the problem is not any disorder, but lax moral standards that let kids skip their homework or talk out of turn. In fact, ADHD isn’t all that new: psychological literature describes its symptoms as far back as the late 19th century, making the disorder nearly as old as psychology itself. While no cure is known, researchers are increasingly confident about some of its root causes, with genetics playing a crucial role.

An array of twin studies suggests heredity as a primary cause of ADHD. In one Australian study of nearly 2,000 families with twins and siblings, the concordance rate for ADHD in identical twins was 82 percent, compared with 38 percent in nonidentical twins. A child’s ADHD diagnosis also suggests a roughly 5 percent risk that another family member will develop the disorder, or that a parent or other close relative already has it.

While there is no single cause, the gene for dopamine transporter 1 (the primary regulator of dopamine neurotransmission) and the gene for dopamine receptor D4 (associated with disorders like Parkinson’s disease and schizophrenia) are strongly linked to ADHD, perhaps accounting for up to 80 percent of cases. The transporter carries dopamine, a chemical central to the brain’s reward circuitry that helps it recall pleasurable experiences. A mutation in the DRD4 gene can cause a number of behavioral changes, among them the hyperactivity and impulsiveness associated with ADHD.

Environmental factors may play a role in ADHD, too: exposure to lead paint can harm a child’s brain during the first three years of life, resulting in ADHD among other complications. Smoking during pregnancy can also damage the developing central nervous system, increasing the risk of ADHD in the unborn child, as does fetal alcohol syndrome. Nearly 30 percent of children who experience traumatic brain injuries or seizures have been known to show signs of ADHD as well.

While food allergies were an early suspect as a cause of ADHD, no link has been established. Nor is there a correlation between ADHD and watching too much television or other electronic media, a claim that a host of studies has roundly refuted. In fact, our more primitive ancestors may be more responsible for the prevalence of ADHD today than anything else, since natural selection favored a few of its traits.

WHERE WE GO FROM HERE

Our more impulsive ancestors probably found mates more easily than less aggressive hominids did, passing down phenotypes that don’t work out so well in a modern-day office. As the earliest humans migrated across the ever-changing landscapes of Africa in search of food, the constant drive associated with ADHD may have aided endurance and hunting, as would the faster reaction times that made our ancestors ideal predators. And because a number of genes are associated with the disorder, ADHD persists wherever there is gene flow in a large population.

In just a short span of human evolution, as we formed tribes, societies, states, and eventually online social networks, our brains have changed considerably, proof that the human brain is capable of rapid change, a property known as neuroplasticity. In addition to the medications available to treat ADHD, cognitive behavioral therapy, long used to treat obsessive-compulsive disorder and depression, is growing in popularity. Since many adults with ADHD also show symptoms of depression, cognitive behavioral techniques not designed specifically for ADHD can work well in these cases: patients draw out their negative thoughts, the ones that develop in response to their actions and to how they see the world around them, and work on refuting them.


While cognitive behavioral therapy does not replace stimulants in treating ADHD, it helps patients take better control of the environment around them, for example by designing a planner or filing system to manage their day-to-day lives and plan for both the short and long term. While the world around us may have enabled ADHD to thrive and spread, perhaps we can curb it by making that world a better place.

