The Future of Artificial Intelligence


Are you familiar with the “Battlestar Galactica” franchise? It’s a military-inspired sci-fi television series that has spanned several decades (sporadically), gaining a massive following worldwide. While the plot is rather elaborate and features multiple story arcs with an uncanny amount of detail, the premise is rather simple: A civilization of humans is engaged in war with a cybernetic race that they themselves created. Classic.

Humanity has always been fascinated with the idea of artificial intelligence — if we manage to create it, it would be one of the pinnacles of human innovation, our own version of godhood, a man-made miracle. And, giving credit where it’s due, we have made leaps and bounds in the right direction with recent technologies; just take a look at our sophisticated mobile devices, super-smart cars, and robots, built by tech wizards, that can function at the level of a 4-year-old.

Yet, in order to understand what the future holds for artificial intelligence, it’s important to first understand our own intelligence. How can it be defined, anyway? What determines it, and can it be predictably manipulated?

For a long time, intelligence was thought to be based entirely on heredity. The idea was first strongly advocated by Cyril Burt (1883–1971), a psychologist who used studies of twins to support his theory. He asserted that twins who had been separated at birth and raised in different environments still tended to have virtually the same IQ, meaning that intelligence must have been genetically passed down to them from at least one of the parents (but hopefully both).

While his ideas became widespread and rather popular, his arguments may have been falsely supported. For one, he reported virtually identical statistical results across different studies, which, mathematically speaking, is rather unlikely. He also argued that all of the twins used in his studies came from very different environments, a claim that was also probably fabricated. One thing remains certain — for Burt, intelligence was an innate ability passed down from parent to offspring.

While there is evidence that suggests a genetic influence on intelligence, there are many other points to consider. In recent years, advances in neuroplasticity research indicate that, with consistent practice, our overall cognitive function can be improved and retained well into the golden years. We can sharpen our focus, memory, and retention, in effect becoming more efficient problem solvers and agile thinkers. Simply put, the success of brain-training programs is based on the notion that intelligence can be cultivated, but we are only at the beginning stages of learning how to do so.

The Cattell-Horn-Carroll theory of cognitive abilities suggests that there are many different types of intelligence as opposed to just one. Fluid intelligence, which is what brain-training programs target, encompasses our ability to use new information and procedures to form concepts, solve problems and reason on a broad scale. Crystallized intelligence reflects an individual’s ability to communicate, reason, and utilize previously learned skills and experiences — think of it as the depth and breadth of an individual’s acquired knowledge. And then there’s quantitative intelligence, the ability to manipulate numerical symbols derived from the comprehension of quantitative relationships and abstract concepts.

Computer code, too, is built from mathematical concepts and numerical symbols, and it is the raw material of artificial intelligence. Does that mean computers comprehend mathematical concepts just like humans do? Not exactly, but they are inching closer every day. Moore’s Law observes that the number of transistors on a chip (and, roughly, a typical computer’s power) doubles about every two years. With the way technology has evolved over the last decade — think mobile phones, smart technology, Google Glass, and so on — it’s hard but exciting to imagine what lies on the horizon.
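For readers who like to see the arithmetic, here is a minimal sketch (written in Python, purely illustrative, and assuming an idealized two-year doubling period that real hardware only roughly follows):

    # Idealized Moore's Law-style doubling: power grows 2x every 2 years.
    DOUBLING_PERIOD_YEARS = 2

    def relative_power(years_elapsed):
        # Computing power relative to the starting point.
        return 2 ** (years_elapsed / DOUBLING_PERIOD_YEARS)

    for years in (2, 10, 20):
        print(f"After {years} years: about {relative_power(years):.0f}x the starting power")
    # Prints roughly 2x after 2 years, 32x after 10 years, and 1024x after 20 years.

In other words, a single decade of steady doubling works out to roughly 32 times the starting power, which is why even a simple rule of thumb like this makes the near future hard to picture.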

Reading and writing ability is also considered an aspect of intelligence, as are short-term memory, long-term memory, visual processing, auditory processing, general processing speed, and decision/reaction time to a particular stimulus or task. Yet we are creating machines that are able to do all of this at the same level as us, if not better.

The Merriam-Webster dictionary defines intelligence as “the ability to learn or understand or to deal with new or trying situations … the skilled use of reason … the ability to apply knowledge to manipulate one’s environment or to think abstractly as measured by objective criteria.” Interestingly, this particular definition doesn’t say anything about intelligence belonging solely to humans.

Granted, the Christian Science definition states that it is “the basic internal quality of divine mind.” But what about a LiveScience article by Tanya Lewis that talks about the possibility of digital singularity in the near future? Anticipated by futurists as arriving as soon as 2045, the idea proposes that the human mind could be uploaded to a computer and stored there until the end of time. Yes, the idea sounds completely and utterly mad, but is it impossible? Only time will tell. Furthermore, would we consider it a feat of human or artificial intelligence if it ever happens?

“Mainstream Science on Intelligence,” a public statement issued by a group of academic researchers in 1994, had this to say about intelligence: “A very general mental capability that, among other things, involves the ability to reason, plan, solve problems, think abstractly, comprehend complex ideas, learn quickly and learn from experience. It is not merely book learning, a narrow academic skill, or test-taking smarts. Rather, it reflects a broader and deeper capability for comprehending our surroundings, ‘catching on,’ ‘making sense’ of things, or ‘figuring out’ what to do.”

Now, fast forward to 2013: PC Magazine published an article by Stephanie Mlot, who wrote, “ConceptNet 4, an MIT-developed AI system, was put through pre-K boot camp, running the verbal portions of the Wechsler Preschool and Primary Scale of Intelligence Test — a standard IQ assessment for young children … The supersmart computer scored uneven marks across different portions of the test — a red flag for most kids.”

If these results belonged to a real child, they would indicate that something was wrong with his or her cognitive function, but, since they belong to a computer, they essentially mean that scientists and engineers are rather close to building a computer that can complete a real IQ test. Computers are actually showing the ability to reason, solve problems, and perform comparably to a 4-year-old child, albeit slightly below par. And since IQ tests are really used to predict how someone (or something) will perform in an academic environment, they’re not the best way to test for real intelligence and its various types.

When examining real and artificial intelligence, it’s hard not to think of the works of Douglas Adams, who blurred the lines between the two. He wrote about depressed robots, doors with happy dispositions, and spaceships with their own unique intelligence and peculiar personalities, as well as the humans who actively coexist, collaborate, and conspire with them. The future, according to Adams, is full of different real and artificially intelligent beings. They are simply a normal part of our not-so-distant and integrated lives, which is a much better scenario than the one posed in “Battlestar Galactica.”

This article was first published in Brain World Magazine’s Fall 2013 issue.
