“If we take a serious look around us, the entire fiction that the IQ test fully measures intelligence rapidly disintegrates.”
Robert Sternberg, author of The Triarchic Mind
To get a firm grasp of the Learning Code, and how to apply it, it is important that we embrace more scientific definitions of intelligence. In this Element we will look at intelligence from the biological perspective and how this view can help you to more successfully survive and thrive in the world. It will also cover how the present outdated view of intelligence actually limits our ability to increase it.
It can be said that the ultimate goal of any scholastic, government or corporate learning institution is to increase an individual’s intelligence – either in general or in a specific area. But before we can hope to increase intelligence, we must be very clear about what we mean by it. Unfortunately, our reliance on what I call Ancient Learning Theory (see “Reasons for Learning Failure in the 21st Century”) has allowed us to labor under some core assumptions about intelligence that no longer serve us.
Read more to learn why your ability to adapt to the world is more important to your success than your IQ, SAT scores, or grades on academic tests. (Because it’s not easy to quickly comprehend why our existing view of intelligence has been so far off the mark, this section is somewhat in-depth.)
Ever since the fourth century, when St. Augustine, the father of faith, declared, “The author and prime ruler of the universe is intelligence,” man has been fixated on trying to define and measure intelligence. Ancient Learning Theory (see “Reasons for Learning Failure in the 21st Century”), based on the efforts of Plato, Aristotle, St. Thomas Aquinas, Descartes, and Newton, has led us down the wrong path, where we have come to define intelligence primarily as the ability to do well on pen-and-paper tests – either standardized intelligence tests or tests given in academic settings. Those of us who got high scores were considered more intelligent than those who received lower scores. At the turn of the twentieth century, with the development of their 30-question IQ test, French psychologists Alfred Binet and Théodore Simon helped propel us toward another false belief: that the scores we received on intelligence tests could accurately predict our future success.
This misconception has led to what one of the world’s leading experts on intelligence, Harvard’s Howard Gardner, creator of the concept of multiple intelligences, calls the “IQ way of thinking.” That is, we believe that you are either born smart or not, that your level of linguistic aptitude will determine your future success, and that written tests can predict whether or not you will do well in life. But, under the light of research, our previously held beliefs about what high IQ/SAT scores, good grades, and class position really mean to our future success begin to crumble. Studies indicate that these measures are very good at identifying who will do well within the school system but sadly are very poor indicators of who will do well in the real world.
Gardner has become so discouraged by our emphasis on test scores that he wrote in the 10th anniversary edition of his groundbreaking Frames of Mind, “First of all try to forget that you have ever heard of the concept of intelligence as a single property of the human mind, or that instrument called the intelligence test, which purports to measure intelligence.”
Biological and Evolutionary Perspectives of Intelligence
If the research is telling us that intelligence is not best measured by your school grades and scores on intelligence tests, then what is intelligence? Simply put, it’s best defined as your ability to successfully adapt to the world around you. Systems analysis and the new sciences are forcing leading edge scientists and educators to embrace more biological and evolutionary views of intelligence. With this more encompassing perspective, we can see that for 3.5 billion years, ever since lightning struck the primordial soup and gave rise to life, organisms have increased their intelligence by increasing their ability to adapt their behaviors to “fit” the world.
If you are not good at adapting to your environment, your ability to survive and thrive is severely limited. Forget the test questions posed to you by authority figures. The most important question you need to answer is: “In my present environment, can I adapt my actions well enough and quickly enough to survive and prosper?” Gardner points out that our traditional linguistic-based educational system and intelligence testing “all ignore biology; all fail to come to grips with the higher levels of creativity; and all are insensitive to the range of roles highlighted in human society.” But without a biological and evolutionary perspective, we will continue to dupe ourselves into believing that our scores on a pen-and-paper test can define intelligence and predict our future success in the real world. As Derek Bickerton, author of Language and Human Behavior, says, “Let us look at intelligence in evolutionary terms, hopefully it will soon be regarded as impossible to think about mind and intelligence in any other terms.”
We have become so fixated with the power of the test – IQ, state equivalency, SAT, GRE – to predict our future success that it could be said that the sole goal of our educational system is to increase a student’s ability to do well on a test. In 1923, as pen-and-paper testing was embedding itself into the fabric of the world’s learning systems as the ultimate determinant of intelligence, Edwin Boring, a pragmatic scientist at Harvard, flatly proclaimed the view of intelligence that most of the world still wrongly holds today: that intelligence is “the capacity to do well on an intelligence test. Intelligence is what the test tests.” As Peter Relic, past president of the National Association of Independent Schools, says, “Our fanatical emphasis on testing and academic performance sends the wrong message.” A test score becomes more important to us than who a child is as a human being. That means an immense loss of human potential, because we have defined success too narrowly.
Rethinking Our Previous Position on Intelligence: The Adaptability/Intelligence Factor
We cannot continue to build learning systems that try to increase our ability to succeed in the real world by focusing primarily on how to increase test scores. Therefore, to drive home the evolutionary and biological perspective of intelligence, I use the term Adaptability/Intelligence Factor, or the AI Factor, whenever we discuss intelligence that supports our success in the real world. Until we recognize that our ability to succeed in life is tied to our AI Factor, we will continue to build learning systems that better prepare people for the academic world than for the real one.
The idea that our intelligence test scores and our grades are predictors of our future success is so ingrained in our society that it is important here to take the space to see how experts in the field have debunked this previously held position. Most readers would be amazed to find that many experts who take a scientific view of learning (whom we call brain-based educators) strongly reject the value of intelligence and aptitude tests.
Daniel Goleman, author of Emotional Intelligence, nicely sums up the new perspective: “The tests that tyrannized us as we went through school – from the achievement tests that sort us out into those who would be shunted toward technical schools and those destined for college, to the SATs that determined what, if any, college we would be allowed to attend – are based on a limited notion of intelligence, one out of touch with the true range of skills and abilities that matter for life over and beyond IQ.” He goes on to say, “One of psychology’s open secrets is the relative inability of grades, IQ or SAT scores, despite their popular success, to predict unerringly who will succeed in life” (see “Why Tests Don’t Work”).
The Fallacy of a Single Test Score
Yale’s Robert Sternberg, who has committed his life to studying intelligence and is the author of over 40 books on the subject, writes in The Triarchic Mind: A New Theory of Human Intelligence, “For decades intelligence testers have been selling the public on the notion that a single score … reveals the single basic fact about people’s intelligence. Yet there is little evidence that any scientist studying intelligence – past or present – actually has believed it is just a single thing. All the same the idea has a long history.” He concludes, “If we take a serious look around us, the entire fiction that the IQ test fully measures intelligence rapidly disintegrates.”
Leta Hollingworth, an expert on the study of gifted people, conducted seminal research that revealed a disturbing finding: as IQ increased beyond a certain point, people’s job performance and salary actually decreased! The Berkeley study on creativity showed a lack of correlation between IQ and independent thought and action; between IQ and the ability to value or possess a good sense of humor; and between IQ and the ability to appreciate beauty, complexity, or novelty. In addition, the study found that IQ did not even closely correlate with the ability to be reasonable.
Intelligence researchers Fred Fiedler and Thomas Link found that “cognitive ability tests have been notoriously poor predictors of leadership performance” and that “leader intelligence under certain conditions correlates negatively with performance.” Jensen Barclay, a professor of education who has written several books on intelligence, notes that those with IQs in the top 2 percent of test takers – the threshold for membership in the organization Mensa – habitually fail to achieve levels of real-world success that match the high scores they once achieved on their IQ tests.
The kind of mind that can do well on tests of miscellaneous facts is not necessarily the kind that does well in the real world. When asked why he had not received a promotion in eight years, the all-time money winner on Jeopardy (until the rules changed in 2004) said flatly, “The Jeopardy mind is not the most useful mind in the real world.” Years ago, on a radio show – and later a television show – called Quiz Kids, the “brightest and the best” children, with genius IQs of 140 to 200, were chosen to strut their stuff. But, after following these children into adulthood, Ruth Feldman found that their lives were noticeably less distinguished than their IQs would have predicted.
Commenting on intelligence tests, Susan L. Barrett, author of It’s All In Your Head, states, “Why do people keep taking (and giving) them? For one reason, old habits die hard. Our schools are used to testing. They think that test scores can predict the students’ future success – even though this has never been proved!” After reviewing the research, Gordon Dryden and Jeannette Vos conclude in their bestselling book, The Learning Revolution, “Possibly the worst education invention of this century was the so-called intelligence test.”
Good Grades No Predictor of Success
Like intelligence test scores, good grades and high class position are also poor predictors of future success in life. Consider the study that followed valedictorians and salutatorians from the 1981 graduating classes of Illinois high schools. It found that, while these students had the attributes to ensure school success, those characteristics did not necessarily translate into real-world success. By their late 20s, these superior students had reached only average levels of success in life. Only one in four was achieving at the highest level in his or her chosen profession, and the rest were doing much less well. Karen Arnold, professor of education at Boston University and one of the researchers tracking the valedictorians, explains, “To know that a person is a valedictorian is to only know that he or she is exceedingly good at achievement as measured by grades. It tells you nothing about how they react to the vicissitudes of life.”
In the fascinating book The Millionaire Mind, Thomas J. Stanley and Jon Robbin, a Harvard-trained mathematician, did in-depth statistical research to identify which variables led people to become super wealthy and successful in business. Their research found that, contrary to popular belief, there was no significant statistical correlation between how successful these individuals were later in life and their grades in school, their class position, or their SAT scores.
In another revealing study, done in 1998, it was found that 15 percent of the individuals on Forbes magazine’s list of the 400 wealthiest Americans either did not start college or dropped out. Amazingly, these 58 dropouts’ average net worth was not less but more than their contemporaries’ – and not by a little. Their average net worth was $4.8 billion, more than three times that of their college-graduating peers, who averaged $1.5 billion. And when these individuals who were not suited to the school system were compared to graduates of our most prestigious Ivy League schools, such as Harvard, Yale, and Princeton, it was found that their net worth was 200 percent higher.
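For readers who want to see where comparisons like these come from, here is a minimal sketch in plain Python showing how a “times” multiple and a “percent higher” figure are computed from two averages. The dollar amounts are the ones cited in the passage above; the variable names are illustrative, not part of the study.

```python
# Averages cited in the passage, in billions of dollars.
dropout_avg = 4.8    # average net worth of the 58 dropouts
graduate_avg = 1.5   # average net worth of their college-graduating peers

# "X times" multiple: simple ratio of the two averages.
ratio = dropout_avg / graduate_avg

# "Y percent higher": the gap expressed relative to the smaller baseline.
pct_higher = (dropout_avg - graduate_avg) / graduate_avg * 100

print(f"{ratio:.1f} times the graduates' average")  # 3.2 times
print(f"{pct_higher:.0f} percent higher")           # 220 percent higher
```

Note that a “percent higher” figure always depends on which number is taken as the baseline, which is why the plain ratio (“more than three times”) is the less ambiguous way to state the comparison.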
The founder of Kinko’s, Paul Orfalea, did poorly in the school system, and he recounts that, to bolster his feelings, his mother used to tell him, “The A students work for the B students, C students run the business, D students dedicate the buildings.” Consider that five of the most important minds shaping the information age did not even finish college: Bill Gates and Paul Allen, founders of Microsoft; Steve Wozniak and Steve Jobs, founders of Apple; and Michael Dell, founder of Dell Computers.
“First of all try to forget that you have ever heard of the concept of intelligence as a single property of the human mind, or that instrument called the intelligence test, which purports to measure intelligence.”
Howard Gardner, author of Frames of Mind
The Problems With Academic Performance
In simplest terms, academic performance and intelligence tests fail to predict real world success because the knowledge and traits that they emphasize are not necessarily the things we need to survive and thrive in real life. Unfortunately, abilities like efficient memorization of details, writing essays, and conforming to what authority demands often have very little to do with attaining success outside of the classroom.
As we see in other sections of this web site (“What Is Learning?”), the learning and memory that we use to help us survive and thrive in the real world is created when we experience neurological growth. The fastest way to increase this growth, which increases our Adaptability/Intelligence Factor, is by having physical experience in real world environments. In other sections of this site (see “Experience Beats Linguistic Learning Every Time”), we cover in-depth why experience in the real world is a much better promoter of neurological growth than linguistic-based classroom learning systems.
The amazing discoveries made in the 20th and 21st centuries by quantum physics and complexity science give us one more reason that IQ and grades are such poor indicators of future success. These new sciences reveal an incredibly multifaceted world in which predictability breaks down. Newtonian physics maintained that humans were like machines: if we had the right measurements, we could make exact predictions. The classical science view held the whisper of “Ye shall be God.” But quantum physics and complexity science have revealed a world so chock-full of variables exerting influence on each and every one of us, in so many different directions, that predicting accurate futures from present information becomes a hopeless and possibly damaging endeavor. If our academic vehicles of prediction worked, young educational misfits such as Darwin, Bell, Edison, Einstein, and Churchill should never have amounted to anything, let alone become some of the greatest contributors of our modern age.
Are We Breeding Brains to Fit the School System, Not the Real World?
Locking a young brain up in a 50-by-50-foot box for 12 to 16 years during its most malleable periods, when special windows of learning are open their widest, forces our neurological structures to automatically adapt their shape, and thus our behaviors, to fit this academic environment. Sadly, as we have all experienced, the neural networks shaped in our childhood grow resistant to change as we age. Therefore, most of us exit school and enter early adulthood with large groups of networks hardwired to succeed in the isolated world of academia but not in the teeming, sensory-rich, complex everyday world. This shaping of our neural networks to “fit” the educational system more effectively than the real world may be one reason why valedictorians do not necessarily achieve real world success and why so many of us exit academia feeling unprepared to deal with life (see “Breeding Out Personal Meaning by Extrinsic Motivation”). College and high school dropouts such as Gates, Allen, Jobs, Wozniak, and Dell may be so successful in the real world because their neural networks spent less time in academia and were less molded to fit the traditional academic setting.
In the book Cracking the Learning Code and in future newsletters you will discover:
How you can accelerate your speed of learning by increasing the activation of specialized “memory molecules,” called CREB.
Why the brain’s NMDA receptors are called Doogie receptors, after child genius and TV character Doogie Howser, M.D.
How your memory capacity can be increased and new neurons created by causing the simultaneous firing of neurons in your brain.
How personally meaningful stimuli can generate the massive neural firing that is the basis of all long-term memory formation.
How, in order to alter your behaviors, you must first induce the massive simultaneous firing of neurons in your brain.