Human beings learned to learn long before some enlightened soul in a powdered wig pronounced the value of “education.”
Formal education was a transformational idea and needs no introduction or defense. It was so good that, down through the generations, its language, culture, and schedules—dividing the day into chunks (classes), insisting on routine, prolonged concentration, monkish devotion—came to define how people presume the brain works most efficiently, or should work.
Only it doesn’t, not even close.
School is a recent invention, and so are its customs. The ancient civilizations we learned about in middle school date from a few thousand years ago, no more. Humans have been around for hundreds of thousands of years, and it was during that endless diaspora in the wild that the nervous system grew up. As we foraged for food and water, the brain adapted to pick up cues on the fly, piecemeal, in the dark and cold, under a scorching sun, over changing terrain.
It became a forager, too, for information, for survival strategies, and for skills.
This gap—between how the brain is presumed to learn, and its foraging nature—is large enough that cognitive and memory scientists have had a field day. In the past decade or so, they have produced a string of findings that violate the standard study advice: to work in a dedicated, quiet place, according to a strict ritual (no distractions allowed!). This research has inspired Silicon Valley startups; generated large-scale experiments in public schools and university classrooms; and, for those in the know, provided an off-the-shelf menu of techniques that can be tailored to deepen learning of specific topics.
Consider foreign languages. When building vocabulary in a new language, it is better to split study time into increments—an hour today, an hour tomorrow, an hour the night before the test—than to put in three hours all at once. This is the kind of advice moms used to dole out regularly, only they never quoted the research: People remember up to two times more French verbs (or English verbs, or state capitals, or bird species) when using incremental, “spaced” study compared with the concentrated kind, scientists have found.
It’s worth stopping to savor that simple result: up to twice the recall on a vocabulary test, with no extra time, and no extra effort, required.
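The spacing idea is simple enough to sketch as a toy scheduler: take the same block of study time and spread it evenly across the days available, ending the night before the test. The function name, the even split, and the night-before cutoff are illustrative assumptions for this sketch, not prescriptions drawn from the research itself.

```python
from datetime import date, timedelta

def spaced_schedule(total_minutes, start, test_date, sessions=3):
    """Split one block of study time into evenly spaced sessions,
    with the last one landing the night before the test.
    (An illustrative rule of thumb, not a formula from the studies.)"""
    last_day = test_date - timedelta(days=1)      # final review: night before
    span = (last_day - start).days                # days we have to spread over
    sessions = max(1, min(sessions, span + 1))    # can't have more sessions than days
    gap = span // (sessions - 1) if sessions > 1 else 0
    per_session = total_minutes // sessions
    return [(start + timedelta(days=i * gap), per_session)
            for i in range(sessions)]

# Three hours of French verbs, test on January 8:
plan = spaced_schedule(180, date(2024, 1, 1), date(2024, 1, 8))
# -> one hour each on Jan 1, Jan 4, and Jan 7
```

The point the sketch makes concrete is the one in the text: the total time is unchanged; only its distribution differs.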
Testing is itself perhaps the strongest learning technique available. Think of testing in this context as self-examination: running through a stack of flashcards, or reciting concepts in front of a mirror, or even playing teacher and explaining the material to a friend. For that matter, sparring with a classmate—who can translate simple English sentences into French fastest, after a cocktail? after three?—also counts as self-examination.
In scores of careful studies, in various fields, cognitive psychologists find that devoting about half of study time to self-examination ramps up a person’s score on a delayed test—say, a week later—by about 30 percent. Here again: No extra time or effort required.
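The self-examination half of that split can be sketched as a bare-bones flashcard drill: quiz yourself on prompt-answer pairs and keep the misses for re-study. The card format, the `ask` callback, and the keep-the-misses rule are assumptions made for this sketch, not details from the cited studies.

```python
def self_test(cards, ask):
    """Drill (prompt, answer) flashcards; `ask` supplies your response
    to each prompt (e.g. input() in real use). Returns the cards you
    missed so they can go back into the study pile."""
    missed = []
    for prompt, answer in cards:
        if ask(prompt).strip().lower() != answer.lower():
            missed.append((prompt, answer))
    return missed

# Usage: drill two French verbs, getting one wrong.
cards = [("parler", "to speak"), ("manger", "to eat")]
my_answers = {"parler": "to speak", "manger": "to drink"}
missed = self_test(cards, lambda prompt: my_answers[prompt])
# -> [("manger", "to eat")]
```

Passing in `ask` rather than calling `input()` directly keeps the drill testable; in practice it would just prompt at the keyboard.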
And that’s only Learning Science 101. As the field deepens, it has begun to turn up many other specific, practical strategies that can be adapted to music, athletics, and technical subjects such as math and engineering.
For example, taking a break during a problem-solving session—allowing yourself a real distraction, such as browsing social media or streaming a favorite TV show—is, in fact, the best way to increase the likelihood you will solve the problem you are stuck on. That means distraction is good, not bad, when engaged at the right time (you do have to come back to the problem, not get lost in the show).
Procrastination can work in one’s favor, too. When facing a large, creative project, the temptation is to put off starting it and let the clouds darken with each passing day. That’s the bad kind of procrastination. The good kind is only slightly different: Start the project, even if you can work for only 15 minutes. Then put it down. That act in itself activates the brain, both consciously and subconsciously, to begin compiling material that is relevant, both from the outside environment—from news reports, for instance, or even a casual chat—and from the jump-started internal conversation in your own head.
The next time you sit down, you will have far more to say than you expected to have.
These aren’t merely “tips” or “techniques.” They are small alterations that have outsize effects on performance, rooted in years of rigorous cognitive science. They also provide the basis for what no one ever seems to have: a strategy that guides how to learn most efficiently, depending on the topic at hand and the time available.
Formalizing learning science is not as easy as it might seem: Individual learning is one kind of process, and group learning is another, with its own chaos and rhythm. In a recent experiment, psychologists at the University of Texas incorporated a kind of self-examination into their introductory course. Performance improved, as a rule, but the students bristled at the added testing, and some dropped out.
Learning Science could become a course of its own, beginning in middle school, when students first become self-directed learners. There, they'd come to appreciate both the ancient, foraging brain and the modern, classroom one, and learn to coax the former into juicing the latter.
In a world now swimming in information, ever-advancing technology, and specialized demands, just knowing that primal neural machinery is there—adaptable, congenial, an old soul eager to help—is its own kind of secret sauce: a mental GPS tuned to navigate through any wilderness, even one of our own making.
Benedict Carey is a science reporter for The New York Times and the author of How We Learn: The Surprising Truth About When, Where, and Why It Happens.