Practice Makes Some Perfect, Others Maybe Not

Some brains get a musical head start, research shows

When the late, great B.B. King mused about the roots of his success, he recalled happening on his uncle’s guitar as a small boy and picking out some tunes. Did that jump-start his later musical accomplishments? Or was it singing in the church choir? Maybe it was the death of his young mother—which left him with a real feeling for the blues along with a survivor’s sense of self-reliance.

Then there’s the persistence with which young B.B. pursued every performance opportunity, not to mention his tens of thousands of hours of practice.

It’s probable that all these influences contributed to King’s extraordinary musical achievements, along with one that he didn’t mention: the unique responsiveness of certain areas of his brain. A study published in Cerebral Cortex in July shows that unusual activity in specific neural areas can predict how easily musicians learn their chops.

In their experiment, neuroscientists Robert Zatorre, Sibylle Herholz and Emily Coffey at the Montreal Neurological Institute, along with a colleague in Germany, used functional magnetic resonance imaging to assess how music instruction and practice affect the brain.

They studied 15 healthy young adults who had volunteered for keyboard lessons. None of them had any musical training or had ever played an instrument. The goal of the experiment was to see what would happen to their brains once they had some instruction and could pick out some popular tunes on the piano.

First, the volunteers lay in the brain scanner and listened to recognizable songs, like “Hey Jude” and “Amazing Grace.” The researchers thus had a measure of the subjects’ baseline neural activity while they were listening to music.

Six weeks of music lessons followed. Though the participants primarily trained with an online program and practiced on their own, their electronic keyboards automatically uploaded every detail of their playing to the lab’s computers. Then the subjects had another fMRI. While in the scanner they heard the same familiar songs from the first round, along with others they now knew how to play.

By comparing the volunteers’ levels of brain activation before and after training, the researchers found that, as they had expected, the brain did change after learning to play music.

“The parietal cortex and the premotor cortex were more active in the trained subjects when hearing a song that they had learned,” said Dr. Zatorre, the principal investigator. Both areas, which are recruited in the perception of sound and in planning and guiding movement, are clearly important when one has to imagine a note and play the right key.

The data also point to a distinct starting advantage in some people—and where that advantage might reside in the brain. A retrospective examination of the first set of fMRI images revealed which volunteers would turn out to be the best learners.

Those with a hyperactive Heschl’s gyrus (part of the cerebral cortex that is associated with musical pitch) and with lots of reactivity in their right hippocampus (an area linked to auditory memory) turned out to be more likely to remember tunes they had heard before and, after some practice, play them well.

The “kicker,” said Dr. Zatorre, was finding that neural head start. “That gives you an advantage when you’re learning music, and it’s a completely different system from the parts of the brain that show learning has taken place. It speaks to the idea of 10,000 hours.” In his book “Outliers,” Malcolm Gladwell called 10,000 hours of practice “the magic number of greatness.” Dr. Zatorre disagrees, saying, “Is it really fair to say that everyone’s brain is structured the same way, and that if you practice, you will accomplish the same thing?”

B.B. King was too modest to say so. But I think the answer is no.

How Intelligence Shifts With Age

We seem to get slower and wiser at the same time


Is the saying “older but wiser” just an old wives’ tale? We all know people who are as foolish at 60 as they were at 20 and others whose smarts have cruelly diminished with age. Meanwhile, legions of seniors who used fountain pens as children now deftly tap out texts on their tablets. So what’s the truth about old dogs and new tricks?

A study of adult intelligence, published in March in the journal Psychological Science, pits these maxims against the data. The results challenge some common assumptions—including the idea that mental acuity, like athletic prowess, always declines with age.

“Your physical ability changes over your lifetime. At first you can’t do much,” said Joshua Hartshorne, a postdoctoral fellow at the Massachusetts Institute of Technology and the study’s lead author. From infancy on, we get better at walking, jumping, climbing and running. But in our early 20s our physical abilities begin to decline, he said. Is such waxing and waning also true for mental ability? “There are two competing ideas,” he added. “As you get older you’re slowing down, and as you get older you’re getting wiser.”

The new research suggests that we are getting slower and wiser at the same time.


Dr. Hartshorne and his colleague, Laura Germine at the Massachusetts General Hospital, took a close look at the development of cognitive abilities as we age and discovered that different skills have different timetables. Some abilities mature early, such as how fast we recall names and faces. Others, like vocabulary and background knowledge, are late bloomers.

This nuanced view of human smarts is based on separating our thinking process into discrete slices—as opposed to viewing intelligence as a single unit, typically called "g." This study's findings conflict with most previous research, which shows that "g" is fairly stable across the lifespan. One study followed up on 87,500 Scottish children tested at age 11 and found that their intelligence hadn't changed much 65 years later.

In this new study, the researchers did both retrospective sleuthing and online testing. First they reanalyzed the scores generated by those who took standardized Wechsler IQ and memory tests in the early 1990s—right at the time these tests were created. By dividing the 2,500 adults who first took these tests into 13 age groups, the researchers were able to chart the trajectory of individual skills, from adolescence to retirement and beyond. To get a more textured picture, the scientists added survey and Internet-based tests of reasoning, memory and social intelligence.

The results showed that our intellectual capacities shift as we age. Processing speed—how fast we absorb and rejig numbers, names and facts—peaks around 18, then “drops off a cliff,” said Dr. Hartshorne.

How much we can remember and manipulate at one time—the size of that mental notepad, which is called working memory—is at its prime in our mid-20s and plateaus around age 35. Then it moves from center stage.

That’s when our emotional intelligence kicks in. The ability to assess people’s emotional states from a photo of their eyes peaks around age 40 and doesn’t decline until our 60s. One form of wisdom, the ability to guess people’s internal states from very little information, is as useful around the kitchen table as it is in the boardroom, and “the difference between a 20-year-old and a 40-year-old is just huge,” Dr. Hartshorne said.

The gains don’t end there. As our hair becomes silver (and sometimes falls out), our vocabularies continue to grow, peaking as late as age 70, said Dr. Hartshorne, adding that 20 years ago tests of vocabulary showed that it crested much earlier, at age 50. “Given the way we’ve chosen to define intelligence, people are getting smarter,” he said.

This is an encouraging sign. If humans continue to learn into their seventh decade, then at least one platitude is true. You can teach an old dog new tricks.

For a Longer Life, Get Social

Research suggests that more than our eating and exercise habits, our social lives can predict how long we’ll live

Jun. 25, 2015 2:53 p.m. ET


Personal independence is such an iconic American value today that few of us question it. In previous generations, retirees lived with family, but now that a large swath of older people can afford to live on their own, that’s what they choose. The convenience of digital devices means that we can now work, shop and pay our bills online, without dealing directly with other people. According to the U.S. Census, 10% of Americans work alone in remote offices and over 13% live alone, the highest rate of solo living in American history.

But is this go-it-alone ideal good for us? New research suggests that, even if you enjoy being by yourself, it just might kill you—or at least shorten your life.

A team led by Julianne Holt-Lunstad at Brigham Young University showed that living alone, or simply spending a lot of your time on your own, can compromise your physical and psychological resilience—whether or not you like your solitude. Published in Psychological Science in March, their data show that how much real social interaction you get is a good predictor of how long you will live.

If you fit into one of three categories—living alone, spending much of your time alone or often feeling lonely—your risk of dying within the next seven years is about 30% higher than it is for people who are otherwise like you. Based on a meta-analysis comprising 70 studies and over 3.4 million adults, the team’s findings reinforce a growing consensus: In-person interaction has physiological effects.

Scientists have long known that loners are likely to die well before their more gregarious neighbors. A landmark longitudinal study published in the American Journal of Epidemiology in 1979 followed nearly every resident of a northern California town for nine years; its results showed that people who not only had intimate partners but also met regularly with others to play bridge or volunteer at church were twice as likely to outlive those who led solitary lives. Still, critics wondered whether social contact was really the key. Perhaps the social butterflies were healthier to begin with, or the more isolated people had hidden problems, such as depression or disability, that cut their lives short.

Dr. Holt-Lunstad’s team controlled for these confounding factors. What’s more, they discovered that the effect isn’t always a matter of preference or state of mind. We used to think that subjective experience was all that mattered. You could be single or married, spend your days alone or in a throng of people; if you often felt lonely, the thinking went, your blood pressure would spike and your immune function would suffer.

The new research found, however, that objective measures of the amount of human contact you get are as critical to your survival as your opinion of your social life. “I’ve spent almost my whole career studying social support, and I absolutely know the strong effects that our perceptions have on our physiology,” Dr. Holt-Lunstad told me. “But there are other determinants of health that are independent of our perceptions. Even if we hate exercise or broccoli, they’re still good for you.” Our intuitions don’t always point us in the right direction either, she added. “There are things that we enjoy greatly that are bad for our health, like eating those rich, fatty desserts or that big, juicy burger. We take great pleasure in things that are not that great for our health.”

In fact, in a study from 2010 in PLOS Medicine, Dr. Holt-Lunstad showed that our social lives are more faithful predictors of how long we’ll last than our eating and exercise habits.

We are still left with the why question. One piece of the puzzle has to do with hormones. Oxytocin and vasopressin are secreted when we are near enough to hug someone—even if we just shake hands. These hormones reduce stress, kill pain and allow us to let down our guard, all of which contribute to long-term resilience. Real encounters can also switch on and off genes that control our immunity and the rate of tumor growth. But it isn’t all about neurochemistry. As Dr. Holt-Lunstad notes, “just having someone around to call 911 can be a matter of life or death.”