Bystanders Who Intervene in an Attack

https://www.wsj.com/articles/bystanders-who-intervene-in-an-attack-11563464806?mod=searchresults&page=1&pos=1

In 1964, the Kitty Genovese case taught the world that strangers wouldn’t come to a victim’s aid. New research suggests that, in fact, they usually do.

ILLUSTRATION: TOMASZ WALENTA

If you were assaulted in a public place, do you think anyone would intervene? Or would they just look down at their shoes and walk on by?

Most people expect very little help from strangers, especially in the big cities to which vast populations in the modern world have migrated over the past century. Having once lived overwhelmingly in far-flung rural hamlets, Americans have long seen cities as anonymous, dangerous places.

In 1964, the fatal stabbing of Kitty Genovese made this sense of threat more palpable. Genovese, a 28-year-old woman returning from a night shift, was brutally attacked in the entrance of her Queens, N.Y., apartment building. Thirty-eight of her neighbors purportedly heard her screams and did nothing to help. Her story launched a new psychological term: the Bystander Effect, which refers to the idea that the greater the number of bystanders, the less likely people are to act as good Samaritans.

But there was a problem with both the term and the story. Many of the details of the Kitty Genovese story turned out to be false. She did die at the hands of a violent stranger, but subsequent sleuthing revealed that several bystanders did, in fact, try to intervene. And a study published last month in the journal American Psychologist confirms that bystanders aren’t as passive as we once thought. Not only will an observer often step forward in a real crisis, but the more people are present, the more likely the victim is to get assistance.

“It only takes one person to help the victim,” said Richard Philpot, the paper’s first author and a research fellow in psychology at Lancaster University and the University of Copenhagen. Working with three colleagues, Dr. Philpot broke away from previous approaches to documenting the Bystander Effect. Instead of simulating a violent emergency while groups of various sizes looked on, the new study analyzed closed-circuit TV footage of real people interacting in public spaces.

In other words, Dr. Philpot’s team focused on genuine conflict—ranging from animated disagreement to physical violence—that spontaneously arose in public places in Amsterdam, Cape Town and Lancaster, U.K.

Four trained coders pored over footage from these cities’ surveillance cameras, culling 219 aggressive incidents from 1,225 video clips. The coders looked at the size of the crowd and zeroed in on any effort to calm the aggressor, to block contact between the two parties or pull the aggressor away, or to provide practical help to the victim.

What emerged was surprising, in more ways than one. Strangers intervened in nine out of 10 violent incidents. And the more people were around, the more likely it was that someone in trouble would get help. The consistency of the findings was remarkable: “South Africa was the only place where we saw weapons such as machetes, axes or knives,” said Dr. Philpot. “But victims were equally likely to be helped in conflicts there as they would be in the U.K. or the Netherlands.”

Some questions remain: How do people know when it’s safe to intervene, and would these findings hold in less populated places? But the new study is reassuring, at least for city-dwellers. There is indeed strength in numbers. It’s too late for Kitty Genovese, but there’s still time for the rest of us, who could be that one person in a crowd who steps forward to help.

 

Envy at Its Worst: In the Future Tense

https://www.wsj.com/articles/the-worst-form-of-envy-in-the-future-tense-11560527404

A new study shows that we’re more jealous of our friends’ plans and prospects than of their past accomplishments.

ILLUSTRATION: TOMASZ WALENTA

It is better to be envied than to be pitied, wrote the Greek historian Herodotus, and in our use of social media it’s clear that most of us agree. After all, why post selfies of yourself and your sweetheart lifting champagne flutes en route to Thailand if not to induce an eat-your-heart-out response in your friends?

Ubiquitous public displays of everyone’s happy moments—with the low points edited out—are one reason, according to a 2017 study, that most of us believe other people lead richer social lives than we do. Research shows that most of us think we are better looking, smarter, more competent and of course less biased than other people. But our perspectives do a 180 when it comes to our social lives.

When we hear about our friends’ plans to share a summer cottage on the seashore, for example, we may feel green with envy. We imagine that they’ll have a blast—the group dinners on the screened-in porch, swimming together at the pond, the bike rides to get ice cream. The idyllic possibilities are enough to make a working stiff with one week of holiday gag.

Now a new study, published last month in the journal Psychological Science, adds a counterintuitive twist to this familiar story: Other people’s plans for the future irritate us much more than experiences they’ve had in the past. At first this idea might seem strange. Don’t we feel envy for people’s accomplishments and possessions, such as their expensive cars? We do, but the study shows that jealousy stings even more when we think about the good things that lie in their future. In fact, our envious feelings ramp up as someone else’s fortunate occasion gets closer—and plummet once the event is over. Apparently envy is time-stamped, like an email or a tray of raw chicken from the supermarket.

In the study, researchers asked 620 participants to imagine their best friend in each of five circumstances: on a dream vacation, a dream date or in the ideal job, car or house. To make the exercise more personal, the participants were asked for the initials of their best friend. Then they had to rate, on a scale of 1 to 7, how they would feel during the days and weeks leading up to each drool-worthy event, as well as the days and weeks that followed. “Imagine the 10 days and nights of your friend in Maui next month. How do you feel about this?” said Ed O’Brien, one of the study’s lead authors and a professor at the University of Chicago. “Now roll back a year. How does the trip make you feel now?”

The researchers discovered that envy includes negative emotions, like malice, but can also be a source of positive feelings, like increased motivation. Each feeling has its own timetable: “While negative reactions decrease over time once the friend has achieved something we want, time has no effect on the positive form of envy, the motivation to try to do that sort of thing in the future,” explained Prof. O’Brien.

The finding that envy is more predictably triggered by other people’s future prospects than by their past achievements fits into a larger argument. The human brain evolved to have an emotional bias toward the future, according to American social psychologist Roy Baumeister and his colleagues. First we focus on what we want, and then a second, more cautious stage of thinking comes into play so we can figure out what we must do to get it.

So if we want a kinder social media world, we should stick to posting about events that have come and gone, rather than inciting envy with things we’re looking forward to. Got plans for a long, lazy summer frolicking with friends at the beach? For now, you might want to keep that to yourself.

Pollution That Discriminates by Gender

https://www.wsj.com/articles/pollution-that-discriminates-by-gender-11556812419

A new study shows that boys’ brains are more vulnerable than girls’ to lead exposure in early childhood

ILLUSTRATION: TOMASZ WALENTA

While in Australia last month I learned that female green sea turtles on the Great Barrier Reef now outnumber males by 116 to 1. Biologists blame it on the rising temperature of the sand on Australian nesting beaches: The warmer the sand, the more females hatch. In Sarnia, Ontario—known as Chemical Valley due to its 36 local petrochemical plants—emissions and runoff have halved the number of boys born in the area since the early 1990s, according to studies published in Environmental Health Perspectives.

Now a new study, published online last month in the journal Economics and Human Biology, shows that U.S. counties where lead in the topsoil exceeds the national average had twice as many five-year-old boys with long-term cognitive problems as counties with less contaminated soil. Five-year-old girls weren’t affected. Right from conception, it seems that environmental stress, especially pollution, discriminates on the basis of sex.

Edson Severnini, a professor of economics and public policy at Carnegie Mellon University, and his colleagues Karen Clay and Margarita Portnykh began with the United States Geological Survey’s recorded levels of lead in topsoil in 252 of the largest counties in the U.S. in 2000. They then turned to parents’ responses to a question on the 2000 census: Had their five-year-old experienced difficulties, for at least six months, with learning, memory, focus or decision making? The parents of over 77,000 children replied with a yes or a no.

We’ve long known lead to be dangerous, and adding the heavy metal to gasoline, house paint and pesticides has been banned for decades. Nonetheless, we’re still living with lead’s legacy. Over the 20th century more than 6.5 million tons of lead were released into the environment across the U.S., most of which is still blowing around or stuck to soil particles. That is alarming because lead is a neurotoxin: It starves the brain—especially the frontal lobe of the developing brain—of protein and energy, and it doesn’t decompose.

To make matters worse, lead on painted windowsills and in garden soil tastes sugary. Innocently ingesting even tiny amounts of lead can translate to lower IQs and attentional and behavioral problems later on, researchers have found.

There is even evidence that higher levels of lead in the bloodstream can predict antisocial behavior and violence in adolescence and early adulthood, according to a 2012 study led by Tulane medical researcher Howard Mielke published in the journal Environment International.

The new Carnegie Mellon study reinforces the link between a child’s early lead exposure and an uncertain future. Preschoolers’ exposure to lead in their first five years of life increased their probability of compromised cognitive function, including a weaker ability to learn, solve problems and control their impulses. The study also adds two fascinating twists to the existing mound of scary data: Boys are twice as vulnerable as girls to early neural damage, and even levels of lead that are currently considered acceptable can exert a deleterious effect.

Our problems with lead aren’t history, this study shows. But it turns out that education can dampen lead’s harmful effects. “We actually show that if boys had some schooling, the [negative] effect was much smaller,” said Dr. Severnini. Beyond regulating the lead in small-aircraft fuel, which is still unregulated, and remediating contaminated soil where children live and play, we may not be able to control how much lead is still hanging around. But there is something we can do to protect our children’s brains, and it is called preschool.

 

 

Women Have Younger Brains Than Men

https://www.wsj.com/articles/women-have-younger-brains-than-men-11553708268

A new study of brain metabolism reveals a gender gap with important implications for the way we age

ILLUSTRATION: TOMASZ WALENTA

Women tend to live longer than men. This is one of the most robust findings in biological science, and the evidence isn’t hard to find. In the U.S., women outlive men by almost five years, on average, while the gap is as wide as 10 years in Latvia and Vietnam. Now there is fresh evidence that women not only have a longevity advantage; their brains seem to be more youthful throughout adulthood, too.

The new study, published last month in the Proceedings of the National Academy of Sciences, was led by radiologist Manu Goyal and neurologist Marcus Raichle, both at the Washington University School of Medicine. It shows that, when measuring brain metabolism—that is, the rate at which the brain uses glucose and oxygen to power its energy-hungry activities—the brains of adult women consistently appear about three years younger than men’s brains do. This brain-age gender gap mirrors the difference in longevity, and it may tell us something important about how sex differences in neural development affect how long we keep our marbles—and, ultimately, how long we live.

In the early 1970s, Dr. Raichle was one of the first neuroscientists to use PET scans to look at cognitive function in a living person’s brain. Now 82, he is still at it. In this study, the Goyal-Raichle research team deployed PET scans to assess how much energy is consumed by an adult’s brain and exactly where in the brain the demand is greatest. “Glucose is like coal. It burns up in the brain and produces energy,” Dr. Raichle said. “But when you burn up things in the brain you produce byproducts that the brain doesn’t want hanging around,” he explained. Glucose makes energy and also does the mopping up afterward. Thus, how much glucose is used to power the brain’s daily activities, including its cleanup functions, will tell you something about how vigorous and youthful that brain is.

In the new study, 205 healthy adults between the ages of 20 and 82 underwent PET scans while lying quietly in the scanner with their eyes closed. The researchers used the scans to assess blood flow to various neural regions, and also to track two forms of glucose uptake: one that consumes oxygen (oxidative) and one that doesn’t (non-oxidative). Based on a study they published in 2017, the researchers already knew that our brains use glucose in both ways when we’re young, but that non-oxidative glucose consumption takes a nose dive as we age. They also knew that, after puberty, blood flow in the brain declines less in women than it does in men.

These persisting gender differences in brain metabolism meant that women’s brains often looked younger than those of men their age. The researchers used machine learning to detect distinctive patterns in the brains they studied. “When we trained it on males and tested it on females, it guessed the females’ brain ages to be three to four years younger than the women’s chronological ages,” said Dr. Goyal. Conversely, when the machine was trained to see female metabolic patterns as the standard, it guessed men’s brains to be two to three years older than they actually were. That difference in metabolic brain age added up to approximately a three-year advantage for women.
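For readers curious about the mechanics, the train-on-one-group, test-on-the-other logic is easy to sketch. The Python snippet below is not the study’s actual code—the column doesn’t specify the researchers’ model or features—so the regression algorithm, the simulated metabolic measurements and the sample sizes are all stand-in assumptions meant only to illustrate the procedure.

```python
# Illustrative sketch only: the study's real model, features and data
# are not described in the column, so everything below is a stand-in.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical inputs: one row per participant, one column per regional
# metabolic measurement (e.g., glucose uptake, oxygen use, blood flow).
X_male = rng.normal(size=(100, 20))
age_male = rng.uniform(20, 82, size=100)
X_female = rng.normal(size=(105, 20))
age_female = rng.uniform(20, 82, size=105)

# Step 1: learn a "brain age" model from the men's metabolic patterns.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_male, age_male)

# Step 2: apply the male-trained model to the women. The study reports
# that predicted (metabolic) brain age ran three to four years below
# women's chronological age; with random data this gap will be near zero.
predicted = model.predict(X_female)
gap = (predicted - age_female).mean()
print(f"Mean brain-age gap (predicted minus chronological): {gap:+.1f} years")
```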

These brain-age differences persisted across the adult lifespan and were visible even when people’s brains showed the harbingers of Alzheimer’s disease. “These new findings provide yet more evidence, as if more were needed, of just how ubiquitous sex influences on brain function have proven to be, often showing up in places we least expect them,” said Larry Cahill, a neuroscientist who studies sex differences in the brain at the University of California, Irvine. “The fact that we often struggle to understand what they mean—as happens in the rest of neuroscience—does not make them less important.”

Inheriting the Trauma of Genocide

Research shows that atrocities witnessed by Tutsi survivors in Rwanda can leave marks on their children, born years later

ILLUSTRATION: TOMASZ WALENTA

A century ago, we called it shell shock—the legacy of trauma that wrought havoc on the bodies and minds of some World War I veterans. Today, when survivors of terrible events experience flashbacks and fears that disrupt their daily lives, long after the actual threat is gone, it is called post-traumatic stress disorder, or PTSD. Whatever you call it, we now know that direct exposure to extreme deprivation, violence, dislocation or torture can transform not only those who experience it but also their future offspring.

Scores of studies have shown that the adult children of Holocaust survivors who suffered from PTSD are at high risk of developing the disorder themselves. Now, 25 years after the genocide against the Tutsi of Rwanda, an unusual collaboration between Rwandan and Israeli researchers has uncovered similar intergenerational effects. The atrocities witnessed by surviving Tutsis have left enduring marks on their adult children, the vast majority of whom hadn’t been born when the genocide took place in 1994.

The study, published last month in the journal Psychiatry Research, was led by Amit Shrira, a professor of psycho-gerontology at Bar-Ilan University, working with Benjamin Mollov, a Bar-Ilan social scientist, and Chantal Mudahogora, a Canadian researcher and a survivor of the genocide in Rwanda. The team studied 60 pairs of Tutsi survivor parents and their adult children, all of whom completed two questionnaires about their psychological state.

Not all families who share a brutal history are the same, the researchers discovered. According to their responses, the Tutsi parents could be divided into three groups.

The first group (33.3%) suffered from complex PTSD, a new diagnostic category that includes panic attacks, recurring nightmares and intrusive memories. They perceived themselves as helpless and had difficulty maintaining close relationships. The second group of parents (26.7%) had simple PTSD: They continued to relive the traumatic events and personal losses they had experienced and were haunted by a sense of threat. The third group (40%) didn’t have any clinical symptoms of PTSD. Though it is impossible not to experience distress after surviving a genocide, this group of parents seemed to be surprisingly resilient; their feelings of grief didn’t disrupt their lives to the degree that they merited a clinical diagnosis.

The more severe a parent’s symptoms, it turned out, the more severe were those of the adult child. “The children of parents with complex PTSD suffered the highest level of secondary traumatization, with symptoms related to the parental trauma,” said Dr. Shrira. “These children were born after the genocide, but they had nightmares about it and were more restless and hypervigilant than the children in the other two groups.”

Ms. Mudahogora noted that her own three adult children have been affected by her experience as a survivor. “I never grew up with any grandparents, so I miss that a lot,” said her son Chris Mucyo, a 23-year-old university student who was born after the genocide and participated in the study as a subject. “I have great parents, but it is a lot of pressure. We didn’t have a period of innocence as kids.”

That self-awareness is telling. Having parents who modeled coping skills while talking openly about their losses can make a big difference, said Dr. Shrira, who has observed the same resilience in the families of many Holocaust survivors. It is unclear why some survivors transmit trauma to the next generation while others do not. We know that parents’ vulnerability to trauma is likely passed on to their children through genetic and epigenetic means, but how parents face adversity is also a factor, Dr. Shrira speculates. “After all, the transmission of trauma is also the way the story is told.”

Aggression in Boys Is a Family Matter

A new study of children shows that problematic behavior can be identified in infancy, if not before, by looking at their background and circumstances

ILLUSTRATION: TOMASZ WALENTA

In 25 years of clinical practice as a psychologist I’ve seen my share of aggressive boys. They ended up in my office because they kicked classmates, poked children with scissors or, in one unforgettable case, set another student’s collar on fire to get his attention. (It worked.) These boys were usually between the ages of 4 and 10, and most of them came from chaotic homes with few clear expectations, closely spaced siblings and an overwhelmed mother. The father was often absent, literally or psychologically. They had been referred by their teachers, who wanted the aggression to stop.

It was a tall order, and I was only successful some of the time. Now I know that intervention should have started much earlier. A new study published in JAMA Network Open in December suggests that children who are most likely to be aggressive throughout childhood and adolescence can be identified in infancy, if not before, by looking at their family histories.

The longitudinal study, led by Richard E. Tremblay of Université de Montréal, followed the development of 2,200 randomly selected babies born in Quebec in the late 1990s, using reports from parents, teachers and the children themselves. In a previous study, the research team found that many children try to use physical force during their first two years of life, by hitting, biting or kicking others. Normally, such aggression peaks between 2.5 and 3.5 years old, and then peters out as children mature and gradually learn society’s rules for social interaction—just in time for them to enter school.

But aggression persists in a small group of children, and the study suggests that they often share a similar home environment. “Children who show problems early are from very specific families,” said Prof. Tremblay. The predictors include one or both parents having been physically aggressive as children or having failed in school. This background is likely to keep families trapped in a dysfunctional cycle. “If you’re physically aggressive and you fail in school, the likelihood of getting a job to get out of poverty is almost nil,” Prof. Tremblay explains.

“These chronically aggressive families could be identified by obstetricians,” Prof. Tremblay says, “so interventions can start close to conception. Starting that early would have a much better impact than waiting until a child is in school.” Yet most aggressive children aren’t identified until they start kindergarten. Teachers are left to wonder, “Why didn’t the parents tell me this child has a problem?”

The answer to that question is one of the study’s most intriguing findings. Unusually aggressive boys were seldom identified by their mothers early on, the researchers discovered. Even if these boys were poised to hurt others and struggle in school, they had to wait for teachers to take note in order to get assistance. In contrast, mothers are more likely to get help for girls, who at any age are far less likely to be physically aggressive than boys.

Whether due to helplessness or denial, this blind spot can be remedied. Some states already offer support such as nurse-family partnership programs, which match vulnerable first-time mothers with registered nurses who visit them up to 60 times during pregnancy and in the child’s first years of life. The nurses offer solid, nonjudgmental advice on subjects like breastfeeding, communicating and bonding with infants and, if necessary, how to quit smoking or drinking.

Through this ongoing relationship, these mothers learn how to sustain a healthy pregnancy and respond to their developing child before he becomes intransigently aggressive, not after. Prof. Tremblay thinks that is wise. “You have to invest in prevention based on the known predictors in the family, and not wait until the second, third or fourth child is born.”

Resilient Teens in a Dangerous World

A study shows that strong executive function in the brain helps young people’s bodies resist the stress of living in a rough neighborhood

ILLUSTRATION: TOMASZ WALENTA

“That which doesn’t kill me makes me stronger,” Friedrich Nietzsche famously wrote. But the truth is likely to be the reverse: If an ordeal doesn’t break you, it is probably because you were stronger in the first place.

So says a paper just published in the journal Proceedings of the National Academy of Sciences by Greg Miller, a psychology professor at Northwestern University’s Institute for Policy Research, and his colleagues. By looking at the health of teenagers living in Chicago’s rougher neighborhoods, they explored why some young people are more resilient than others. Why is it that we don’t all react to adversity the same way?

The team already knew when they started that proximity to crime can damage the health of young people. On the night after a violent crime, for example, local teens’ sleep patterns are often disrupted and their cortisol levels spike, according to a study published in 2017. Even when teens don’t witness a local homicide or assault directly, their awareness of it leaves a distinct biological signature in the form of an elevated heart rate and altered behavior.

These clues show that fear is literally getting under the skins of young people, engendering long-term cardiovascular risks. “Kids who live where there is violent crime have more health issues than [those] in higher income neighborhoods. But it’s not just about income. Violence per se is related to higher blood pressure, blood sugar, more obesity and all the precursors of diabetes and heart disease,” Prof. Miller told me.

Yet some children who live in the same areas seem to be shielded from crime’s corrosive effects. To find out what protects them, the researchers recruited a diverse group of 218 eighth-graders in Chicago. They tracked the incidence of crime in each teenager’s neighborhood using data from local police departments. On average, the adolescents in the study lived in areas where the murder rate was 142% higher than elsewhere in the U.S.

The researchers brought the teens into the lab for two testing sessions. First, their weight, blood pressure, abdominal fat, blood insulin and glucose levels were measured—all of which can be signs of cardiovascular risk. The second time, the young people underwent brain imaging, with fMRI scans to assess the strength and efficiency of certain resting-state neural networks. The goal was to check for any differences between the brains of the children with cardiovascular risk factors and those of the hardier ones. Could the more resilient participants be identified by their brain scans alone?

Indeed they could. The study found that young people with a stronger, more efficient central executive network (CEN)—the brain areas that govern the regulation of emotion, and the vigilance that accompanies fear—were also the ones who demonstrated fewer cardiovascular risk factors. “We used statistical measures to rule out the effect of the family’s income, their area’s level of pollution or segregation, or the availability of fresh food,” Prof. Miller told me. “The kids with low CEN connectivity had worse cardiovascular health.”

What this means is that participants whose brains allowed them to suppress intrusive, unpleasant thoughts and to reappraise threatening events were also physically more resilient. They were thinner, with lower biological measures of stress, and their hearts were stronger.

This was an observational study, so we can’t say for certain that a teenager’s brain architecture is what’s keeping their damaging physical reactions to violence in check. Perhaps some kids are just born resilient. Their more measured cognitive response to life’s risks comes packaged with the signs of robust cardiovascular health. I wouldn’t say that these kids have all the luck, but they do seem biologically equipped to face stress with equanimity. And that’s a good start.

When It Comes to Sleep, One Size Fits All

A massive new study shows that every adult needs 7-8 hours a night, or else their cognitive abilities will suffer

ILLUSTRATION: TOMASZ WALENTA

I’d always thought that our need for sleep, like our appetite for food, drink or social contact, was a personal matter: Some people need more, some need less. Age, lifestyle, work and metabolism combine to determine how much sleep a person needs to function, and if some people thrive on five hours a night and others require seven, chalk it up to different strokes for different folks, right?

Wrong. A new study of the sleep habits of more than 10,000 people around the world suggests that the amount of sleep adults need is universal. The massive survey, published in the journal Sleep, demonstrates that adults everywhere need 7-8 hours a night—no more and no less—in order to be mentally limber. When we habitually stint on those hours, higher-order cognitive processing—such as the ability to see complex patterns and solve problems—is compromised.

The lead author of the study, Conor Wild, a research associate at the Brain and Mind Institute at the University of Western Ontario, explained how a research team directed by the neuroscientist Adrian Owen recruited thousands of people for the online study. “We did a lot of social media advertising and radio interviews. And the BBC put together a two-minute spot on Dr. Owen’s sleep research,” Dr. Wild said.

This blitz prompted more than 40,000 people from 150 countries to sign up. Ultimately, two-thirds of them were eliminated due to technical glitches, incomplete questionnaires or extreme responses—such as reporting that they sleep zero hours a night, or more than 16.

The 10,886 adults who remained filled out a detailed online questionnaire about their backgrounds, medical histories and sleep patterns. How long was their average night’s sleep? How often was their rest interrupted, and how consistent were their nights?

Once their sleep habits were recorded, the participants completed a battery of 12 cognitive tests. Puzzle-like tasks assessed their spatial, verbal and short-term memories, as well as their capacity for deductive reasoning, sustained attention, planning and clear expression.

The findings that emerged were startling. Half the sample averaged less than 6.4 hours of sleep a night—a pattern that was associated with impaired problem-solving, reasoning and verbal acuity. Those who routinely slept six hours a night or less flubbed more questions based on spatial rotation or grammatical reasoning and left more tasks incomplete than those who got a full night’s rest.

Surprisingly, mere sleep deprivation—that is, one or two nights with little or no sleep—did not alter reasoning or verbal skills, though it did hobble short-term memory. This finding is reassuring, given that many professionals—think of hospital residents and airline pilots—have to make life-or-death decisions based on exactly this kind of erratic sleep schedule. If their short-term memory is compromised, they can always look up facts on their phones. But for split-second, life-changing decisions, they are on their own.

Regularly sleeping too little seems to be much more damaging than having one or two bad nights. Getting four hours of sleep or less for an extended period is equivalent to adding eight years to one’s age when it comes to test performance, the study shows—a significant decline. But a single good night can repair some of the damage: People who slept more than usual the night before they were tested ended up acing more of the cognitive tests.

So if you want to be articulate, solve a pesky problem, parallel park or organize an effective team—or even your closet—you’ll definitely need that nightly 7-8 hours of sleep.

When a Better Neighborhood Is Bad for Boys

Research shows that when poor families move into more expensive housing, girls’ lives improve while boys’ get worse. What explains the difference?

By Susan Pinker

Sept. 26, 2018 11:10 a.m. ET

See the column on the Wall Street Journal site

Imagine you’re a single mother living at or below the poverty line in a troubled neighborhood. If you want to shield your teenager from drinking and mental distress, should you try to move to a better area or stay put? The answer depends on whether your teen is a boy or a girl, according to a new paper published in the journal Addiction.

The lead author of the study, University of Minnesota epidemiologist Theresa Osypuk, investigated the drinking habits and mental health of teenagers whose families lived in public housing in the late 1990s. About two-thirds of the families were randomly chosen to receive housing vouchers, allowing them to move into better areas.

Between four and seven years later, the researchers found, adolescent girls who had moved into more expensive neighborhoods were far less likely to drink to excess than girls who remained in public housing. But boys whose families had moved binged more. This surprising finding challenges the assumption that behavioral risks increase with economic hardship and that poverty affects women and men the same way.

It all started with a controversial social experiment called Moving to Opportunity. The goal was to give the mostly female-led families living in public housing a leg up in the labor market, not by improving their skills but by improving their housing. From 1994 to 1998, almost 5,000 low-income families in five cities—New York, Boston, Chicago, L.A. and Baltimore—were offered the chance to participate in a lottery.

Those who opted in were randomly assigned to one of three groups. The first group received a voucher that tripled their rent budget. With this windfall they were expected to move into a nicer neighborhood. A second group got the same voucher along with relocation counseling. A third was the control group: They stayed in public housing and presumably nothing would change for them.

The results were disappointing at first. To the chagrin of the policy wonks who designed the program, improving where women lived had absolutely no effect on their employment. But it had a big impact on their health. “Rates of obesity were lower, markers of diabetes were better, mental health was better,” Prof. Osypuk said.

The second eye-opener was that moving to better neighborhoods affected men and women differently. “The households were mainly led by moms, who saw mental health benefits, and their girls did, too. But the boys saw no mental health effects, or negative effects,” said Prof. Osypuk.

The key factor was how vulnerable people were before the move. Boys are developmentally more fragile than girls, with higher rates of learning and behavior problems. That’s one reason why the well-being of the boys in the voucher groups tanked, according to Prof. Osypuk. Boys who moved out of public housing not only drank more but also showed higher rates of distress, depression and behavior problems, according to a 2012 paper that she and her team published in the journal Pediatrics.

“Boys have mental health disadvantages, and the stress of moving adds insult to injury,” Prof. Osypuk said. Just when these vulnerable boys most needed predictability, their social worlds were upended. “They moved down in the social hierarchy and hung out with riskier boys,” speculated Prof. Osypuk. Meanwhile, girls who moved to better neighborhoods experienced fewer sexual stressors and adapted to their new circumstances more easily.

When it comes to moving out of poverty, it would seem that equal treatment for everyone is only fair. This research, however, hammers home the idea that one size does not fit all.

An Unforgettable Memory Expert Muses at 100

Brenda Milner is celebrated for her insight into recollections as a feature of neurobiology; the man who could only live in the present

By Susan Pinker

Updated Aug. 23, 2018 10:28 a.m. ET

See the column on the Wall Street Journal site

If you’ve ever wondered where your memory has gone, ask Brenda Milner. The British-Canadian, who just turned 100, was one of the first researchers to discover how memories are stashed in the brain. She has spent the last 68 years investigating how we consolidate new knowledge, so you could say that she knows a thing or two about remembering.

Dr. Milner began her career as one of a handful of women admitted to study mathematics at Cambridge University in 1936. Her determination was evident even then. “Cambridge was associated with mathematics and physics—you know Isaac Newton went there. That’s where I wanted to go and nowhere else,” she told me in 2007 (I recently interviewed her again by email).

This tenacity served Dr. Milner well when, in 1944, she moved from crunching numbers at the British Defense Ministry to Montreal to pursue a Ph.D. in psychology. There she worked with the neurologist Wilder Penfield at McGill’s Montreal Neurological Institute. Their research on the post-surgical brain function of epileptic patients led her to reject the then-fashionable theories that memory was a product of Freudian urges or behaviorist stimulus-response chains. Her key insight was to see memory as a feature of human neurobiology.

Dr. Milner is now considered one of the founders of cognitive neuroscience, which links the mind—perceiving, thinking, remembering—to the brain. One of the current leaders in the field, Michael Gazzaniga of the University of California, Santa Barbara, calls her “a true pioneer.”

When she started in the 1950s, the only way to localize mental activity was to see what had changed after injuries or surgery. One patient was Henry Molaison, a 24-year-old from Connecticut who suffered from debilitating epilepsy. H.M., as he was known until his death in 2008, underwent surgery to remove parts of his temporal lobe—including his hippocampus—which the doctors thought to be the locus of his seizures.

Dr. Milner tested his cognitive function after surgery and in a 1957 paper described what happened next. Though H.M.’s personality and intelligence seemed unchanged, “there has been one striking and totally unexpected behavioral result: a grave loss of recent memory. After the operation this young man could no longer recognize the hospital staff nor find his way to the bathroom.” H.M. remembered events from his distant past and with practice could learn new motor skills, but without his hippocampus, any novel experience—whom he just met or what he ate for lunch—never jelled into a long-term memory.

H.M. was forced to live in the present, which despite its Zen billing, had its downsides. He had to learn of his father’s death over and over again. Each time he grieved anew. Ultimately he kept a reminder in his pocket as a form of self-protection.

By showing how distinct types of memory are stored in different brain systems—how to ride a bike or sing a Broadway tune is stored differently than the name of your third-grade teacher—Dr. Milner revamped neuroscience’s atlas of memory. Knowing how to do something does not require the hippocampus. Knowing that you’ve learned something does.

Dr. Milner still goes to the lab a few days a week. Though most people associate her with H.M., she is “more excited about my frontal lobe work,” which helped to define the seat of self-control, planning and decision-making.

“Brain imaging is a huge thing,” she said in a recent email, when I asked what had changed in 60 years. “Back then, you had to wait until the subject died because the only way to see the brain was to dissect it.” Now you can assess healthy young adults. “To see the brain images of a living person while testing them is extremely exciting.” After all, she added, “we all go downhill after our mid-40s.”

Clearly, Brenda Milner is the ultimate exception to that rule.